Data Fallacies to Avoid | An Illustrated Collection of Mistakes People Often Make When Analyzing Data
Recently my team embarked on a mission to make data more accessible and useful to everyone, creating free resources to use while studying, analyzing, and interpreting data.
The first resource we created was ‘Data Fallacies to Avoid’, an illustrated collection of mistakes people often make when analyzing data.
1. Cherry Picking
The practice of selecting results that fit your claim and excluding those that don’t. The worst and most harmful example of being dishonest with data.
2. Data Dredging
Repeatedly testing hypotheses against the same data until a "significant" correlation appears, while failing to acknowledge that the correlation was, in fact, the result of chance.
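A quick sketch of why dredging finds "significant" correlations in pure noise: if you test enough unrelated variables against one outcome, some will correlate by chance alone. All names and numbers below are invented for illustration.

```python
import random

random.seed(42)

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# One "outcome" and 200 unrelated "predictors" -- all pure random noise.
outcome = [random.gauss(0, 1) for _ in range(30)]
predictors = [[random.gauss(0, 1) for _ in range(30)] for _ in range(200)]

# Dredge: keep the strongest correlation found anywhere.
best = max(abs(pearson(p, outcome)) for p in predictors)
# With 200 tries, a "meaningful-looking" correlation emerges from noise.
```

The fix is to state the hypothesis before looking at the data, or to correct for the number of comparisons made.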
3. Survivorship Bias
Drawing conclusions from an incomplete set of data, because that data has ‘survived’ some selection criteria.
4. Cobra Effect
When an incentive produces the opposite result intended. Also known as a Perverse Incentive.
5. False Causality
Falsely assuming that because two events occur together, one must have caused the other.
6. Gerrymandering
The practice of deliberately manipulating boundaries of political districts in order to sway the result of an election.
7. Sampling Bias
Drawing conclusions from a set of data that isn’t representative of the population you’re trying to understand.
8. Gambler’s Fallacy
The mistaken belief that because something has happened more frequently than usual, it's now less likely to happen in the future, and vice versa.
9. Hawthorne Effect
When the act of monitoring someone can affect that person’s behaviour. Also known as the Observer Effect.
10. Regression Fallacy
When something unusually good or bad happens, it will naturally tend to revert back towards the average over time; the fallacy lies in crediting that reversion to whatever action was taken in the meantime.
11. Simpson’s Paradox
A phenomenon in which a trend appears in different groups of data but disappears or reverses when the groups are combined.
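The numbers below are from the oft-cited kidney-stone treatment example: within each severity group, treatment A has the higher success rate, yet treatment B looks better when the groups are pooled, because A was disproportionately used on the harder cases.

```python
# (successes, total) for treatments A and B in two severity groups.
data = {
    "small stones": {"A": (81, 87),  "B": (234, 270)},
    "large stones": {"A": (192, 263), "B": (55, 80)},
}

def rate(successes, total):
    return successes / total

# Within EACH group, treatment A wins...
for group in data.values():
    assert rate(*group["A"]) > rate(*group["B"])

# ...yet pooled across groups, treatment B looks better.
totals = {
    t: [sum(group[t][i] for group in data.values()) for i in (0, 1)]
    for t in ("A", "B")
}
overall_a = rate(*totals["A"])  # 273/350 = 0.78
overall_b = rate(*totals["B"])  # 289/350 ~ 0.83
```

The lesson: always ask whether a lurking grouping variable could flip the apparent trend.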
12. McNamara Fallacy
Relying solely on metrics in complex situations can cause you to lose sight of the bigger picture.
13. Overfitting
A more complex explanation will often describe your data better than a simple one. However, a simpler explanation is usually more representative of the underlying relationship.
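A minimal sketch of this trade-off, using hand-picked "noise" values for determinism: a degree-7 polynomial passes through all eight training points exactly, yet a plain least-squares line predicts new points far better.

```python
# Training data: a straight line y = x with small hand-picked "noise".
xs = list(range(8))
noise = [-0.2, 0.1, -0.4, 0.3, -0.1, 0.4, -0.3, 0.2]
ys = [x + e for x, e in zip(xs, noise)]

def lagrange(x, xs, ys):
    """Degree-7 polynomial through all 8 points: zero training error."""
    total = 0.0
    for k, (xk, yk) in enumerate(zip(xs, ys)):
        w = 1.0
        for j, xj in enumerate(xs):
            if j != k:
                w *= (x - xj) / (xk - xj)
        total += yk * w
    return total

def linear_fit(xs, ys):
    """Ordinary least-squares line: the 'simpler explanation'."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

slope, intercept = linear_fit(xs, ys)
x_new = 7.5  # a point just outside the training range; true value is ~7.5
poly_err = abs(lagrange(x_new, xs, ys) - x_new)
line_err = abs(slope * x_new + intercept - x_new)
# The complex fit memorizes the noise and misses badly on new data;
# the simple line generalizes.
```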
14. Publication Bias
How interesting a research finding is affects how likely it is to be published, distorting our impression of reality.
15. Danger of Summary Metrics
It can be misleading to only look at the summary metrics of data sets.
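The classic demonstration is Anscombe's quartet; the first two y-series (shown below with their shared x values) have essentially identical means and variances, yet one is roughly linear in x while the other follows a curve, which only plotting reveals.

```python
from statistics import mean, pvariance

# First two y-series of Anscombe's quartet (shared x values).
x  = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
y1 = [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]
y2 = [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]

# Summary metrics are nearly identical: both means ~7.50, both variances ~3.75.
# But y1 scatters around a straight line while y2 traces a parabola --
# a difference no mean or variance can show.
mean_gap = abs(mean(y1) - mean(y2))
var_gap = abs(pvariance(y1) - pvariance(y2))
```

Summary metrics are a starting point, not a substitute for looking at the data itself.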
We then transformed the collection into a wall poster designed to hang in educational facilities, workplaces and other areas where these mistakes are often made.