When what you see is all there is

Imagine a coin is flipped 20 times and heads comes up every time. You’re given a large sum of money to bet on the next flip. What would you call?  

Many of us would call tails on the basis that it seems ‘due’. After all, if a coin is flipped 21 times, then the probability of 21 heads is 1 in 2,097,152. Intuitively, we know that such streaks have low and declining probability. Critically, however, this has no bearing on the next flip of the coin. The coin, much like the roulette wheel that has had a run of red numbers, has no memory. The probability of flipping a head after 20 heads in a row is the same as always: 1/2. 
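To make the "no memory" point concrete, here is a minimal Python sketch (the seed, variable names and the shorter streak of five flips are illustrative choices, not part of the original example): it prints the 1-in-2,097,152 figure and then checks by simulation that, even immediately after a run of heads, the next flip still comes up heads about half the time.

```python
import random

random.seed(0)

# The chance of 21 heads in a row with a fair coin is (1/2)**21, i.e. 1 in 2,097,152.
print(f"P(21 heads in a row) = 1 in {2 ** 21:,}")

# After a run of heads, is the next flip any more likely to come up tails?
# A streak of 5 is used so the simulation finishes quickly; the independence
# argument is identical for a streak of 20.
STREAK = 5
N_FLIPS = 1_000_000

run = 0
next_after_streak = []
for _ in range(N_FLIPS):
    heads = random.random() < 0.5
    if run >= STREAK:          # the previous STREAK flips were all heads
        next_after_streak.append(heads)
    run = run + 1 if heads else 0

share_heads = sum(next_after_streak) / len(next_after_streak)
print(f"P(heads | previous {STREAK} flips were heads) ≈ {share_heads:.3f}")  # ~0.5
```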

The flawed thinking that draws many of us to tails is known as the gambler’s fallacy. The mistake is to expect what happens over a much larger (or infinite) sample of flips to be represented in a much smaller sample. For instance, many people would consider the sequence H-T-H-T-T-H more likely than H-H-H-T-T-T, simply because it looks more random, or more representative of a random process. In fact, each specific sequence of six flips is equally likely, with a probability of 1 in 64.
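That equality is easy to verify. The short sketch below (the helper function is a hypothetical illustration, not from the text) multiplies out the per-flip probabilities and shows that both sequences come to exactly (1/2)^6 = 1/64.

```python
from fractions import Fraction

def sequence_probability(seq: str, p_heads: Fraction = Fraction(1, 2)) -> Fraction:
    """Probability of one exact sequence of independent fair-coin flips."""
    prob = Fraction(1)
    for flip in seq:
        prob *= p_heads if flip == "H" else 1 - p_heads
    return prob

print(sequence_probability("HTHTTH"))  # 1/64
print(sequence_probability("HHHTTT"))  # 1/64 -- identical, despite looking "less random"
```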

The gambler’s fallacy is just one example of a broader cognitive trap known as representativeness. It describes our tendency to judge events by how they appear rather than by how likely they actually are under the rules of probability. We rely on stereotypes and on individualising or unrepresentative information at the expense of more accurate, but more complex, calculations that we assume are the sole preserve of the statistician. The latter require careful thinking by the rational, calculating part of our brains, and the evidence from behavioural surveys is that we much prefer jumping to conclusions to doing the maths.

The different way our minds treat readily available information and information that is unknown or hard to get is described by the psychologist Daniel Kahneman with the acronym WYSIATI, or “What You See Is All There Is”. In short, we jump to conclusions based on limited data. Much of the time, the coherent story we put together in our heads is close enough to reality, enabling us to think fast in a complex world. But at other times the shortcuts we take lead us astray.

This is a particular issue for investors, who must form judgements about an uncertain future based on how things appear now. Our judgements often depend on the coherence of a story that seems compelling to us, rather than on the quality or completeness of the data on which it is based, or on how that data might change.

Quite often, representativeness shortcuts work because there is truth in stereotypes. However, we need to be aware that these shortcuts are fallible – particularly when they cause us to neglect base rate information that points to a different conclusion.
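To see how base rate neglect can mislead, consider a small, entirely hypothetical worked example (the 1% base rate, 90% hit rate and 10% false-alarm rate below are made-up numbers, used only for illustration). Bayes’ rule shows that a fund flagged as “skilled” by an impressive-looking screen is still far more likely to be lucky than skilled, because skilled funds are rare to begin with.

```python
from fractions import Fraction

# Hypothetical numbers for illustration only:
base_rate = Fraction(1, 100)    # 1% of funds are genuinely skilled (the base rate)
hit_rate = Fraction(9, 10)      # P(flagged | skilled)
false_alarm = Fraction(1, 10)   # P(flagged | not skilled)

# Bayes' rule: P(skilled | flagged)
p_flagged = base_rate * hit_rate + (1 - base_rate) * false_alarm
p_skilled_given_flag = base_rate * hit_rate / p_flagged
print(float(p_skilled_given_flag))  # ≈ 0.083 -- far lower than the 90% hit rate suggests
```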