A couple of weeks ago I blogged about the dangers of stories and how, if you're unaware of the biases baked into your brain, stories and the people who tell them can lure you into tar pits. In this post I'm going to write about another cognitive bias.
But first, of course, a story.
James, who manages our .NET division, is a smart guy. I grab him and explain that I want to give him a quick test. I know he likes a good test. The way it works, I say, is that I write down sets of three numbers on a flip-chart. I have a rule in my head, which any set of three numbers either passes or fails. James's task is to guess my rule. The way he's got to do this is by giving me sets of three numbers. I will say whether each set passes or fails my rule. He can then guess the rule.
To get him started, I give two sets of three numbers. Both sets obey my rule:
1, 4, 7
9, 12, 15
James looks puzzled. This is obviously a trick question, he thinks. He gives me three numbers:
5, 8, 11
These pass my rule, I say. At this point James is confident but, just to be sure, he gives three more numbers:
3, 6, 9
I tell him these pass my rule. James says he’s guessed the rule: each number is 3 higher than the previous number.
Wrong, I tell him.
Now James is confused. He guesses that it's to do with the shape of the numbers. It isn't. He gives three more numbers:
1, 2, 3
These pass my rule. James guesses that my rule is that numbers are increasing. Wrong, I say. James gives up, so I tell him what my rule is: the third number must be bigger than the second number.
This artificial experiment is an interesting illustration of a couple of human tendencies. First, we jump to conclusions. Second – more important, but also far more subtle – we tend to seek out evidence that confirms our hasty conclusions, rather than evidence that might contradict them. Psychologists call this confirmation bias.
James – understandably – had the hypothesis "each number is 3 higher than the previous number" in his head. He then tried to prove this by choosing sets of numbers that obeyed his hypothesis, rather than seeking out sets of numbers that might falsify it. Every time he suggested a set of numbers that obeyed my rule he became more certain that he was correct. Wrongly.
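To make the contrast concrete, here's a minimal sketch of the experiment in Python. The function names and the extra sample triples are mine, invented for illustration; only the two rules come from the story above:

```python
def secret_rule(a, b, c):
    """My actual rule: the third number must be bigger than the second."""
    return c > b

def hypothesis(a, b, c):
    """James's guess: each number is 3 higher than the previous one."""
    return b == a + 3 and c == b + 3

# Confirming tests: triples chosen BECAUSE they fit James's hypothesis.
confirming = [(5, 8, 11), (3, 6, 9), (10, 13, 16)]
for triple in confirming:
    print(triple, "passes" if secret_rule(*triple) else "fails")
# All of them pass my rule, yet the hypothesis is still wrong.

# A falsifying test: a triple that VIOLATES the hypothesis.
falsifying = (1, 2, 10)
print(falsifying, "passes" if secret_rule(*falsifying) else "fails")
# It passes -- a single test refutes "each number is 3 higher",
# where no number of confirming tests ever could.
```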
My example of James and the numbers is contrived, but the principles apply in real life too. If your software isn't selling well this month then you might jump to the conclusion that it's because of a downturn in the economy, and then seek to verify that. Instead, you should think about how you would disprove your theory, or explore alternative hypotheses. If two people struggle to use your software you might conclude that they're both idiots, and then seek out examples to prove it. Instead, you should examine alternative explanations – the problem might be that your software sucks. Or you might erroneously conclude that your software sucks – and seek out evidence to verify that – when the problem really is that the two people are idiots.
Why are we prey to this bias? Possibly because our minds have evolved to make quick decisions based on scant data. On the savannah a hundred thousand years ago, if you'd just seen your brother gored by a rhinoceros, "all rhinoceroses are dangerous" was a good conclusion to jump to. Seeking counter-examples, or considering alternative hypotheses, would have been logically correct, but evolutionarily limiting.
But we’re not in the savannah any more, and behaviour that was appropriate back then isn’t always useful now. The textbook example is our liking for sugar: a sweet tooth was useful back when food was scarce, but in an era of Coca Cola and Big Macs our instinct to grab calories whenever and wherever we can find them just makes us fat.
There are plenty of other examples of cognitive biases that sway the way we think. Over the next few weeks I'll write some more about them and how understanding them can help you in the business of software. Subscribe to my RSS feed to keep up to date.