Three thinking tools to overcome confirmation bias

Understanding that your reasoning is undermined by bias is one of the most valuable insights into your own decision-making capabilities. I wrote another piece on biases (Morality Binds and Blinds), but, judging by the number of reads, its title may have made it my least successful blog entry. Or it might simply be too unsettling to read and think about the flaws in your own reasoning. Nevertheless, I think understanding biases is hugely important for better decision making, so I offer you three thinking tools to overcome arguably the most famous bias of all: confirmation bias.

Falling into the trap of confirmation bias: we all do it

An intriguing discussion of confirmation bias can be found in Jonathan Haidt’s The Righteous Mind:

(…) confirmation bias, the tendency to seek out and interpret new evidence in ways that confirm what you already think. People are quite good at challenging statements made by other people, but if it’s your belief, then it’s your possession – your child almost – and you want to protect it, not challenge it and risk losing it.

People outside Haidt’s realm of moral psychology and moral philosophy have also figured out that people mainly reason to be right rather than to get to the truth. Here are some quotes that, I feel, sum up human nature pretty well:

The eye sees only what the mind is prepared to comprehend. (Robertson Davies, fiction writer, in: Tempest-Tost.)

For it is a habit of humanity to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy. (Thucydides, chronicler of the Peloponnesian War, 5th century BCE.)

We human beings are egotistical animals; each of us wants to win the argument. (Garrett Hardin, human ecologist, in: Filters Against Folly.)

You would think that educating people would surely get rid of confirmation bias. Or that the more intelligent people are, the more reasons they can come up with for both sides of any argument? Think again. Quite the opposite is true (again from The Righteous Mind):

The findings get more disturbing. Perkins found that IQ was by far the biggest predictor of how well people argued, but it predicted only the number of my-side arguments. Smart people make really good lawyers and press secretaries, but they are no better than others at finding reasons on the other side. Perkins concluded that “people invest their IQ in buttressing their own case rather than in exploring the entire issue more fully and even handedly”.

This is probably why rhetoric is simultaneously admired and met with scepticism. Is the rhetorician building a solid rational case? Or is she only trying to win people’s minds over to her own view? Plato famously bashed rhetoric and sophistry in his dialogues Gorgias and Republic, making the case that winning arguments with rhetoric is more about persuasion than knowledge. (Plato, in turn, was of course also accused of only coming up with reasoning that supported his own worldview. Even the greatest of thinkers are not immune to confirmation bias, it seems.)

If we agree that confirmation bias is indeed a serious problem, then believing that we are rational beings who always arrive at rational conclusions limits our capacity to reach the best outcomes. Haidt and others (e.g. John Gray; see my blog post on progress for a short discussion of his book The Silence of Animals – on Progress and Other Myths) even go so far as to call this the rationalist delusion. What to do?

Three thinking tools to overcome confirmation bias

1. Put together a group consisting of members with different backgrounds.

Jonathan Haidt’s suggestion: always include several people with different ideologies and backgrounds when making a decision, so they can disconfirm arguments put forward by individuals (emphasis added):

(..) each individual reasoner is really good at one thing: finding evidence to support the position he or she already holds, usually for intuitive reasons. We should not expect individuals to produce good, open-minded, truth-seeking reasoning, particularly when self-interest or reputational concerns are in play. But if you put individuals together in the right way, such that some individuals can use their reasoning powers to disconfirm the claims of others, and all individuals feel some common bond or shared fate that allows them to interact civilly, you can create a group that ends up producing good reasoning as an emergent property of the social system. This is why it’s so important to have intellectual and ideological diversity within any group (..).

This short passage reminded me of a method I used some time ago to try to get the most out of a group of experts who were all convinced of the superiority of their own solution to the problem at hand:

2. Use parallel thinking to have members look at the problem from the same angle and then switch angles a number of times using six different ‘hats’.

Once you have put together a diverse group of people, engage in parallel thinking. This method was developed by psychologist and management thinker Edward de Bono. In his book How to Have a Beautiful Mind, he echoes Haidt and others in noticing:

Argument is an excellent method and has served us well. At the same time (..) it is unsophisticated. Each side makes a ‘case’ and then seeks to defend that case and prove the other ‘case’ to be wrong. It says, in short: ‘I am right and you are wrong.’

De Bono gives a summary of his highly effective way of looking at decision making processes (emphasis added):

  • The direction of thinking is indicated by six coloured hats, each of which indicates a mode of thinking. At any moment everyone is ‘wearing’ the same colour hat. That is what is meant by ‘parallel thinking’.
  • The white hat indicates a focus on information. What do we have? What do we need? How are we going to get the information we need?
  • The red hat gives full permission for the expression of feelings, emotions and intuition without any need to give the reasons behind the feelings.
  • The black hat is for ‘caution’ and the focus is on faults, weaknesses, what might go wrong and why something does not ‘fit’. [This is the classical argumentative mode we often find ourselves in, trying to disprove other people’s arguments.]
  • With the yellow hat the focus is on values, benefits and how something can be done.
  • The green hat sets aside time, space and expectation for creative effort.
  • The blue hat is to do with the organization of thinking. This means setting up the focus and also putting together the outcome.
  • The hats make sure that everyone is using his or her own thinking fully to explore the subject. If you want to show off you now do this by out-performing others on each hat.
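For readers who think in code, the hat cycle can be sketched as a small program. This is purely illustrative: the hat order, function names, and data shapes below are my own, not de Bono’s (he lets the blue hat set the sequence). The point it demonstrates is the one in the list above: everyone contributes under the same hat before the group moves on.

```python
# A minimal sketch of a parallel-thinking session. The hat descriptions
# paraphrase de Bono's summary; everything else is an illustrative assumption.

HATS = {
    "blue":   "organize the thinking: set the focus, collect the outcome",
    "white":  "information: what do we have, what do we need?",
    "red":    "feelings and intuition, no reasons required",
    "yellow": "values, benefits, how it can be done",
    "black":  "caution: faults, weaknesses, what might go wrong",
    "green":  "creative effort: new ideas and alternatives",
}

def run_session(participants, contribute,
                order=("blue", "white", "red", "yellow", "black", "green")):
    """Walk the group through each hat in turn; `contribute(person, hat, prompt)`
    returns that person's input for the current mode of thinking."""
    notes = {}
    for hat in order:
        # Parallel thinking: everyone wears the same hat at the same moment,
        # so disconfirming (black-hat) views are gathered side by side.
        notes[hat] = [contribute(p, hat, HATS[hat]) for p in participants]
    return notes

# Toy usage: participants simply label their contribution.
session = run_session(
    participants=["Ann", "Ben"],
    contribute=lambda person, hat, prompt: f"{person}'s {hat}-hat input",
)
print(session["black"])  # prints ["Ann's black-hat input", "Ben's black-hat input"]
```

The design choice mirrors the method itself: the loop iterates over hats, not over people, so the structure makes it impossible for one participant to stay in black-hat (argumentative) mode while the rest have moved on.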

Using the six thinking hats is a useful approach to counter confirmation bias: the subject is thoroughly explored, and the format defuses argumentative behaviour. The actual quality of the arguments put forward (checked through the black hat) can benefit from a third thinking tool, proposed by Garrett Hardin in Filters Against Folly:

3. For every argument put forward use the black hat to check against Hardin’s concepts ‘literacy’, ‘numeracy’, and ‘ecolacy’.

For everyone to understand what is put forward, Hardin proposes passing every argument through three filters, because:

In the universal role of laymen we all have to learn to filter the essential meaning out of the too verbose, too aggressively technical statements of the experts. Fortunately this is not as difficult a task as some experts would have us believe.

Questions we should ask ourselves to understand any argument are:

  • On ‘literacy’. What are the words that we are using? What do they mean? What do these words mean in reality, if we start working with these concepts?
  • On ‘numeracy’. What are the numbers? What do the numbers mean? What is the relative size of quantifiable factors? Are there scale effects? Can we attach words to the numbers in order to convey meaning?
  • On ‘ecolacy’. In ecological thinking (or systems thinking) we introduce time into the equation of the words and numbers used. If we pursue this action, tactic, or strategy, what will happen next? And what if we keep repeating this? What are the effects over time? What could the perverse effects be?

Hardin stresses the importance of using all three filters:

The skills of Readin’, Writin’, and ‘Rithmetic need to be combined with an attitudinal checklist that asks if the best words have been used, if quantities have been duly considered, and if the consequences of time and repetition have been taken into account. The “bottom line” of an analysis needs to be subjected to filtration that is simultaneously literate, numerate, and ecolate. (..) We use the ecolate filter to ferret out at least the major interconnections. Every proposal of a plausible policy must be followed by the question “And then what?” Not until we have asked this question (and answered it to the best of our ability) are we ready to put a plan into action.
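Hardin’s insistence on using all three filters can be captured as a simple checklist. The sketch below is my own framing, not Hardin’s (he describes filters, not a data format): an argument is only ready for action once none of the three lenses has been skipped.

```python
# An illustrative checklist for Hardin's three filters. Field names and the
# example claim are assumptions made for this sketch.

from dataclasses import dataclass, field

@dataclass
class FilterCheck:
    claim: str
    literacy: list = field(default_factory=list)   # what do the words mean?
    numeracy: list = field(default_factory=list)   # what are the numbers, at what scale?
    ecolacy: list = field(default_factory=list)    # "And then what?" over time and repetition

    def unexamined(self):
        """Return the filters that have not yet been applied to the claim."""
        return [name for name in ("literacy", "numeracy", "ecolacy")
                if not getattr(self, name)]

check = FilterCheck(claim="Subsidize fuel to lower living costs")
check.literacy.append("'Lower living costs' for whom, measured how?")
check.numeracy.append("Cost per household versus total budget impact")
print(check.unexamined())  # prints ['ecolacy']: "And then what?" is still unanswered
```

Only once `unexamined()` comes back empty, in Hardin’s terms, are we ready to put a plan into action.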

By way of summary: understand what confirmation bias is, acknowledge that every individual falls victim to it, and then apply the three thinking tools discussed in this blog. You are now well on your way to better decision making.
