How to Beat Morality Bias in Decision Making

June 2016 was one of those months when groups took center stage over individuals. As Euro 2016 got underway, people no longer supported Man United or Man City, Lazio or Roma, Barça or Real; instead, they supported another group, their country. Then, June 23rd witnessed a clash between supporters and opponents of Brexit. The lead-up to the referendum made me think: it seems nearly impossible to be persuaded by the other side’s rational arguments. Why are we so groupish in our thinking? And if you consider yourself part of a group, will you agree with all or most of the group’s standpoints and base your decisions on them? An Economist article reported that even ‘economists tend to fall into rival camps defined by distinct beliefs’.

In his landmark contribution to humanity’s understanding of itself (according to The New York Times), moral psychologist Jonathan Haidt offers some interesting thoughts on why we are so groupish in the first place. In his book The Righteous Mind, Haidt proposes that natural selection in humans took place not only at the individual level but also at the group level:

Most of human nature was shaped by natural selection operating at the level of the individual. Most, but not all. We have a few group-related adaptations too (. . .). We humans have a dual nature – we are selfish primates who long to be a part of something larger and nobler than ourselves.

When everyone in a group began to share a common understanding of how things were supposed to be done, and then felt a flash of negativity when any individual violated those expectations, the first moral matrix was born.

Natural selection favored increasing levels of (. . .) “group-mindedness”—the ability to learn and conform to social norms, feel and share group-related emotions, and, ultimately, to create and obey social institutions, including religion.

This is a huge insight. Once you are caught up in group thinking, it becomes really hard to see the other side of the story, the story of people in other groups with other moral matrices:

Moral matrices bind people together and blind them to the coherence, or even existence, of other matrices. This makes it very difficult for people to consider the possibility that there might really be more than one form of moral truth, or more than one valid framework for judging people or running a society.

One of the phrases Haidt uses throughout the book is therefore: ‘Morality Binds and Blinds.’

“Yes, but …” I hear you protest as you read this. Surely we are rational human beings, and our rational nature will overcome our biases? Surely we are always trying to get at the absolute truth? Haidt’s research leads him to disagree with the rationalists:

We do moral reasoning not to reconstruct the actual reasons why we ourselves came to a judgment; we reason to find the best possible reasons why somebody else ought to join us in our judgment.

You’ll misunderstand moral reasoning if you think about it as something people do by themselves in order to figure out the truth.

What, then, is the function of moral reasoning? Does it seem to have been shaped, tuned, and crafted (by natural selection) to help us find the truth, so that we can know the right way to behave and condemn those who behave wrongly? If you believe that, then you are a rationalist, like Plato, [and] Socrates (. . .). Or does moral reasoning seem to have been shaped, tuned, and crafted to help us pursue socially strategic goals, such as guarding our reputations and convincing other people to support us, or our team, in disputes? If you believe that, then you are a Glauconian. [Glaucon, Plato’s brother, famously claims that people are only virtuous because of fear of a bad reputation. His argument can be found in Plato’s Republic.]

Haidt makes a very convincing case that our reasoning is mainly an after-the-fact activity: we use it to justify the quick, intuitive moral judgments we have already made. And those moral judgments are based on the moral matrices of the groups we belong to.

My point here is not to counter the idea that we at least try to make rational decisions. But it is worthwhile to keep Haidt’s warning in mind (‘Morality Binds and Blinds’) the next time you enter a project, program, or decision-making process in which several groups with different backgrounds take part. Ask yourself whether your thinking is really an objective weighing of pros and cons, or whether your thoughts fall prey to a morality bias.

A good countermeasure to prevent yourself from falling into the trap of a morality bias, and maybe other biases too, is a rule that the investor and Berkshire Hathaway Vice Chairman Charlie Munger uses:

I have what I call an iron prescription that helps me keep sane when I naturally drift toward preferring one ideology over another. And that is I say “I’m not entitled to have an opinion on this subject unless I can state the arguments against my position better than the people do who are supporting it. I think that only when I reach that stage am I qualified to speak.” Now you can say that’s too much of an iron discipline… it’s not too much of an iron discipline. It’s not even that hard to do.
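Munger states the rule in words, but it can also be read as a small pre-decision gate. Below is a minimal, purely illustrative sketch in Python; the function name qualified_to_speak and its parameters are my own framing of the rule, not anything Munger or Haidt proposes:

# A purely illustrative sketch (my own construction, not Munger's or
# Haidt's) that treats the "iron prescription" as a gate you must pass
# before voicing an opinion in a group decision.

def qualified_to_speak(counterarguments: list[str],
                       passes_opponent_review: bool) -> bool:
    """Munger's gate: you may hold an opinion only if you can state the
    arguments against your position better than its supporters do.
    The honest test is external: someone who actually holds the
    opposing view must agree you stated their case at least as well
    as they would."""
    return len(counterarguments) > 0 and passes_opponent_review

if __name__ == "__main__":
    steelman = ["Their proposal reduces coordination cost.",
                "It has worked for two comparable teams."]
    # The gate stays closed until the steelman survives review by a
    # real opponent, not just your own judgment of it.
    print(qualified_to_speak(steelman, passes_opponent_review=False))  # False
    print(qualified_to_speak(steelman, passes_opponent_review=True))   # True

The design choice worth noticing is that the check is delegated to an opponent rather than to yourself; Haidt’s point is precisely that our own assessment of our reasons tends to be post-hoc justification.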
