Holiday Reading List 2018 – The Rationalist Delusion

Finally, this last year, long overdue, I picked up Daniel Kahneman’s Thinking, Fast and Slow (2011). And what a book it is. If you still think you are a rational human being, deliberately making judgments and weighing pros and cons for every decision you make, consider this quote from Thinking, Fast and Slow:

(..) emotion now looms much larger in our understanding of intuitive judgments and choices than it did in the past. The executive’s decision would today be described as an example of the affect heuristic [a mental shortcut], where judgments and decisions are guided directly by feelings of liking and disliking, with little deliberation or reasoning.

This resonates so strongly with the work of Jonathan Haidt in The Righteous Mind, that it made me think of one of the themes of that book, the rationalist delusion:

As an intuitionist, I’d say that the worship of reason is itself an illustration of one of the most long-lived delusions in Western history: the rationalist delusion. It’s the idea that reasoning is our most noble attribute, one that makes us like the gods (for Plato) or that brings us beyond the “delusion” of believing in gods (for the New Atheists). The rationalist delusion is not just a claim about human nature. It’s also a claim that the rational caste (philosophers or scientists) should have more power, and it usually comes along with a utopian program for raising more rational children.

How’s that for some provocative ideas worth exploring this holiday?

Haidt’s The Righteous Mind is not on this year’s list because I used it in the past for a number of blogs on biases (see this one on morality bias and this one on confirmation bias). But the ideas of that book strongly influenced the way I progressed to the books on this year’s list. Here we go:

Thinking, Fast and Slow. Daniel Kahneman. A landmark book if you want to make better decisions. Kahneman shows that by relying mostly on system 1 (mental shortcuts based on feelings, emotions and morality) in decision-making, and not on system 2 (our rationalist selves), we make predictable errors of judgment. The intuitive system 1 is a lot more influential than you think. Kahneman:

This is the essence of intuitive heuristics [rules of thumb]: when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.

Learn how you fool yourself and read about: the availability heuristic, the What-You-See-Is-All-There-Is (WYSIATI) principle, anchoring bias, the law of small numbers, the halo effect, and many, many more.

The Economist has its own way of describing the rationalist delusion in a review of this outstanding book:

As Copernicus removed the Earth from the centre of the universe and Darwin knocked humans off their biological perch, Kahneman has shown that we are not the paragons of reason we assume ourselves to be.

The Master and His Emissary – The Divided Brain and the Making of the Western World. Iain McGilchrist. This ‘fascinating book’ (Financial Times) discusses a different dichotomy than intuition versus reason. The leading question here is: ‘Why is the brain divided?’ McGilchrist:

(..) the hierarchy of attention, for a number of reasons, implies a grounding role and an ultimately integrating role for the right hemisphere, with whatever the left hemisphere does at the detailed level needing to be founded on, and then returned to, the picture generated by the right.

This book is almost two books in one: the first part is steeped in neuroscience and explains why the brain is divided and which functions the left and right hemispheres perform. (If I had to place Kahneman’s systems 1 and 2 in McGilchrist’s left and right hemispheres, system 1 would reside in the left, and system 2 in the right hemisphere.) In the second part of the book (called ‘How the Brain Has Shaped Our World’), the story unfolds in a dramatic way. McGilchrist takes us on a tour through the Ancient World (Plato, again; also see my blog on him here), the Renaissance, the Enlightenment and the Industrial Revolution, to arrive at some daring propositions. One of the most striking is that the left hemisphere (the Emissary) has become so dominant that it has seized power over the right hemisphere (the Master), creating a Western culture with an obsession with structure, narrow self-interest and a mechanistic view of the world. It reminded me of books on the 2016 reading list by John Gray and Matthew Crawford. True or not, it makes for some great reading and stuff worth discussing over a good glass of wine during the holidays.

Why Buddhism Is True. Robert Wright. No, I’m not going religious on you. And no, I’m not going Buddhist on you. Lauded by The New York Times Book Review, The Guardian, The New Yorker and Scientific American, this book is Darwinian in nature. There’s also a good deal of Kahneman and McGilchrist here:

Again, the part of the brain that controls language [system 2; left hemisphere] had generated a coherent, if false, explanation of behavior – and apparently had convinced itself of the truth of the explanation. The split-brain experiments powerfully demonstrated the capacity of the conscious self to convince itself that it’s calling the shots when it’s not. (..) In short, from natural selection’s point of view, it’s good for you to tell a coherent story about yourself, to depict yourself as a rational, self-aware actor. (..) It is possible to argue that the primary evolutionary function of the self is to be the organ of impression management [note: Haidt has a similar wording in that he talks about the press secretary].

With the help of modern evolutionary psychology, Wright explains that the mind is increasingly seen as having a modular design. Evolution created different modules to size up different situations and act on them. Much of this action goes on without you (the CEO) even knowing that action is being taken. Think of fear, lust, love and many other feelings: are you calling the shots? From a very different angle than Kahneman’s, namely that of Buddhist mindfulness and meditation, Wright arrives at the same conclusion:

(..) our ordinary point of view, the one we’re naturally endowed with, is seriously misleading.

Wright goes on to explain why meditation can help us understand ourselves better:

Mindfulness meditation involves increased attentiveness to the things that cause our behavior – attentiveness to how perceptions influence our internal states and how certain internal states lead to other internal states and to behaviors.

This is an extraordinary book that takes mindfulness meditation out of the esoteric realm. It puts it straight into evolutionary psychology and hands us a tool to help us understand, and improve, our own decision-making.

Mindfulness for Creativity. Danny Penman. Now that I’ve introduced mindfulness meditation above, there needed to be a book on the actual practice of meditation on this year’s list. Mindfulness meditation is still ‘weird’ enough that you have to explain to the world that you are not a tree-hugger, an anarchist or, well, a useless creature in general. Bill Gates, far from being a useless creature, put a book on meditation on his list of the 5 best books of this year. However, even he still felt the need to explain what the benefits of meditation for creativity are, and that it’s nothing to freak out over:

Back when I was avoiding music and TV in the hope of maintaining my focus, I knew that lots of other people were using meditation to achieve similar ends. But I wasn’t interested. I thought of meditation as a woo-woo thing tied somehow to reincarnation, and I didn’t buy into it. Lately, though, I’ve gained a much better understanding of meditation. I’m certainly not an expert, but I now meditate two or three times a week, for about 10 minutes each time. I now see that meditation is simply exercise for the mind, similar to the way we exercise our muscles when we play sports. For me, it has nothing to do with faith or mysticism. It’s about taking a few minutes out of my day, learning how to pay attention to the thoughts in my head, and gaining a little bit of distance from them.

Well, if it’s something that Bill Gates and Steve Jobs (founders of two of the most valuable companies in the world) bought into, I think we should at least give it a try.

If you need more book recommendations, check out the summer reading lists of 2016, 2017 and 2018, and the holiday reading lists of 2016 and 2017.

Happy holidays, and happy reading!


How to Beat Morality Bias in Decision Making

June 2016 was one of those months where groups took center stage over individuals. As Euro 2016 got underway, people no longer supported Man United or Man City, Lazio or Roma, Barça or Real. Instead, they supported another group: their country. Then, June 23rd witnessed a clash between supporters and opponents of Brexit. The lead-up to the referendum made me think: it seems nearly impossible to be persuaded by the rational arguments of the other side. Why are we so groupish in our thinking? And, if you consider yourself part of a group, will you agree with all or most of the group’s standpoints and base your decisions on them? An Economist article reported that even ‘economists tend to fall into rival camps defined by distinct beliefs’.

In his landmark contribution to humanity’s understanding of itself (according to The New York Times), moral psychologist Jonathan Haidt has some interesting thoughts on why we are so groupish in the first place. In The Righteous Mind, Haidt proposes that natural selection in humans took place not only at the individual level but also at the group level:

Most of human nature was shaped by natural selection operating at the level of the individual. Most, but not all. We have a few group-related adaptations too (. . .). We humans have a dual nature – we are selfish primates who long to be a part of something larger and nobler than ourselves.

When everyone in a group began to share a common understanding of how things were supposed to be done, and then felt a flash of negativity when any individual violated those expectations, the first moral matrix was born.

Natural selection favored increasing levels of (. . .) “group-mindedness”—the ability to learn and conform to social norms, feel and share group-related emotions, and, ultimately, to create and obey social institutions, including religion.

This is a huge insight. Once you are caught in group thinking, it becomes really hard to see the other side of the story: the story of people in other groups, with other moral matrices:

Moral matrices bind people together and blind them to the coherence, or even existence, of other matrices. This makes it very difficult for people to consider the possibility that there might really be more than one form of moral truth, or more than one valid framework for judging people or running a society.

One of the phrases Haidt uses throughout the book is therefore: ‘Morality Binds and Blinds.’

“Yes, but …” I hear you protest just reading this. We are rational human beings, after all, and our rational nature will overcome our biases. Surely we are always trying to get to absolute truth? Haidt’s research leads him to disagree with the rationalists:

We do moral reasoning not to reconstruct the actual reasons why we ourselves came to a judgment; we reason to find the best possible reasons why somebody else ought to join us in our judgment.

You’ll misunderstand moral reasoning if you think about it as something people do by themselves in order to figure out the truth.

What, then, is the function of moral reasoning? Does it seem to have been shaped, tuned, and crafted (by natural selection) to help us find the truth, so that we can know the right way to behave and condemn those who behave wrongly? If you believe that, then you are a rationalist, like Plato, [and] Socrates (. . .). Or does moral reasoning seem to have been shaped, tuned, and crafted to help us pursue socially strategic goals, such as guarding our reputations and convincing other people to support us, or our team, in disputes? If you believe that, then you are a Glauconian. [Glaucon, Plato’s brother, famously claims that people are only virtuous because of fear of a bad reputation. His argument can be found in Plato’s Republic.]

Haidt makes a very convincing case that our thinking is mainly an after-the-fact activity to justify our quick, intuitive moral judgments. And those moral judgments are based on the moral matrices of the groups we are members of.

My point here is not to counter the idea that we at least try to make rational decisions. But it is worthwhile to keep Haidt’s warning in mind (‘Morality Binds and Blinds’) the next time you enter a project, program or decision-making process in which several groups with different backgrounds take part. Ask yourself whether your thinking is really an objective weighing of pros and cons, or whether your thoughts fall prey to a morality bias.

A good countermeasure to prevent yourself from falling into the trap of a morality bias – and maybe other biases too – is a rule that investor and Berkshire Hathaway Vice Chairman Charlie Munger uses:

I have what I call an iron prescription that helps me keep sane when I naturally drift toward preferring one ideology over another. And that is I say “I’m not entitled to have an opinion on this subject unless I can state the arguments against my position better than the people do who are supporting it. I think that only when I reach that stage am I qualified to speak.” Now you can say that’s too much of an iron discipline. It’s not too much of an iron discipline. It’s not even that hard to do.
