How to Read Companies’ Success Stories

As I argued before (read this blog post on the halo effect), we like simple stories that account for a company’s success or failure. These stories are almost always delusional. So, what should you keep in mind when you pick up the latest best-selling business book written by a famous CEO or management thinker? The answer: beware of biases. In a beautiful little chapter called ‘The Illusion of Understanding’ in Thinking, Fast and Slow, Nobel Prize winner Daniel Kahneman gives an example of the biases involved in explaining Google’s tremendous business success:

The ultimate test of an explanation is whether it would have made the event predictable in advance. No story of Google’s unlikely success will meet that test, because no story can include the myriad of events that would have caused a different outcome. The human mind does not deal well with nonevents. The fact that many of the important events that did occur involve choices further tempts you to exaggerate the role of skill and underestimate the part that luck played in the outcome. Because every critical decision turned out well, the record suggests almost flawless prescience – but bad luck could have disrupted any one of the successful steps. The halo effect adds the final touches, lending an aura of invincibility to the heroes of the story.

Kahneman does not deny that there was skill involved in creating a great company like Google. But since there are few opportunities to practice building a great company, luck has a greater impact than skill in a business environment. (Contrast this with, for example, Roger Federer’s grand-slam record, where skill outweighs luck.)

What you need to do, then, is be aware of a number of biases, fallacies and hidden statistical rules that may be at play when you read a business success story. The next sections briefly explain the ones I found in Thinking, Fast and Slow that relate to false explanations of business success.

The Halo Effect

Defined as the tendency to like or dislike everything about a person or a company (including things you have not observed), the halo effect can be an obvious bias in business books. In his book on the halo effect, Phil Rosenzweig phrases it like this:

[The halo effect is] the tendency to look at a company’s overall performance and make attributions about its culture, leadership, values, and more. In fact, many things we commonly claim drive company performance are simply attributions based on prior performance.

In other words, we like the performance of the company and then attribute that performance to things like culture, leadership, management techniques and more. We start to like everything about the company, and thus create a halo. The actual driver of performance, the causal link, usually goes unexamined: there is surprisingly little quantitative data linking performance to a leadership style or a management technique (including highly popular ones like Agile). The halo effect stands between us and judging the different elements of an organization (leadership, strategy, structure, culture, etc.) in isolation: it’s either all good or all bad.

The Narrative Fallacy

A narrative fallacy is a flawed story of the past that strongly shapes our view of the world and, no less important, our expectations of the future. The problem, of course, is that the stories we construct about why things happen are often wrong, oversimplified and overly concrete, and therefore cannot serve as a blueprint for future success. Kahneman:

Narrative fallacies arise inevitably from our continuous attempt to make sense of the world. The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen. Any recent salient event is a candidate to become the kernel of a causal narrative. (..) we humans constantly fool ourselves by constructing flimsy accounts of the past and believing they are true.

This is a very powerful trap: we just like stories too much. And once there is a convincing story, we fail to ask more questions.

WYSIATI

We can actually create better stories when we have less information (exacerbating the narrative fallacy!). Less data makes it easier to build a coherent story. What you see is all there is (Kahneman uses the rather ugly abbreviation WYSIATI):

At work here is that powerful WYSIATI rule. You cannot help dealing with the limited information you have as if it were all there is to know. You build the best possible story from the information available to you, and if it is a good story, you believe it. Paradoxically, it is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle. Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.

Instead of asking yourself ‘What would I need to know to form an informed opinion?’, your brain will go along with any story that sounds intuitively right. But, as I argued elsewhere, intuition is usually wrong when it comes to explaining complex problems or environments. Since the business environment is indeed complex (maybe even a complex adaptive system with lots of positive feedback loops), judging a company on little information only feeds the narrative fallacy and, possibly, the halo effect.

Hindsight Bias

We tend to revise the beliefs we previously held to match what actually happened, which makes it very hard to recall what those beliefs were before the outcome changed them. If you ask people to assign probabilities to certain scenarios beforehand, then show them the actual outcomes and ask them to recall their initial ratings, they will overestimate the probability they originally assigned to the scenario that actually played out. This is a problem, because this so-called hindsight bias feeds the narrative fallacy: CEOs or entrepreneurs are portrayed, in hindsight, as having assigned the right probability to the scenario that made the company thrive. Hindsight bias turns these leaders into true visionaries. According to Kahneman, they were probably just lucky:

Leaders who have been lucky are never punished for having taken too much risk. Instead, they are believed to have had the flair and foresight to anticipate success, and the sensible people who doubted them [i.e. who assigned better probabilities beforehand] are seen in hindsight as mediocre, timid, and weak. A few lucky gambles can crown a reckless leader with a halo of prescience and boldness.

Regression to The Mean

Extreme groups (including over- and underperformers in business) will regress to the mean over time. This is a statistical fact; there is no cause behind it. Kahneman, when discussing regression to the mean, directs his attention to some of the most lauded management books:

The basic message of Built to Last and other similar books is that good managerial practices can be identified and that good practices will be rewarded by good results. Both messages are overstated. The comparison of firms that have been more or less successful is to a significant extent a comparison between firms that have been more or less lucky. Knowing the importance of luck, you should be particularly suspicious when highly consistent patterns emerge from the comparison of successful and less successful firms. In the presence of randomness, regular patterns can only be mirages. Because luck plays a large role, the quality of leadership and management practices cannot be inferred reliably from observations of success. And even if you had perfect foreknowledge that a CEO has brilliant vision and extraordinary competence, you still would be unable to predict how the company will perform with much better accuracy than the flip of a coin.
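The ‘no cause’ point can be made concrete with a small simulation (a sketch of my own, not from Kahneman’s book): give each firm a fixed ‘skill’ plus a fresh dose of luck every year. The firms that top the year-one league table got there partly through good luck, so the very same firms score lower, on average, the next year, even though nothing about them changed.

```python
import random

random.seed(42)

# Model each firm's observed performance as fixed skill plus transient luck.
N = 10_000
skill = [random.gauss(0, 1) for _ in range(N)]
perf_year1 = [s + random.gauss(0, 1) for s in skill]
perf_year2 = [s + random.gauss(0, 1) for s in skill]

# Pick the top 10% performers of year 1 -- the firms the business books profile.
top = sorted(range(N), key=lambda i: perf_year1[i], reverse=True)[: N // 10]

mean_y1 = sum(perf_year1[i] for i in top) / len(top)
mean_y2 = sum(perf_year2[i] for i in top) / len(top)

print(f"top decile, year 1: {mean_y1:.2f}")  # high: skill AND good luck
print(f"same firms, year 2: {mean_y2:.2f}")  # lower: skill persists, luck resets
```

The year-two average stays above the overall mean (skill is real) but drops well below the year-one figure, purely because luck does not repeat. No story about culture or leadership is needed to explain the decline.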

Conclusion

What I am not suggesting is that leadership style and management techniques do not matter. Of course they do: not implementing best business practice already puts your firm at a disadvantage. What I am suggesting, however, is that company performance is less influenced by management and leadership styles than you would like to believe. It all boils down to two questions (Kahneman):

  • Is the environment sufficiently regular to be predictable?
  • Is there an opportunity to learn these regularities through prolonged practice?

If the answer is yes for both questions, you find yourself in an environment where you can acquire a skill. This is why you can acquire high proficiency in tennis, surgery and firefighting (and probably in some management techniques such as Agile or performance management), but not in overall business management: the business environment is not sufficiently predictable and there’s no opportunity for prolonged practice (how many chances does an entrepreneur get to build Google?). Sure, a good CEO makes a difference. And please read her latest book. But, while reading, remind yourself of the pitfalls described in this article:

  • the halo effect;
  • the narrative fallacy;
  • what you see is all there is;
  • hindsight bias;
  • regression to the mean.

To conclude, a remarkable quote from The Economist on managers in football that might back up my claims in this post:

Fans lay most of the credit or blame for their team’s results on the manager. So do executives: nearly half of clubs in top leagues changed coach in 2018. Yet this faith appears misplaced. After analyzing 15 years of league data, we found that an overachieving manager’s odds of sustaining that success in a new job are barely better than a coin flip. The likely cause of the “decline” of once-feted bosses like Mr Mourinho is not that they lost their touch, but their early wins owed more to players and luck than to their own wizardry.


Holiday Reading List 2018 – The Rationalist Delusion

Finally, this last year, long overdue, I picked up Daniel Kahneman’s Thinking, Fast and Slow (2011). And what a book it is. If you still thought you were a rational human being, deliberately making judgements and weighing pros and cons for every decision you make, consider this quote from Thinking, Fast and Slow:

(..) emotion now looms much larger in our understanding of intuitive judgments and choices than it did in the past. The executive’s decision would today be described as an example of the affect heuristic [a mental shortcut], where judgments and decisions are guided directly by feelings of liking and disliking, with little deliberation or reasoning.

This resonates so strongly with the work of Jonathan Haidt in The Righteous Mind, that it made me think of one of the themes of that book, the rationalist delusion:

As an intuitionist, I’d say that the worship of reason is itself an illustration of one of the most long-lived delusions in Western history: the rationalist delusion. It’s the idea that reasoning is our most noble attribute, one that makes us like the gods (for Plato) or that brings us beyond the “delusion” of believing in gods (for the New Atheists). The rationalist delusion is not just a claim about human nature. It’s also a claim that the rational caste (philosophers or scientists) should have more power, and it usually comes along with a utopian program for raising more rational children.

How’s that for some provocative ideas worth exploring this holiday?

Haidt’s The Righteous Mind is not on this year’s list because I used it in the past for a number of blogs on biases (see this one on morality bias; and this one on confirmation bias). But the ideas of that book strongly influenced the way I progressed onto the books of this year’s list. Here we go:

Thinking, Fast and Slow. Daniel Kahneman. A landmark book if you want to make better decisions. Kahneman shows that, by relying mostly on system 1 (mental shortcuts based on feelings, emotions and morality) in decision-making, and not on system 2 (our rationalist selves), we make predictable errors of judgment. The intuitive system 1 is a lot more influential than you think. Kahneman:

This is the essence of intuitive heuristics [rules of thumb]: when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.

Learn how you fool yourself and read about: the What-You-See-Is-All-There-Is rule, anchoring bias, the law of small numbers, availability bias, the halo effect, and many, many more.

The Economist have their own way of describing the rationalist delusion in a review of this outstanding book:

As Copernicus removed the Earth from the centre of the universe and Darwin knocked humans off their biological perch, Kahneman has shown that we are not the paragons of reason we assume ourselves to be.

The Master and His Emissary – The Divided Brain and the Making of the Western World. Iain McGilchrist. A different dichotomy than intuition and reason is discussed in this ‘fascinating book’ (Financial Times). The leading question here is: ‘Why is the brain divided?’ McGilchrist:

(..) the hierarchy of attention, for a number of reasons, implies a grounding role and an ultimately integrating role for the right hemisphere, with whatever the left hemisphere does at the detailed level needing to be founded on, and then returned to, the picture generated by the right.

This book is almost two books in one: the first part is steeped in neuroscience, tells us why the brain is divided, and describes which functions the left and right hemispheres perform. (If I had to place Kahneman’s systems 1 and 2 in McGilchrist’s hemispheres, system 1 would reside in the left, and system 2 in the right.) In the second part of the book (called ‘How the Brain Has Shaped Our World’), the story unfolds in a dramatic way. McGilchrist takes us on a tour through the Ancient World (Plato, again; also see my blog on him here), the Renaissance, the Enlightenment and the Industrial Revolution, to arrive at some daring propositions. One of the most striking is that the left hemisphere (the Emissary) has become so dominant that it has seized power over the right hemisphere (the Master), creating a Western culture with an obsession for structure, narrow self-interest and a mechanistic view of the world. It reminded me of the books by John Gray and Matthew Crawford on the 2016 reading list. True or not, it makes for some great reading and stuff worth discussing over a good glass of wine during the holidays.

Why Buddhism Is True. Robert Wright. No, I’m not going religious on you. And no, I’m not going Buddhist on you. Lauded by The New York Times Book Review, The Guardian, The New Yorker and Scientific American, this book is Darwinian in nature. There’s also a good deal of Kahneman and McGilchrist here:

Again, the part of the brain that controls language [system 2; left hemisphere] had generated a coherent, if false, explanation of behavior – and apparently had convinced itself of the truth of the explanation. The split-brain experiments powerfully demonstrated the capacity of the conscious self to convince itself that it’s calling the shots when it’s not. (..) In short, from natural selection’s point of view, it’s good for you to tell a coherent story about yourself, to depict yourself as a rational, self-aware actor. (..) It is possible to argue that the primary evolutionary function of the self is to be the organ of impression management [note: Haidt has a similar wording in that he talks about the press secretary].

With the help of modern evolutionary psychology, Wright explains that the mind is increasingly seen as having a modular design. Different modules were created by evolution to size up different situations and take action towards these situations. Much of this action goes on without you (the CEO) even knowing that action is being undertaken. Think about things such as fear, lust, love and many other feelings: are you calling the shots? From a very different angle than Kahneman’s, namely the angle from Buddhist mindfulness and meditation, Wright ends up at the same conclusion:

(..) our ordinary point of view, the one we’re naturally endowed with, is seriously misleading.

Wright goes on to explain why meditation can help us understand ourselves better:

Mindfulness meditation involves increased attentiveness to the things that cause our behavior – attentiveness to how perceptions influence our internal states and how certain internal states lead to other internal states and to behaviors.

This is an extraordinary book that takes mindfulness meditation out of the esoteric realm. It puts it straight into evolutionary psychology and hands us a tool to help us understand, and improve, our own decision-making.

Mindfulness for Creativity. Danny Penman. Now that I have introduced mindfulness meditation above, there needed to be a book on the actual practice of meditation on this year’s list. Mindfulness meditation is still ‘weird’ enough that you have to explain to the world that you are not a tree-hugger, an anarchist or, well, a useless creature in general. Bill Gates, far from being a useless creature, put a book on meditation among his 5 best books of this year. However, even he still felt the need to explain what the benefits of meditation for creativity are, and that it’s nothing to freak out over:

Back when I was avoiding music and TV in the hope of maintaining my focus, I knew that lots of other people were using meditation to achieve similar ends. But I wasn’t interested. I thought of meditation as a woo-woo thing tied somehow to reincarnation, and I didn’t buy into it. Lately, though, I’ve gained a much better understanding of meditation. I’m certainly not an expert, but I now meditate two or three times a week, for about 10 minutes each time. I now see that meditation is simply exercise for the mind, similar to the way we exercise our muscles when we play sports. For me, it has nothing to do with faith or mysticism. It’s about taking a few minutes out of my day, learning how to pay attention to the thoughts in my head, and gaining a little bit of distance from them.

Well, if it’s something that Bill Gates and Steve Jobs buy into (founders of two of the most valuable companies in the world), I think we should at least give it a try.

If you need more book recommendations, check out the summer reading lists of 2016, 2017 and 2018, and the holiday reading lists of 2016 and 2017.

Happy holidays, and happy reading!


2017 Holiday Reading to Blow Your Mind (and expand your perspective)

In November, I mentioned that Peter Drucker thought of management as a liberal art. Last year I already recommended some contemporary philosophy to broaden your horizons (see Holiday Reading to Blow Your Mind). In 2017, I picked up some history books that give grand sweeping views of how we ended up in this day and age. Again, as in 2016’s holiday reading list, I tried to put the books into some kind of broad structure (see the photo directly above). It might give you an indication of what to read next after finishing one of the central books in the photo.

The two central books of this year’s list (note: not written in 2017, just read by me this year) are Fukuyama’s The Origins of Political Order and Harari’s Sapiens.

The Origins of Political Order. Francis Fukuyama. One of the first things I learned from glancing through the chapter headings of this book was that political institutions as we know them did not start with democracy in Greece. Instead, Fukuyama shows us that the Chinese gradually built a successful bureaucracy to govern vast lands from the 8th to the 3rd century BCE. The premise of the book is deceptively simple: countries, empires, republics, etc. that work have a couple of things in common: successful state building, rule of law, and accountability of government. Fukuyama takes us on a tour of world history to show us where each of these three concepts first originated. He also shows us many examples of states that do not function because one or more of the three components are missing. Another thing I got out of this book is that I’ll never again take my country’s institutions for granted. In a way, these institutions have been shaped by an evolutionary process at the societal level, handing us working bureaucracies, rule of law and accountable government. If you won’t take my word for it, here’s what The Spectator wrote about this book of big ideas:

The Origins of Political Order is a magisterial work by an influential scholar, drawing on massive research in the social sciences as well as history and evolutionary biology. It provides a powerful and provocative analysis of the origins of the modern state, of relevance not only to historians and political scientists, but to anyone wishing to understand the nature of democratisation in the modern world and how it is to be achieved.

Sapiens – A Brief History of Humankind. Yuval Noah Harari. Harari takes us both further back in time than Fukuyama does, and into the future. This is a book with as broad a scope as you can imagine. Harari tries to answer the question of why Homo sapiens came to rule (and possibly will destroy) the world. One of his key explanations is our capacity for thinking in concepts that exist outside immediate reality:

(…) the truly unique feature of our language is not its ability to transmit information about men and lions. Rather, it’s the ability to transmit information about things that do not exist at all. As far as we know, only Sapiens can talk about entire kinds of entities that they have never seen, touched or smelled.

Concepts such as democracy, human rights, and religion are all put in a new light:

Yet none of these things exists outside the stories that people invent and tell one another. There are no gods in the universe, no nations, no money, no human rights, no laws and no justice outside the common imagination of human beings. People easily understand that ‘primitives’ cement their social order by believing in ghosts and spirits, and gathering each full moon to dance together around the campfire. What we fail to appreciate is that our modern institutions function on exactly the same basis.

This is a thought-provoking book. You might not like or agree with some of the things Harari has to say, but he surely makes you see the world in a new light.

Reading Sapiens, I couldn’t stop thinking about another great book I read some years back:

The Mating Mind. Geoffrey Miller. One of the questions that always seemed unresolved to me is why homo sapiens evolved in such a peculiar way. Or, better put: why do our ideas, concepts and ideologies (think human rights, democracy and religion, for instance) evolve faster than the snail’s pace of biological evolution? Miller thinks he has the answer. He compares the human brain with a peacock’s tail. For every species, a different trait evolved to be the focus of sexual selection. For the peacock, it was the feathered tail. For humans, argues Miller, it was the mind. Hence our capacity for concepts that sit outside direct reality (cf. Harari); our capacity for creating and telling stories; our capacity to come up with things like democracy and human rights. In Miller’s words:

Once sexual choice seized upon the brain as a possible fitness indicator, the brain was helpless to resist. Any individuals who did not reveal their fitness through their courtship behaviour were not chosen as sexual partners. (…) By opening up our brains as advertisements for our fitness, we discovered whole new classes of fitness indicators, like generosity and creativity. (…) The healthy brain theory proposes that our minds are clusters of fitness indicators: persuasive salesmen like art, music, and humor, that do their best work in courtship, where the most important deals are made.

This is one of those books whose central idea will stay with you. Like Fukuyama’s idea of state building, rule of law, and accountability. And like Harari’s view on concepts existing outside of direct reality.

The other book that I kept thinking about while reading Sapiens, was:

23 Things They Don’t Tell You about Capitalism. Ha-Joon Chang. This book is a bit of an outlier in this list. A lot of big ideas and concepts are discussed in Origins and Sapiens, and wonderful little volumes have been written on some of them. One of these is 23 Things. It is not a history (hence an outlier in this list), but it deals with a concept that features prominently in Sapiens: capitalism. In Sapiens, you are told things like:

Ask a capitalist how to bring justice and political freedom to a place like Zimbabwe or Afghanistan, and you are likely to get a lecture on how economic affluence and a thriving middle class are essential for stable democratic institutions, and about the need therefore to inculcate Afghan tribesmen in the value of free enterprise, thrift and self-reliance.

Where Sapiens looks at capitalism as an ideology and even a religion (you will feel a natural response to protest against this ‘sacrilege’), 23 Things will give you all the more arguments to see what Harari means. Ha-Joon Chang, formerly with the World Bank and now at Cambridge University, explains why we need to think differently about capitalism, and why some truisms repeated over and over by world leaders and big institutions alike might actually not have any truth to them. An important book, not least because it shows us how we are lured into stories. For the record: I’m not saying we should abandon capitalism; I’m saying we should see it for what it really is. And 23 Things is indispensable in helping us do so.

So, Sapiens triggered me to think about books that were already familiar to me (The Mating Mind and 23 Things). Fukuyama, in contrast, triggered me to pick up some new books from my to-read pile. Fukuyama makes an impressive case that religion has meant far more to societies than worship alone. An even more thorough case is made in:

The Evolution of the West: How Christianity Has Shaped Our Values. Nick Spencer. One of the few books I have ever (knowingly) read by a religious writer. Echoing Fukuyama, Nick Spencer argues that Christianity was instrumental in creating individualism in the Western world. Christianity, according to the writer, also shaped rule of law (cf. Fukuyama), humanism, human rights, capitalism, science, atheism, ethics, and democracy, to name just a few concepts covered in this beautifully written book. Warning: this is a rewarding read, but not an easy one. The Economist seems to agree:

It is not a popular thesis but, like a prophet crying in the post-modern wilderness, Mr Spencer provokes reflection that goes far beyond the shallow ding-dongs of the modern culture wars. He wants to make sure Westerners know where they came from as a way to illuminate where they are going.

Another book on the to-read pile that seems an obvious follow-up to Fukuyama is:

Why Nations Fail: The Origins of Power, Prosperity and Poverty. Daron Acemoglu & James A. Robinson. I have not read this one yet. But I’m curious whether Fukuyama’s three concepts of state building, rule of law and accountable government also play a prominent role in this book. From the back cover:

Based on fifteen years of research, and answering the competing arguments of authors ranging from Jeffrey Sachs to Jared Diamond [more on him next], Why Nations Fail blends economics, politics and history to provide a powerful and persuasive way of understanding wealth and poverty.

Finally, Sapiens and The Origins of Political Order evoke strong images of:

Guns, Germs and Steel. Jared Diamond. In its scope, its grand sweeps and its attempt to answer big questions, this is certainly a classic of the genre of big history, big questions, and big answers. I’m sure most of you read this book a long time ago. If not, it is probably enough to say that Jared Diamond’s books appear in the bibliographies of Sapiens, The Origins of Political Order, The Mating Mind, and Why Nations Fail. Harari speaks about Diamond when he says ‘[he] taught me to see the big picture’.

Happy holidays, and happy reading!


Decision Making: What We Can (Not) Learn from Plato

“An early management training on rational decision making” (Hulton Archive/Getty Images)

When I started this blog, I was determined to write some posts on decision making. I am fascinated by the fact that we humans think we are rational beings, while all the latest research (on moral psychology, neuroscience, etc.) clearly contradicts this. I have tried to write about what I ‘discovered’ to be true (by reading other people’s books, that is): that we suffer from a number of ‘errors’ that keep us from making rational decisions. See, for example, my posts ‘morality binds and blinds’ and ‘three tools to overcome confirmation bias’.

How we came to worship reason: enter Plato

As interesting as new research on these topics is, how we came to think of ourselves as rational decision-makers in the first place seems a relevant topic in itself. In an – undoubtedly futile – attempt to work through the entire list of Great Books, I managed to make it through a number of works by Plato. As everybody who enjoyed a classical education will surely know, Plato is considered the high priest of reason. Two quotes from Republic (Plato’s attempt to describe the ideal state) show just how high his regard for reason was:

‘.. when one tries to get at what each thing is in itself by the exercise of dialectic, relying on reason without any aid from the senses, and refuses to give up until one has grasped by pure thought what the good is in itself, one is at the summit of the intellectual realm.’

‘.. reason ought to rule, having the wisdom and foresight to act for the whole, and the spirit ought to obey and support it.’ [Italics added.]

I often found it hard not to be dragged along by Plato’s arguments. He writes with such passion and comes up with many wonderful stories that are still known today. Think about the myth of Atlantis (to be found in the Timaeus), the metaphor of the cave (see Republic), and the famous comparison of the soul to a charioteer with two horses (read Phaedrus). It all makes for fantastic literature actually.

Plato’s unrelenting belief in reason, however, turns him into an enemy of anything that distracts us from pure thought. Poetry, art, passion, emotion: he will have none of it. Why? What follows takes quite some effort to grasp (and likely some leaps of faith): Plato argues that reason leads to pure knowledge, only pure knowledge can lead to what’s truly good, and what’s truly good ultimately leads to happiness. He then argues (in Republic) that everything in the realm outside pure thought – impressions, appearances, beliefs, emotions, and opinions – could thus never lead to a happy life. In short, rational thought is the only way to go about your life; emotions and the senses have no place in leading a good life or in making the right judgments or decisions. This was a defining moment in history: reason won, emotions were out. Or as Jonathan Haidt puts it in The Righteous Mind:

‘Western philosophy has been worshiping reason and distrusting the passions for thousands of years. There’s a direct line running from Plato through Immanuel Kant to [20th century psychology].’

Evidence that Plato is wrong: enter ‘moral reasoning’

Plato believes that we use (and should use) rational thinking because it will lead us to the truth. Plato believes that we argue to get to the truth. In fact, most of his dialogues feature Socrates engaging in arguments about all kinds of topics with the aim of getting to the truth of the matter. An opposing view would be that we do not argue to get to the truth, but to win the argument: we have a sense or intuition for the right course of action, and we use our reasoning to justify that intuition. This is exactly what the field of moral psychology is proposing, and there’s overwhelming evidence that our decision making is highly influenced by emotions (or gut feelings if you will) instead of pure reason. As journalist Stephen Hall puts it in his highly readable Wisdom:

‘What if moral judgment, so central a notion to all schools of philosophy and the centrepiece of every major religion, is not the conscious, deliberate, reasoned discernment of right or wrong we’ve all been led to believe, but is, rather, a subterranean biological reckoning, fed by an underwater spring of hidden emotions, mischievously tickled and swayed by extraneous feelings like disgust, virtually beyond the touch of what we customarily think of as conscience? What if Plato, Socrates, and Aristotle were nothing but a bunch of two-bit, fork-tongued, post hoc rationalizers? What if, every time we decide what is the “right” or “good” thing to do, we are merely responding, like dogs, to the otherwise inaudible whistling of the emotional brain? That is where moral philosophy is headed these days, and it’s being driven by a new generation of philosophers and social psychologists, who have adopted the uniform of the lab coat.’

One of these social psychologists is Jonathan Haidt, who did a lot of groundbreaking work on understanding where our moral reasoning comes from. One of his catchphrases is ‘intuitions come first, strategic reasoning second’. To buttress this claim, he points to research showing that ‘moral thinking is more like a politician searching for votes than a scientist searching for truth’. Some disturbing conclusions he draws about our thinking are:

  • We are obsessively concerned about what others think of us, although much of the concern is unconscious and invisible to us.
  • Conscious reasoning functions like a press secretary who automatically justifies any position taken by the president.
  • With the help of our press secretary, we are able to lie and cheat often, and then cover it up so effectively that we convince even ourselves.
  • Reasoning can take us to almost any conclusion we want to reach.
  • In moral and political matters we are often groupish, rather than selfish. We deploy our reasoning skills to support our team, and to demonstrate commitment to our team.

Why should you care? Or: how can you improve your decision making?

Why should you care about what the latest research has to say about reasoning? You may think you always make use of rational thinking and never engage in moral reasoning, especially not in the workplace. Think again. There are very few domains that are immune to moral reasoning. The exception might be science. But the business environment is certainly not immune: it is highly political and thus vulnerable to moral reasoning.

To improve your business decision making, you might ask yourself these two questions prior to making decisions:

  1. Am I trying to get to the truth (and set my ego aside) or am I trying to win the argument?
  2. Am I tackling this problem logically or am I caught in moral reasoning to try to justify a position I intuitively feel is right (also known as post hoc rationalization)?

Whenever the answer points in the direction of argumentative reasoning or post hoc rationalization, seek the advice of others – others with opposing views, that is. Because the research I have been discussing also shows that we are very well equipped to come up with my-side arguments, but terrible at coming up with other-side arguments. Reading Plato is still recommended – especially to learn where and how our adulation of reason came about – but to improve your decision making you had better stick to the latest insights from psychology and neuroscience.

Three thinking tools to overcome confirmation bias

Understanding that your reasoning is undermined by bias is one of the most valuable insights into your own decision-making capabilities. I wrote another piece on biases (Morality Binds and Blinds), but – perhaps because of the title – it was my least successful blog entry measured by number of reads. Or it might just be too unsettling to think and read about the flaws in your own reasoning. Nevertheless, I think understanding biases is hugely important for better decision making, and I offer you three thinking tools to overcome the arguably most famous of all biases: confirmation bias.

Falling into the trap of confirmation bias: we all do it

An intriguing discussion of confirmation bias can be found in Jonathan Haidt’s The Righteous Mind:

(…) confirmation bias, the tendency to seek out and interpret new evidence in ways that confirm what you already think. People are quite good at challenging statements made by other people, but if it’s your belief, then it’s your possession – your child almost – and you want to protect it, not challenge it and risk losing it.

People outside Haidt’s realm of moral psychology and moral philosophy also figured out that people mainly reason to be right instead of reasoning to get to the truth. Here are some quotes that sum up human nature pretty well, I feel:

The eye sees only what the mind is prepared to comprehend. (Robertson Davies, fiction writer, in: Tempest-Tost.)

For it is a habit of humanity to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy. (Thucydides, chronicler of the Peloponnesian War, 5th century BCE.)

We human beings are egotistical animals; each of us wants to win the argument. (Garrett Hardin, human ecologist, in: Filters Against Folly.)

You would think that education surely gets rid of confirmation bias. Or that the more intelligent people are, the better they will be at coming up with reasons for both sides of any argument? Think again. Quite the opposite is true (again from The Righteous Mind):

The findings get more disturbing. Perkins found that IQ was by far the biggest predictor of how well people argued, but it predicted only the number of my-side arguments. Smart people make really good lawyers and press secretaries, but they are no better than others at finding reasons on the other side. Perkins concluded that “people invest their IQ in buttressing their own case rather than in exploring the entire issue more fully and even handedly”.

This is probably why rhetoric is simultaneously admired and met with scepticism. Is the rhetorician building a solid rational case? Or is she only trying to win people over to her own view? Plato famously bashed rhetoric and sophistry in his dialogues Gorgias and Republic, making the case that winning arguments with rhetoric is about persuasion rather than knowledge. (Plato, in turn, was of course also accused of only coming up with supporting reasoning for his worldview. Even the greatest of thinkers are not immune to confirmation bias, it seems.)

If we agree that confirmation bias is indeed a serious problem, then thinking of ourselves as rational beings who always arrive at rational conclusions limits our capacity to reach the best outcomes. Haidt and others (e.g. John Gray; see my blog post on progress for a short discussion of his book The Silence of Animals – on Progress and Other Myths) even go so far as to call this the rationalist delusion. What to do?

Three thinking tools to overcome confirmation bias

1. Put together a group consisting of members with different backgrounds.

Jonathan Haidt’s suggestion: always involve several people with different ideologies and backgrounds when making a decision, so they can disprove arguments put forward by individuals (emphasis added):

(..) each individual reasoner is really good at one thing: finding evidence to support the position he or she already holds, usually for intuitive reasons. We should not expect individuals to produce good, open-minded, truth-seeking reasoning, particularly when self-interest or reputational concerns are in play. But if you put individuals together in the right way, such that some individuals can use their reasoning powers to disconfirm the claims of others, and all individuals feel some common bond or shared fate that allows them to interact civilly, you can create a group that ends up producing good reasoning as an emergent property of the social system. This is why it’s so important to have intellectual and ideological diversity within any group (..).

This short passage reminded me of a method I used some time ago to try to get the most out of a group of experts who were all convinced of the superiority of their own solution to the problem at hand:

2. Use parallel thinking to have members look at the problem from the same angle and then switch angles a number of times using six different ‘hats’.

Once you have put together a diverse group of people, engage in parallel thinking. This method has been perfected by psychologist and management thinker Edward de Bono. In his book How to Have A Beautiful Mind, he echoes Haidt and others in noticing:

Argument is an excellent method and has served us well. At the same time (..) it is unsophisticated. Each side makes a ‘case’ and then seeks to defend that case and prove the other ‘case’ to be wrong. It says, in short: ‘I am right and you are wrong.’

De Bono gives a summary of his highly effective way of looking at decision making processes (emphasis added):

  • The direction of thinking is indicated by six coloured hats, each of which indicates a mode of thinking. At any moment everyone is ‘wearing’ the same colour hat. That is what is meant by ‘parallel thinking’.
  • The white hat indicates a focus on information. What do we have? What do we need? How are we going to get the information we need?
  • The red hat gives full permission for the expression of feelings, emotions and intuition without any need to give the reasons behind the feelings.
  • The black hat is for ‘caution’ and the focus is on faults, weaknesses, what might go wrong and why something does not ‘fit’. [This is the classical argumentative mode we often find ourselves in, trying to disprove other people’s arguments.]
  • With the yellow hat the focus is on values, benefits and how something can be done.
  • The green hat sets aside time, space and expectation for creative effort.
  • The blue hat is to do with the organization of thinking. This means setting up the focus and also putting together the outcome.
  • The hats make sure that everyone is using his or her own thinking fully to explore the subject. If you want to show off you now do this by out-performing others on each hat.

Using the six thinking hats is a useful way to curb confirmation bias: the subject is genuinely explored, and argumentative behavior is defused. The actual quality of the arguments put forward (checked through the black hat) can benefit from a third thinking tool, proposed by Garrett Hardin in Filters Against Folly:

3. For every argument put forward use the black hat to check against Hardin’s concepts ‘literacy’, ‘numeracy’, and ‘ecolacy’.

For everyone to understand what is put forward, Hardin proposes to pass an argument through three filters, because:

In the universal role of laymen we all have to learn to filter the essential meaning out of the too verbose, too aggressively technical statements of the experts. Fortunately this is not as difficult a task as some experts would have us believe.

Questions we should ask ourselves to understand any argument are:

  • On ‘literacy’. What are the words that we are using? What do they mean? What do these words mean in reality, if we start working with these concepts?
  • On ‘numeracy’. What are the numbers? What do the numbers mean? What is the relative size of quantifiable factors? Are there scale effects? Can we attach words to the numbers in order to convey meaning?
  • On ‘ecolacy’. In ecological thinking (or systems thinking) we introduce time into the equation of the words and numbers used. If we pursue this action, tactic, or strategy, what will happen next? And what if we keep repeating this? What are the effects over time? What could the perverse effects be?

Hardin stresses the importance of using all three filters:

The skills of Readin’, Writin’, and ‘Rithmetic need to be combined with an attitudinal checklist that asks if the best words have been used, if quantities have been duly considered, and if the consequences of time and repetition have been taken into account. The “bottom line” of an analysis needs to be subjected to filtration that is simultaneously literate, numerate, and ecolate. (..) We use the ecolate filter to ferret out at least the major interconnections. Every proposal of a plausible policy must be followed by the question “And then what?” Not until we have asked this question (and answered it to the best of our ability) are we ready to put a plan into action.

By way of summary: understand what confirmation bias is, acknowledge that every individual falls victim to it, and then apply the three thinking tools discussed in this blog. You are then well on your way to better decision making.

On the Need for Redundancy in Project Plans

As a preface to this blog entry, forget about the strict definition of redundancy found in the Oxford English Dictionary, which tells you redundancy means ‘no longer needed or useful’. As will become clear – I hope – redundancy is needed to have room to maneuver when things happen that you could not possibly have predicted. In other words, redundancy gives you optionality. See redundancy not as superfluous, but as insurance – or even as an investment.

Projects fail to meet deadlines and budgets all the time. While rereading random passages in Nassim Taleb’s stimulating and provocative books The Black Swan and Antifragile, it struck me that I always try to build a little redundancy into project plans because ‘you never know what will happen’ – almost never taking the time to consider the rationale behind my subconscious whispering ‘you never know what will happen’.

Taleb shares some nice insights on the reasons why redundancy (in general) is useful. I hope reading this blog entry will make you look at redundancy – and the world – from a somewhat different angle. (In a way, it is what this blog is all about: discussing concepts and tools that change the way you look at the world. Remember Proust? ‘My destination is no longer a place, rather a new way of seeing.’)

I will discuss two of Taleb’s concepts that will make it easier for you to include some well-contemplated redundancy in your future projects. Including redundancy will definitely increase the success rate of your projects. But you need to be able to explain why you included it in your budget. Here’s how to do that.

Concept 1: the world is more random than you think

Taleb constantly challenges you on how you look at the world. One of the main themes in his books is that the world is more random than we think, and that we are often fooled by this randomness. In Antifragile he argues:

Black Swans (…) are large-scale unpredictable and irregular events of massive consequence – unpredicted by a certain observer (…). I have made the claim that most of history comes from Black Swan events, while we worry about fine-tuning our understanding of the ordinary, and hence develop models, theories, or representations that cannot possibly track them or measure the possibility of these shocks.

Black Swans hijack our brains, making us feel we “sort of” or “almost” predicted them, because they are retrospectively explainable. (…) Life is more, a lot more, labyrinthine than shown in our memory – our minds are in the business of turning history into something smooth and linear, which makes us underestimate randomness.

In The Black Swan, Taleb claims that large scale events cannot be predicted (and are in effect random to the observer):

I discovered (…) that no researcher has tested whether large deviations in economics can be predicted from past large deviations – whether large deviations have predecessors, that is. (…) My results were that regular events can predict regular events, but that extreme events, perhaps because they are more acute when people are unprepared, are almost never predicted from narrow reliance on the past. The fact that this notion is not obvious to people is shocking to me. It is particularly shocking that people do what are called “stress tests” by taking the worst possible past deviation as an anchor event to project the worst possible future deviation, not thinking that they would have failed to account for that past deviation had they used the same method on the day before the occurrence of that past anchor event.

In Antifragile, he goes so far as to call this a mental defect:

I have called this mental defect the Lucretius problem, after the Latin poetic philosopher who wrote that the fool believes that the tallest mountain in the world will be equal to the tallest one he has observed. (Note: Read the wonderful poem on science and philosophy by Lucretius called On the Nature of Things. In the 2007 Penguin edition, the translation of the passage Taleb is referring to actually reads as follows: “And any stream will seem to be, to one who’s never seen a larger, the greatest of rivers (…). Indeed, anything we see, we shall imagine, is the largest specimen of its kind if it’s the largest we’ve laid eyes on.”)

So, taking into account that randomness is a fact of life and that we cannot predict big events, there needs to be some redundancy to counter this. The next time you sit down and think through the things that can hurt your project plan, think ‘randomness’ and ‘Lucretius problem’.
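The Lucretius problem can even be made quantitative. The following toy simulation is my own sketch, not Taleb’s; the Pareto shape parameter and sample sizes are arbitrary assumptions. It draws heavy-tailed ‘project shocks’ and checks how often the future produces a deviation larger than anything in the historical record:

```python
import random

random.seed(42)

def lucretius_trial(past_n=100, future_n=100):
    """Draw heavy-tailed shocks; report whether the future produces
    a deviation larger than anything seen in the past."""
    # Pareto-distributed shocks: heavy tail, occasional extreme values
    past = [random.paretovariate(1.5) for _ in range(past_n)]
    future = [random.paretovariate(1.5) for _ in range(future_n)]
    return max(future) > max(past)

# In what fraction of trials is the historical worst case exceeded?
trials = 10_000
exceeded = sum(lucretius_trial() for _ in range(trials)) / trials
print(f"Past worst case exceeded in {exceeded:.0%} of trials")
```

With equally long past and future windows, the historical maximum is exceeded about half the time: the tallest mountain you have observed is a poor ceiling on the mountains still to come.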

Concept 2: the world is more random than we lead ourselves to believe

A second reason why you need redundancy – there might be more reasons, but my aim here is to introduce two new ways of looking at the world offered by Taleb – is the narrative fallacy. Planning is nothing more and nothing less than a narrative – a story – you create around your project based on your previous experiences with projects. You want this story (the budget and the planning) to unfold according to plan. You take with you all the things – the good and the not so good – that happened in previous projects, create a narrative for why they happened, and include that knowledge in your current plan. I wrote about narratives and stories before in my blog entry Successful Businesses and the Halo Effect. It is almost as if one is rereading Rosenzweig’s comments on stories in The Halo Effect when Taleb writes:

We like stories, we like to summarize, and we like to simplify, i.e., to reduce the dimension of matters. The first of the problems of human nature that we examine in this section (…) is what I call the narrative fallacy. (…) The fallacy is associated with our vulnerability to overinterpretation and our predilection for compact stories over raw truths. It severely distorts our mental representation of the world; it is particularly acute when it comes to the rare event.

And:

If narrativity causes us to see past events as more predictable, more expected, and less random than they actually were, then we should be able to make it work for us as therapy against some of the stings of randomness.

Needless to say, Taleb argues that we are ill prepared for randomness and will always be fooled by our tendency to attach an explanatory narrative to events in the past. The narrative, however, will never prepare you for events in the future.

Redundancy as buffer against unforeseen events

The combined effects of the randomness of the environment, the Lucretius problem, and the narrative fallacy create a background against which you can easily underestimate the impact of unforeseen events on your project (or business plan, or even life itself). Build in some redundancy and use the concepts discussed here to rationalize the hunch that tells you: ‘you never know what will happen’. It’ll make your project plans more robust and realistic, and it’ll give you options. A final word from Antifragile:

Redundancy is ambiguous because it seems like a waste if nothing unusual happens. Except that something unusual happens – usually.
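To see what a redundancy buffer buys you, here is a minimal sketch of my own making (not from Taleb; the task count, the 20% buffer, and the lognormal overrun model are all illustrative assumptions). It compares how often a project lands within budget with and without a buffer:

```python
import random

random.seed(7)

def project_cost(n_tasks=20):
    # Each task is estimated at 1.0; the actual cost is lognormal around
    # that estimate, so overruns are occasionally large (right-skewed)
    return sum(random.lognormvariate(0, 0.5) for _ in range(n_tasks))

estimate = 20.0   # sum of the per-task estimates
buffer = 0.20     # a hypothetical 20% redundancy

runs = 10_000
within_bare = sum(project_cost() <= estimate for _ in range(runs)) / runs
within_buffered = sum(project_cost() <= estimate * (1 + buffer)
                      for _ in range(runs)) / runs

print(f"Within budget without buffer: {within_bare:.0%}")
print(f"Within budget with buffer:    {within_buffered:.0%}")
```

The exact numbers depend entirely on the assumed distribution; the point is only that a modest buffer turns a plan that usually fails into one that usually holds – which is the ‘insurance’ reading of redundancy.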

Summer Reading

All my posts so far have introduced books that I think are well worth reading during your summer break. (See the full list of books referred to in previous posts at the end of this entry.) Why read? Nassim Taleb in Antifragile:

. . .  the worst thing one can do to feel one knows things a bit deeper is to try to go into them a bit deeper. The sea gets deeper as you go further into it, according to a Venetian proverb. Curiosity is antifragile, like an addiction, and is magnified by attempts to satisfy it – books have a secret mission and ability to multiply, as everyone who has wall-to-wall bookshelves knows well.

So, what I am trying to do is dig a little deeper and sometimes also further back in time to understand how some ideas first came into being. As Charlie Munger put it:

The more basic knowledge you have . . . the less new knowledge you have to get.

My additional recommendations for this summer do not all go back to the source, but are rather books about very old books. I hope you will enjoy them as much as I did.

The Mighty Dead – Why Homer Matters, by Adam Nicolson. I reread both the Iliad and the Odyssey recently and, to be honest, found reading the poems rather hard work. Then I stumbled across this 2014 book. I started reading it to understand Homer better, but the book amazes as an archaeological, Indiana Jones-like journey through Europe and Eurasia. The Daily Telegraph wrote: ‘. . . a compelling case for viewing Homer as a cluster of the qualities that still underlie our civilisation. He is horror. He is honour. He is home. He is us.’

The Lagoon – How Aristotle Invented Science, by Armand Marie Leroi. Not many people would want to dig through all that Aristotle wrote on biology and natural philosophy. Instead, you might really enjoy this book when you travel to Greece. Or anywhere else where they serve great seafood really. As the Observer put it: ‘This big, sumptuous book made me hungry. Intellectually, to learn about the classical world’s take on what we now call science. But it made me viscerally and literally hungry: for grilled fish, oysters, figs and meze, and to sit on the shores of the Aegean idling at barnacles and cuttlefish copulating in the spume. Not bad for a science book.’

The Great Sea – A Human History of the Mediterranean, by David Abulafia. (I seem a bit biased towards the Mediterranean this summer.) To counter the stories about just one or a few – although great – men or books, I will attempt to read this sweeping history of the Mediterranean. I hope to find another great travel book combined with lots of new insights into how our current world came into being. The Sunday Times writes: ‘His book is full of intrepid explorers, anxious pilgrims, enterprising merchants, ambitious politicians and terrified refugees . . . such a treasure trove.’

The books introduced in my posts so far are all well worth taking on your summer holiday as well:

Enjoy your summer.

Successful Businesses and the Halo Effect

My thinking on business strategy and execution has been influenced mostly by the work of Igor Ansoff (see Implanting Strategic Management), so I have always wondered why business books like In Search of Excellence and Good to Great had such huge success. Ansoff stressed that business performance is highly dependent on the specific environment your business finds itself in at any given time. Consequently, he was very much against the notion of an ‘if you do this in any situation, it’ll always work’ approach as advocated by such business books.

So, following Ansoff, to me it seems that the prescriptive mantras for long-lasting high business performance are just not in sync with a situational or conditional approach (based on so-called turbulence levels of the business environment). Moreover, companies hailed as consistent high performers at one point in time, will sooner or later come crashing down. (As you may well ascertain when browsing through the companies listed as high performers in older business blockbusters.)

So, the question really is: what’s going on here? Are there really business laws (like laws in physics) that safeguard lasting high business performance? Or are these laws a mere fallacy?

It was only recently that I came across a fascinating theory called the Halo Effect, by Phil Rosenzweig (the book of the same name was first published in 2006, with an updated version published in 2014). It tries to shed some light on the apparent attractiveness of a recipe for long-lasting business success. Rosenzweig argues that our thinking about business performance is shaped by a number of delusions:

For all their claims of scientific rigor, for all their lengthy descriptions of apparently solid and careful research, they [i.e. science in business books] operate mainly at the level of storytelling. They offer tales of inspiration that we find comforting and satisfying, but they’re based on shaky thinking. They’re deluded.

He goes on naming a number of delusions in our thinking about business performance. The pre-eminent delusion Rosenzweig names the Halo Effect:

[The Halo Effect is the] tendency to look at a company’s overall performance and make attributions about its culture, leadership, values, and more. In fact, many things we commonly claim drive company performance are simply attributions based on prior performance.

It’s not so much the result of conscious distortion as it is a natural human tendency to make judgments about things that are abstract and ambiguous on the basis of other things that are salient and seemingly objective. The Halo Effect is just too strong, the desire to tell a coherent story too great, the tendency to jump on bandwagons too appealing.

It turns out that most business blockbusters that tell you precisely which companies to mimic for success suffer from the Halo Effect. Consequently, the companies that are given as examples in most business books (e.g. Xerox in In Search of Excellence, Fannie Mae in Good to Great; also see this article in The Economist) are not consistent high performers after all:

Yet for all their promises of exhaustive research, Collins and Porras [in Built to Last: Successful Habits of Visionary Companies] didn’t address a basic problem: the Halo Effect. Much of the data they gathered came from the business press, from books, and from company documents, all sources that are likely to contain Halos.

You would have been better off investing randomly than putting your money on Collins and Porras’s visionary companies.

But why the appeal then? Why are these prescriptive books such a huge success? Time after time? The answer, Rosenzweig argues, is that we like stories:

Managers don’t usually care to wade through discussions about data validity and methodology and statistical models and probabilities. We prefer explanations that are definitive and offer clear implications for action. We like stories.

Now, there’s nothing wrong with stories, provided we understand that’s what we have before us. More insidious, however, are stories that are dressed up to look like science. They’re better described as pseudo-science.

And:

Readers, too, prefer clear stories. We don’t really want to hear about partial causation or incremental effects or threats to validity. And there’s a further problem compounding all of this. As Harvard psychologist Steven Pinker observed, university departments don’t always represent meaningful divisions of knowledge. If you’re a professor of marketing, you care a lot about market orientation and customer focus, and there’s a natural tendency to want to demonstrate the importance of your specialty.

Does this mean that everything that is written about good business practices is just nonsense and everything might as well be left to chance? No:

Success is not random – but it is fleeting. Why? Because as described by the great Austrian economist Joseph Schumpeter, the basic force at work in capitalism is that of competition through innovation – whether of new products, or new services, or new ways of doing business. Where most economists of his day assumed that companies competed by offering lower prices for similar goods and services, Schumpeter’s 1942 book, Capitalism, Socialism and Democracy, described the forces of competition in terms of innovation.

But the main point is that high performance is difficult to maintain, and the reason is simple: In a free market system, high profits tend to decline thanks to what one economist called “the erosive forces of imitation, competition, and expropriation.” Rivals copy the leader’s winning ways, new companies enter the market, consulting companies spread best practices, and employees move from company to company.

These findings show that performance is not random but persists over time, yet there is also a tendency to move toward the middle, a clear regression toward the mean. Competitive advantage is hard to sustain. Nothing recedes like success.

However, real science on business performance, as opposed to mere storytelling, is out there. But it might not make for such a good story:

Anita McGahan at Boston University and Michael Porter at Harvard Business School set out to determine how much of a business unit’s profits can be explained by the industry in which it competes, by the corporation it belongs to, and by the way it is managed. This last category, which they called “segment-specific effects,” covers just about everything we’ve talked about (…): a company’s customer orientation, its culture, its human resource systems, social responsibility, and so forth. Using data from thousands of U.S. companies from 1981 to 1994, McGahan and Porter found that “segment-specific effects” explained about 32 percent of a business unit’s performance. Just 32 percent. The rest was due to industry effects or corporate effects or was simply unexplained. So maybe all of the studies we’ve looked at make sense after all! It’s just that, as we suspected, their efforts overlap – they all explain the same 32 percent [italics mine]. Each study claims to have isolated an important driver of performance, but only because of the Delusion of Single Explanations.

Rosenzweig steers clear of offering his own recipe for long-lasting business success. However, understanding that strategy always involves taking risks, that links between inputs and outcomes are sketchy at best, and that flawless execution (once you have made up your mind about your strategic direction) is needed at all times can serve as a good starting point for discussing your company’s performance. It also means you do not have to resort to the latest four-, five-, or eight-point list promising the holy grail of everlasting high business performance.

So, what’s the very mundane advice that Rosenzweig has to offer to managers?

When it comes to managing a company for high performance, a wise manager knows: (1) Any good strategy involves risk. (2) If you think your strategy is foolproof, the fool may well be you. (3) Execution, too, is uncertain – what works in one company with one workforce may have different results elsewhere. (4) Chance often plays a greater role than we think, or than successful managers usually like to admit. (5) The link between input and outputs is tenuous. But when the die is cast, the best managers act as if chance is irrelevant – persistence and tenacity are everything.

Will all of this guarantee success? Of course not. But I suspect it will improve your chances of success, which is a more sensible goal to pursue.

If you want to read your management books more critically, the lessons drawn by Rosenzweig in The Halo Effect might just be invaluable.
