How to Read Companies’ Success Stories

As I argued before (read this blog post on the halo effect), we like simple stories that account for a company’s success or failure. These stories are almost always delusional. So, what should you keep in mind when you pick up the latest best-selling business book written by a famous CEO or management thinker? The answer: beware of biases. In a beautiful little chapter called ‘The Illusion of Understanding’ in Thinking, Fast and Slow, Nobel Prize winner Daniel Kahneman gives an example of the biases involved when trying to explain Google’s tremendous business success:

The ultimate test of an explanation is whether it would have made the event predictable in advance. No story of Google’s unlikely success will meet that test, because no story can include the myriad of events that would have caused a different outcome. The human mind does not deal well with nonevents. The fact that many of the important events that did occur involve choices further tempts you to exaggerate the role of skill and underestimate the part that luck played in the outcome. Because every critical decision turned out well, the record suggests almost flawless prescience – but bad luck could have disrupted any one of the successful steps. The halo effect adds the final touches, lending an aura of invincibility to the heroes of the story.

Kahneman does not deny that there was skill involved in creating a great company like Google. But because there are so few opportunities to practice building a great company, luck has a greater impact than skill in a business environment. (Contrast this with Roger Federer’s Grand Slam record, where skill matters more than luck.)

What you need to do, then, is be aware of a number of biases, fallacies and hidden statistical rules that may be at play when reading a business success story. The next sections briefly explain the ones I found in Thinking, Fast and Slow that relate to false explanations of business success.

The Halo Effect

Defined as the tendency to like or dislike everything about a person or a company (including things you have not observed), the halo effect can be an obvious bias in business books. In his book on the halo effect, Phil Rosenzweig phrases it like this:

[The halo effect is] the tendency to look at a company’s overall performance and make attributions about its culture, leadership, values, and more. In fact, many things we commonly claim drive company performance are simply attributions based on prior performance.

In other words, we like the performance of the company and then attribute that performance to things like culture, leadership, management techniques and more. We start to like everything about the company and thus create a halo. The actual driver of performance, the causal link, is usually not exposed: there is surprisingly little quantitative data linking performance to a leadership style or a management technique (including highly popular ones like Agile). The halo effect stands between us and judging the different elements of the organization in isolation (leadership, strategy, structure, culture, etc.): it’s either all good, or all bad.

The Narrative Fallacy

A narrative fallacy is a flawed story of the past that strongly shapes our view of the world and, no less important, our expectations of the future. The problem, of course, is that the stories we construct about why things happened are often wrong, oversimplified and too concrete, and therefore cannot serve as a blueprint for future success. Kahneman:

Narrative fallacies arise inevitably from our continuous attempt to make sense of the world. The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen. Any recent salient event is a candidate to become the kernel of a causal narrative. (..) we humans constantly fool ourselves by constructing flimsy accounts of the past and believing they are true.

This is a very powerful trap: we just like stories too much. And once there is a convincing story, we fail to ask more questions.

WYSIATI

We can actually create better stories when we have less information (exacerbating the narrative fallacy!). Less data makes it easier to create a coherent story. What you see is all there is (Kahneman uses the rather ugly abbreviation WYSIATI):

At work here is that powerful WYSIATI rule. You cannot help dealing with the limited information you have as if it were all there is to know. You build the best possible story from the information available to you, and if it is a good story, you believe it. Paradoxically, it is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle. Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.

Instead of asking yourself ‘What would I need to know to form an informed opinion?’, your brain will go along with any story that sounds intuitively right. But, as I argued elsewhere, intuition is usually wrong when it comes to explaining complex problems or environments. Since the business environment is indeed complex (maybe even a complex adaptive system with lots of positive feedback loops), judging a company on little information only feeds the narrative fallacy and, possibly, the halo effect.

Hindsight Bias

We tend to revise the beliefs we previously held so that they fit what actually happened, which makes it really hard to recall those beliefs once an actual outcome has changed them. If you ask people to assign probabilities to certain scenarios beforehand, then show them the actual outcomes and ask them what their initial probability ratings were, they will overestimate the probability they originally assigned to the scenario that actually played out. This is a problem, because this so-called hindsight bias feeds the narrative fallacy: CEOs or entrepreneurs can be portrayed, in hindsight, as having assigned the right probability to the scenario that caused the company to thrive. Hindsight bias turns these leaders into true visionaries. According to Kahneman, they were probably just lucky:

Leaders who have been lucky are never punished for having taken too much risk. Instead, they are believed to have had the flair and foresight to anticipate success, and the sensible people who doubted them [i.e. who assigned better probabilities beforehand] are seen in hindsight as mediocre, timid, and weak. A few lucky gambles can crown a reckless leader with a halo of prescience and boldness.
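
One practical defence against hindsight bias is to write down your probability estimates before the outcome is known and score them afterwards, so hindsight cannot rewrite the record. Below is a minimal sketch in Python using the Brier score; the scenarios and the probabilities are hypothetical, my own illustration rather than anything from Kahneman’s book:

    # Record forecasts before outcomes are known, so hindsight cannot revise them.
    forecasts = {
        "product launch succeeds": 0.7,  # probability assigned in advance (hypothetical)
        "competitor exits market": 0.2,
    }
    outcomes = {
        "product launch succeeds": 1,  # 1 = it happened, 0 = it did not
        "competitor exits market": 0,
    }

    # Brier score: mean squared gap between forecast and outcome (0 = perfect, 1 = worst).
    brier = sum((forecasts[k] - outcomes[k]) ** 2 for k in forecasts) / len(forecasts)
    print(f"Brier score: {brier:.3f}")  # (0.3**2 + 0.2**2) / 2 = 0.065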

Regression to the Mean

Extreme groups (including over- and underperformers in business) will regress to the mean over time. This is a statistical fact; there is no cause behind it (the short simulation after the quote below makes this concrete). Kahneman, when discussing regression to the mean, directs his attention to some of the most lauded management books:

The basic message of Built to Last and other similar books is that good managerial practices can be identified and that good practices will be rewarded by good results. Both messages are overstated. The comparison of firms that have been more or less successful is to a significant extent a comparison between firms that have been more or less lucky. Knowing the importance of luck, you should be particularly suspicious when highly consistent patterns emerge from the comparison of successful and less successful firms. In the presence of randomness, regular patterns can only be mirages. Because luck plays a large role, the quality of leadership and management practices cannot be inferred reliably from observations of success. And even if you had perfect foreknowledge that a CEO has brilliant vision and extraordinary competence, you still would be unable to predict how the company will perform with much better accuracy than the flip of a coin.
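
To see why luck alone produces this pattern, consider a minimal simulation in Python; the firm count and the equal weighting of skill and luck are my own illustrative assumptions, not figures from the book. Each firm’s yearly performance is a fixed skill plus freshly drawn luck, and the top performers of year one reliably fall back in year two with no cause at all:

    import random

    random.seed(1)  # reproducible run

    N_FIRMS, TOP_K = 1000, 50

    # Each firm has a fixed skill; observed performance each year is
    # skill plus fresh, independent luck.
    skills = [random.gauss(0, 1) for _ in range(N_FIRMS)]

    def yearly_performance():
        return [s + random.gauss(0, 1) for s in skills]

    year1 = yearly_performance()
    year2 = yearly_performance()

    # Select the 50 "excellent companies" of year one ...
    top = sorted(range(N_FIRMS), key=lambda i: year1[i], reverse=True)[:TOP_K]

    # ... and watch the same firms drift back toward the mean in year two,
    # simply because the luck component is redrawn.
    print(sum(year1[i] for i in top) / TOP_K)  # high average
    print(sum(year2[i] for i in top) / TOP_K)  # noticeably lower average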

Conclusion

What I am not suggesting is that leadership style and management techniques do not matter. Of course they do: not implementing best business practice already puts your firm at a disadvantage. What I am suggesting, however, is that company performance is less influenced by management and leadership styles than you would like to believe. It all boils down to two questions (Kahneman):

  • Is the environment sufficiently regular to be predictable?
  • Is there an opportunity to learn these regularities through prolonged practice?

If the answer to both questions is yes, you find yourself in an environment where you can acquire a skill. This is why you can acquire high proficiency in tennis, surgery and firefighting (and probably in some management techniques such as Agile or performance management), but not in overall business management: the business environment is not sufficiently predictable and there is no opportunity for prolonged practice (how many chances does an entrepreneur get to build Google?). Sure, a good CEO makes a difference. And please read her latest book. But, while reading, remind yourself of the pitfalls described in this article:

  • the halo effect;
  • the narrative fallacy;
  • what you see is all there is;
  • hindsight bias;
  • regression to the mean.

To conclude, a remarkable quote from The Economist on managers in football that might back up my claims in this post:

Fans lay most of the credit or blame for their team’s results on the manager. So do executives: nearly half of clubs in top leagues changed coach in 2018. Yet this faith appears misplaced. After analyzing 15 years of league data, we found that an overachieving manager’s odds of sustaining that success in a new job are barely better than a coin flip. The likely cause of the “decline” of once-feted bosses like Mr Mourinho is not that they lost their touch, but their early wins owed more to players and luck than to their own wizardry.


Holiday Reading List 2018 – The Rationalist Delusion

Finally, this last year, long overdue, I picked up Daniel Kahneman’s Thinking, Fast and Slow (2011). And what a book it is. If you still thought you were a rational human being, deliberately making judgments and weighing pros and cons for every decision you make, consider this quote from Thinking, Fast and Slow:

(..) emotion now looms much larger in our understanding of intuitive judgments and choices than it did in the past. The executive’s decision would today be described as an example of the affect heuristic [a mental shortcut], where judgments and decisions are guided directly by feelings of liking and disliking, with little deliberation or reasoning.

This resonates so strongly with the work of Jonathan Haidt in The Righteous Mind, that it made me think of one of the themes of that book, the rationalist delusion:

As an intuitionist, I’d say that the worship of reason is itself an illustration of one of the most long-lived delusions in Western history: the rationalist delusion. It’s the idea that reasoning is our most noble attribute, one that makes us like the gods (for Plato) or that brings us beyond the “delusion” of believing in gods (for the New Atheists). The rationalist delusion is not just a claim about human nature. It’s also a claim that the rational caste (philosophers or scientists) should have more power, and it usually comes along with a utopian program for raising more rational children.

How’s that for some provocative ideas worth exploring this holiday?

Haidt’s The Righteous Mind is not on this year’s list because I used it in the past for a number of blogs on biases (see this one on morality bias; and this one on confirmation bias). But the ideas of that book strongly influenced the way I progressed onto the books of this year’s list. Here we go:

Thinking, Fast and Slow. Daniel Kahneman. A landmark book if you want to make better decisions. Kahneman shows that, by relying mostly on system 1 (mental shortcuts based on feelings, emotions and morality) in decision-making, and not on system 2 (our rationalist selves), we make predictable errors of judgment. The intuitive system 1 is a lot more influential than you think. Kahneman:

This is the essence of intuitive heuristics [rules of thumb]: when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.

Learn how you fool yourself and read about: the availability heuristic, anchoring bias, the law of small numbers, What-You-See-Is-All-There-Is, the halo effect, and many, many more.

The Economist has its own way of describing the rationalist delusion in a review of this outstanding book:

As Copernicus removed the Earth from the centre of the universe and Darwin knocked humans off their biological perch, Kahneman has shown that we are not the paragons of reason we assume ourselves to be.

The Master and His Emissary – The Divided Brain and the Making of the Western World. Iain McGilchrist. This ‘fascinating book’ (Financial Times) discusses a different dichotomy than intuition versus reason. The leading question here is: ‘Why is the brain divided?’ McGilchrist:

(..) the hierarchy of attention, for a number of reasons, implies a grounding role and an ultimately integrating role for the right hemisphere, with whatever the left hemisphere does at the detailed level needing to be founded on, and then returned to, the picture generated by the right.

This book is almost two books in one: the first part is steeped in neuroscience, tells us why the brain is divided, and which functions the left and right hemispheres perform. (If I had to place Kahneman’s systems 1 and 2 in McGilchrist’s left and right hemispheres, system 1 would reside in the left, and system 2 in the right hemisphere.) In the second part of the book (called ‘How the Brain Has Shaped Our World’), the story unfolds in a dramatic way. McGilchrist takes us on a tour through the Ancient World (Plato, again; also see my blog on him here), the Renaissance, the Enlightenment and the Industrial Revolution, to arrive at some daring propositions. One of the most striking is that the left hemisphere (the Emissary) has become so dominant that it has seized power over the right hemisphere (the Master), creating a Western culture with an obsession for structure, narrow self-interest and a mechanistic view of the world. When I read this, I had to think of books by John Gray and Matthew Crawford on the 2016 reading list. True or not, it makes for some great reading and stuff worth discussing over a good glass of wine during the Holidays.

Why Buddhism Is True. Robert Wright. No, I’m not going religious on you. And no, I’m not going Buddhist on you. Lauded by The New York Times Book Review, The Guardian, The New Yorker and Scientific American, this book is Darwinian in nature. There’s also a good deal of Kahneman and McGilchrist here:

Again, the part of the brain that controls language [system 2; left hemisphere] had generated a coherent, if false, explanation of behavior – and apparently had convinced itself of the truth of the explanation. The split-brain experiments powerfully demonstrated the capacity of the conscious self to convince itself that it’s calling the shots when it’s not. (..) In short, from natural selection’s point of view, it’s good for you to tell a coherent story about yourself, to depict yourself as a rational, self-aware actor. (..) It is possible to argue that the primary evolutionary function of the self is to be the organ of impression management [note: Haidt has a similar wording in that he talks about the press secretary].

With the help of modern evolutionary psychology, Wright explains that the mind is increasingly seen as having a modular design. Different modules were created by evolution to size up different situations and to act on them. Much of this action goes on without you (the CEO) even knowing that action is being undertaken. Think about things such as fear, lust, love and many other feelings: are you calling the shots? From a very different angle than Kahneman’s, namely that of Buddhist mindfulness and meditation, Wright ends up at the same conclusion:

(..) our ordinary point of view, the one we’re naturally endowed with, is seriously misleading.

Wright goes on to explain why meditation can help us understand ourselves better:

Mindfulness meditation involves increased attentiveness to the things that cause our behavior – attentiveness to how perceptions influence our internal states and how certain internal states lead to other internal states and to behaviors.

This is an extraordinary book that takes mindfulness meditation out of the esoteric realm. It puts it straight into evolutionary psychology and hands us a tool to help us understand, and improve, our own decision-making.

Mindfulness for Creativity. Danny Penman. Now that I have introduced mindfulness meditation above, there needed to be a book on the actual practice of meditation on this year’s list. Mindfulness meditation is still ‘weird’ enough that you have to explain to the world that you are not a tree-hugger, an anarchist or, well, a useless creature in general. Bill Gates, far from being a useless creature, put a book on meditation on his list of the five best books of this year. Yet even he felt the need to explain what the benefits of meditation for creativity are, and that it’s nothing to freak out over:

Back when I was avoiding music and TV in the hope of maintaining my focus, I knew that lots of other people were using meditation to achieve similar ends. But I wasn’t interested. I thought of meditation as a woo-woo thing tied somehow to reincarnation, and I didn’t buy into it. Lately, though, I’ve gained a much better understanding of meditation. I’m certainly not an expert, but I now meditate two or three times a week, for about 10 minutes each time. I now see that meditation is simply exercise for the mind, similar to the way we exercise our muscles when we play sports. For me, it has nothing to do with faith or mysticism. It’s about taking a few minutes out of my day, learning how to pay attention to the thoughts in my head, and gaining a little bit of distance from them.

Well, if it’s something that Bill Gates and Steve Jobs bought into (founders of two of the most valuable companies in the world), I think we should at least give it a try.

If you need more book recommendations, check out the summer reading lists of 2016, 2017 and 2018, and the holiday reading lists of 2016 and 2017.

Happy holidays, and happy reading!


Three thinking tools to overcome confirmation bias

Understanding that your reasoning is undermined by bias is one of the most valuable insights into your own decision-making capabilities. I wrote another piece on biases (Morality Binds and Blinds), but its title may be why it is my least successful blog entry measured by number of reads. Or it might just be too unsettling to think and read about the flaws in your own reasoning. Nevertheless, I think understanding biases is hugely important for better decision making, and I offer you three thinking tools to overcome arguably the most famous bias of all: confirmation bias.

Falling into the trap of confirmation bias: we all do it

An intriguing discussion of confirmation bias can be found in Jonathan Haidt’s The Righteous Mind:

(…) confirmation bias, the tendency to seek out and interpret new evidence in ways that confirm what you already think. People are quite good at challenging statements made by other people, but if it’s your belief, then it’s your possession – your child almost – and you want to protect it, not challenge it and risk losing it.

People outside Haidt’s realm of moral psychology and moral philosophy also figured out that people mainly reason to be right instead of reasoning to get to the truth. Here are some quotes that sum up human nature pretty well, I feel:

The eye sees only what the mind is prepared to comprehend. (Robertson Davies, fiction writer, in: Tempest-Tost.)

For it is a habit of humanity to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy. (Thucydides, chronicler of the Peloponnesian War, 5th century BCE.)

We human beings are egotistical animals; each of us wants to win the argument. (Garrett Hardin, human ecologist, in: Filters Against Folly.)

You would think that educating people will surely get rid of confirmation bias. Or that, surely, the more intelligent people are, the better they will be at coming up with reasons for both sides of any argument? Think again. Quite the opposite is true (again from The Righteous Mind):

The findings get more disturbing. Perkins found that IQ was by far the biggest predictor of how well people argued, but it predicted only the number of my-side arguments. Smart people make really good lawyers and press secretaries, but they are no better than others at finding reasons on the other side. Perkins concluded that “people invest their IQ in buttressing their own case rather than in exploring the entire issue more fully and even handedly”.

This is probably why rhetoric is simultaneously admired and met with scepticism. Is the rhetorician building a solid rational case? Or is she only trying to win people’s minds over for her own view? Plato famously bashed rhetoric and sophistry in his dialogues Gorgias and Republic, making the case that winning arguments with rhetoric is more about persuasion than knowledge. (Plato, in turn, was of course also accused of only coming up with supporting reasoning for his worldview. Even the greatest of thinkers are not immune to confirmation bias, it seems.)

If we agree that confirmation bias is indeed a serious problem, then believing that we are rational beings who always arrive at rational conclusions limits our capacity to reach the best outcomes. Haidt and others (e.g. John Gray; see my blog post on progress for a short discussion of his book The Silence of Animals – on Progress and Other Myths) even go so far as to call this the rationalist delusion. What to do?

Three thinking tools to overcome confirmation bias

1. Put together a group consisting of members with different backgrounds.

Jonathan Haidt’s suggestion: when making a decision, always involve several people with different ideologies and backgrounds, so they can disprove the arguments put forward by individuals (emphasis added):

(..) each individual reasoner is really good at one thing: finding evidence to support the position he or she already holds, usually for intuitive reasons. We should not expect individuals to produce good, open-minded, truth-seeking reasoning, particularly when self-interest or reputational concerns are in play. But if you put individuals together in the right way, such that some individuals can use their reasoning powers to disconfirm the claims of others, and all individuals feel some common bond or shared fate that allows them to interact civilly, you can create a group that ends up producing good reasoning as an emergent property of the social system. This is why it’s so important to have intellectual and ideological diversity within any group (..).

This short passage reminded me of a method I used some time ago to try to get the most out of a group of experts who were all convinced of the superiority of their own solution to the problem at hand:

2. Use parallel thinking to have members look at the problem from the same angle and then switch angles a number of times using six different ‘hats’.

Once you have put together a diverse group of people, engage in parallel thinking. This method has been perfected by psychologist and management thinker Edward de Bono. In his book How to Have a Beautiful Mind, he echoes Haidt and others in noticing:

Argument is an excellent method and has served us well. At the same time (..) it is unsophisticated. Each side makes a ‘case’ and then seeks to defend that case and prove the other ‘case’ to be wrong. It says, in short: ‘I am right and you are wrong.’

De Bono gives a summary of his highly effective way of looking at decision making processes (emphasis added):

  • The direction of thinking is indicated by six coloured hats, each of which indicates a mode of thinking. At any moment everyone is ‘wearing’ the same colour hat. That is what is meant by ‘parallel thinking’.
  • The white hat indicates a focus on information. What do we have? What do we need? How are we going to get the information we need?
  • The red hat gives full permission for the expression of feelings, emotions and intuition without any need to give the reasons behind the feelings.
  • The black hat is for ‘caution’ and the focus is on faults, weaknesses, what might go wrong and why something does not ‘fit’. [This is the classical argumentative mode we often find ourselves in, trying to disprove other people’s arguments.]
  • With the yellow hat the focus is on values, benefits and how something can be done.
  • The green hat sets aside time, space and expectation for creative effort.
  • The blue hat is to do with the organization of thinking. This means setting up the focus and also putting together the outcome.
  • The hats make sure that everyone is using his or her own thinking fully to explore the subject. If you want to show off you now do this by out-performing others on each hat.

Using the six thinking hats is a useful approach to curing confirmation bias: the subject is really explored, and the method should defuse argumentative behavior. The actual quality of the arguments put forward (checked through the black hat) can benefit from a third thinking tool, proposed by Garrett Hardin in Filters Against Folly:

3. For every argument put forward, use the black hat to check it against Hardin’s concepts of ‘literacy’, ‘numeracy’ and ‘ecolacy’.

For everyone to understand what is put forward, Hardin proposes to pass an argument through three filters, because:

In the universal role of laymen we all have to learn to filter the essential meaning out of the too verbose, too aggressively technical statements of the experts. Fortunately this is not as difficult a task as some experts would have us believe.

Questions we should ask ourselves to understand any argument are:

  • On ‘literacy’. What are the words that we are using? What do they mean? What do these words mean in reality, if we start working with these concepts?
  • On ‘numeracy’. What are the numbers? What do the numbers mean? What is the relative size of quantifiable factors? Are there scale effects? Can we attach words to the numbers in order to convey meaning?
  • On ‘ecolacy’. In ecological thinking (or systems thinking) we introduce time into the equation of the words and numbers used. If we pursue this action, tactic or strategy, what will happen next? And what if we keep repeating this? What are the effects over time? What could the perverse effects be?

Hardin stresses the importance of using all three filters:

The skills of Readin’, Writin’, and ‘Rithmetic need to be combined with an attitudinal checklist that asks if the best words have been used, if quantities have been duly considered, and if the consequences of time and repetition have been taken into account. The “bottom line” of an analysis needs to be subjected to filtration that is simultaneously literate, numerate, and ecolate. (..) We use the ecolate filter to ferret out at least the major interconnections. Every proposal of a plausible policy must be followed by the question “And then what?” Not until we have asked this question (and answered it to the best of our ability) are we ready to put a plan into action.

By way of summary: understand what confirmation bias is, acknowledge that every individual falls victim to it, and then apply the three thinking tools discussed in this blog. Now you are well on your way to better decision making.


How to Beat Morality Bias in Decision Making

June 2016 was one of those months in which groups took center stage over individuals. As Euro 2016 got underway, people no longer supported Man United or Man City, Lazio or Roma, Barça or Real. Instead, they supported another group: their country. Then, June 23rd witnessed a clash between supporters and opponents of Brexit. The lead-up to the referendum made me think: it seems nearly impossible to be persuaded by rational arguments from the other side. Why are we so groupish in our thinking? And, if you consider yourself part of a group, will you agree with all or most of the group’s standpoints and base your decisions on those? An Economist article reported that even ‘economists tend to fall into rival camps defined by distinct beliefs’.

In his landmark contribution to humanity’s understanding of itself (according to The New York Times), moral psychologist Jonathan Haidt has some interesting thoughts on why we are so groupish in the first place. Haidt (in his book The Righteous Mind) proposes that natural selection in humans took place not only at the individual level but also at the group level:

Most of human nature was shaped by natural selection operating at the level of the individual. Most, but not all. We have a few group-related adaptations too (. . .). We humans have a dual nature – we are selfish primates who long to be a part of something larger and nobler than ourselves.

When everyone in a group began to share a common understanding of how things were supposed to be done, and then felt a flash of negativity when any individual violated those expectations, the first moral matrix was born.

Natural selection favored increasing levels of (. . .) “group-mindedness”—the ability to learn and conform to social norms, feel and share group-related emotions, and, ultimately, to create and obey social institutions, including religion.

This is a huge insight. Once you are caught in group thinking, it becomes really hard to see the other side of the story, the story of people in other groups with other moral matrices:

Moral matrices bind people together and blind them to the coherence, or even existence, of other matrices. This makes it very difficult for people to consider the possibility that there might really be more than one form of moral truth, or more than one valid framework for judging people or running a society.

One of the phrases Haidt uses throughout the book is therefore: ‘Morality Binds and Blinds.’

“Yes, but …” I hear you protest as you read this. Because we are rational human beings, our rational nature will surely overcome our biases; surely we are always trying to get at the absolute truth? Haidt’s research leads him to disagree with the rationalists:

We do moral reasoning not to reconstruct the actual reasons why we ourselves came to a judgment; we reason to find the best possible reasons why somebody else ought to join us in our judgment.

You’ll misunderstand moral reasoning if you think about it as something people do by themselves in order to figure out the truth.

What, then, is the function of moral reasoning? Does it seem to have been shaped, tuned, and crafted (by natural selection) to help us find the truth, so that we can know the right way to behave and condemn those who behave wrongly? If you believe that, then you are a rationalist, like Plato, [and] Socrates (. . .). Or does moral reasoning seem to have been shaped, tuned, and crafted to help us pursue socially strategic goals, such as guarding our reputations and convincing other people to support us, or our team, in disputes? If you believe that, then you are a Glauconian. [Glaucon, Plato’s brother, famously claims that people are only virtuous because of fear of a bad reputation. His argument can be found in Plato’s Republic.]

Haidt makes a very convincing case that our thinking is mainly an after-the-fact activity to justify our quick, intuitive moral judgments. And moral judgments are based on the moral matrices of the group you are a member of.

My point here is not to counter the idea that we at least try to make rational decisions. But it is worthwhile to keep Haidt’s warning (‘Morality Binds and Blinds’) in mind the next time you enter a project, program or decision-making process in which several groups with different backgrounds take part. Ask yourself whether your thinking is really an objective weighing of pros and cons, or whether your thoughts fall prey to a morality bias.

A good countermeasure to prevent yourself from falling into the trap of a morality bias – and maybe other biases too – is a rule the investor and Berkshire Hathaway Vice Chairman Charlie Munger uses:

I have what I call an iron prescription that helps me keep sane when I naturally drift toward preferring one ideology over another. And that is I say “I’m not entitled to have an opinion on this subject unless I can state the arguments against my position better than the people do who are supporting it. I think that only when I reach that stage am I qualified to speak.” Now you can say that’s too much of an iron discipline... it’s not too much of an iron discipline. It’s not even that hard to do.


The Business Case for Non-Financial Reporting

With an EU directive on non-financial disclosure now being translated into national legislation, social and environmental reporting is becoming a reality for an increasing number of organizations. Although it still isn’t mandatory for all companies to disclose non-financial information, the trend is clearly towards greater disclosure. In this month’s piece, I argue that your organization should start reporting on social and environmental topics even if regulators aren’t (yet) requiring it.

What’s the current status on non-financial reporting? The EU directive states that listed companies, banks and insurance firms will all have to disclose non-financial information on environmental and social topics. My guess is that it won’t stop there: the Netherlands is already opting to extend the scope to other public-interest entities such as pension funds, housing associations and utility firms.

Beyond the group of organizations already mentioned, other firms are also experiencing increasing pressure from stakeholders to disclose more information on their operations and supply chains. Whatever the reason (e.g. regulation, pressure from stakeholders, or maybe even new worldviews such as radical transparency or the purpose economy), companies face mounting pressure to report beyond mere financial statements. We could see this as increased attention to the distribution of “goods” and “bads”, which Garrett Hardin already pointed out in his 1985 classic Filters Against Folly:

Every human activity produces both things that we want – “goods” – and things we don’t want – “bads”. How should society distribute these goods and bads?

Hardin goes on to argue that:

.. most people living today would say that even if it is historically true that the widespread externalizing of business costs was causatively responsible for the rise of modern civilization, we cannot, from here on out, tolerate the practice. Regardless of the past, future policy must insist on internalizing the cost of production.

It seems that, with recently implemented policies like the EU directive on non-financial disclosure, the future Hardin wrote about in 1985 is becoming a reality.

Two concepts that Hardin introduced in Filters Against Folly show that you can use non-financial reporting as an opportunity to boost the competitive advantage of your organization.

Non-financial reporting to update your business strategy

A useful mental model for analyzing the business world is what Hardin calls the “Double C–Double P Game”, which stands for commonize costs and privatize profits. (Commonizing here means spreading the cost of an activity over a population, no matter who profits from that activity. Privatizing means the profit from activities accrues to the company, and the company alone.) This model seems to be under attack now that stakeholders increasingly demand disclosure of which costs are commonized (e.g. land and water use, CO2 emissions, etc.) and which profits are privatized (see the discussion on the relationship between tax evasion and corporate social responsibility in The Economist).

Although stakeholders’ increased scrutiny can seem troublesome to many an organization, given the effort required to dig up information from every nook and cranny of operations, I see an opportunity to use it for a renewed look at the company’s strategy and perceived competitive advantage.

There are numerous frameworks available to help an organization structure its non-financial reporting; these include ISO 26000, the United Nations Global Compact and the Global Reporting Initiative. Rather than just use these as reporting guidelines, a company can actually use these frameworks to understand where it may be losing ground in the “Double C–Double P Game”. You may ask yourself which topics need more attention because costs are becoming increasingly internalized, or how you can continue privatizing the profits of the operations you undertake. Social and environmental analysis might prove indispensable in effectively adjusting your business strategy to a new business environment in which your current business’s specific and carefully crafted “Double C–Double P Game” is losing ground.

Non-financial reporting to build better stakeholder relations

In addition to the benefit of building a strategy matched to the new business environment, there is a second opportunity in reporting on non-financial performance. If done well, a company might actually improve its relationships with all stakeholders by melding together what Hardin calls the literacy, numeracy and ecolacy filters. Companies might be under attack from differing interest groups that are biased toward using one filter only. Literacy, for example, can do much harm if it is not accompanied by numeracy to put statements into perspective. Hardin explains:

There is no royal road to rationality. Whatever means we are tempted to use, we must be wary of the poetic approach. Rhetoric (..) may give one a wonderful “oceanic feeling” (to use Freud’s term), but this feeling is more likely to prevent than to facilitate advances in understanding. It is when ecological rhetoric is most beautiful that we must be most on our guard.

And, again about abusing literacy:

.. the wish to escape debate disguises itself under a multitude of verbal forms: infinity, non-negotiable, never, forever, irresistible, immovable, indubitable, and the recent variant “not meaningfully finite.” All these words have the effect of moving discussion out of the numerate realm, where it belongs, and into a wasteland of pure literacy, where counting and measuring are repudiated.

Of course, the company itself should also avoid vague words and terminology. One of the techniques proposed by Hardin to make clear what is actually meant is operationalism:

Faced with conflicting views, the critical analyst asks, “What operations are implied by these statements?” Once the operations are made clear, difficulties usually evaporate.

After answering the question ‘What do the words actually mean?’ (i.e. literacy), Hardin further proposes to go beyond mere numbers to numeracy:

In spite of its name, numeracy is concerned with more than numbers. The relative size of quantifiable factors is often more important than their exact measures. The importance of scale effects can be appreciated with little actual measurement.

He gives an example where the numerate filter is a useful addition to the literate filter:

Dichotomies are favored over quantities. It is so comforting to divide polluting substances sharply into the categories of “safe” and “unsafe”. (..) Nature is silent. Nature does not tell us when “safe” slips over into “unsafe”; men and women, reasoning together, must legally define “unsafe”. (..)“Safe” and “unsafe” are literate distinctions; nature is numerate. Everything is dangerous at some level. Even molecular oxygen, essential to human life, becomes lethal as the concentration approaches 100 percent.

Finally, Hardin proposes to combine literacy and numeracy with ecolacy ‘to ferret out at least the major interconnections’. With ecolacy we ask the basic question “And then what?” to sort out the unwanted consequences and ‘to grant a modicum of justification for the position of society’s nay-sayers’:

Excessive ecolacy can lead to conservatism of the most stultifying sort. For prudence’s sake, ecolacy must be combined with numeracy. Any action that we take – and inaction is a form of action – leads to some unwanted consequences. Prudence dictates that we compare the advantages and disadvantages of all proposed courses of action, choosing the one that, on balance, is quantitatively best.

By treating non-financial reporting not as ‘just another report’ but as a genuine effort to communicate through multiple lenses (i.e. literate, numerate and ecolate), I see huge opportunities for creating a more effective strategy and better stakeholder relationships with fewer controversies.
