Chapter Two


Bust your biases

See through the games your brain plays

It is paradoxical yet true to say that the more we know, the more ignorant we become in the absolute sense, for it is only through enlightenment that we become conscious of our limitations. Precisely one of the most gratifying results of intellectual evolution is the continuous opening up of new and greater prospects.

Nikola Tesla

BENEFITS OF THIS MENTAL TACTIC

Systematically surfacing and examining your biases is a way to expand your field of vision – to not only see more clearly but also to see more broadly. This is critical at the evidence-gathering stage. The quality of your analyses and recommendations (outputs) is a function of your inputs (the data and evidence you assemble). Your goal should be to be as objective and comprehensive as possible.

Remember, these are only tendencies. Not everything we talk about here will resonate with you. You might even feel like you never do any of these things (though we might politely suggest that puts you at risk of the overconfidence bias), and that’s OK. This mental tactic is an antidote to snap judgements and an aid to thinking clearly about situations and people.

We think this mental tactic will help you when you are in information-hunting mode, processing mode or reflecting mode.

SEE THROUGH THE GAMES YOUR BRAIN PLAYS

Have you ever arrived home at the end of a long day and not been quite sure how you got there? Do you order the same thing over and over again for lunch, even if somewhere in your mind you are aware there is something you might like more if only you could discover it? Have you ever met someone new who looked a bit like a friend of yours and immediately formed a view about what their personality must be like? Even months later, once you realised they were nothing like your friend, did it still take you a while to shake your initial assumptions?

We have talked about how to overcome your blind spots and how to detect and change erroneous beliefs. Now we will talk about what happens when you start to gather evidence to update those beliefs. We want to highlight the ways in which our human processing skills are limited and how strongly instinctive reactions drive our decisions, and to guide you through the predictable biases you might encounter. We will then give you some tools to overcome them.

COGNITIVE BIASES

In recent decades, psychologists and behavioural economists have uncovered a number of cognitive biases that systematically distort our information-gathering and decision making. Dual process theory argues that human decisions are guided by two separate systems.25 System 1 is the older system, a hangover from our past as primates, and relies on our intuition. System 2, on the other hand, is guided by reason and thought.26 You can think of System 1 as the instinctive system and System 2 as the deliberate system.

System 1 thinking is typically much faster, more frequent and automatic, but relies on stereotypes and rough approximations. Our ancestors didn’t need to know whether the blurry object on the horizon was a lion or a rhino. The only important thing was that the fight-or-flight trigger be pressed. System 1 thinking is a kind of mental autopilot. If you’ve ever pulled your car into your driveway while lost in thought and been a little startled to find yourself there, System 1 was doing the guiding. System 1 is incredibly useful as it allows us to navigate the world without constantly having to make and re-make decisions, preventing us from becoming totally overwhelmed.

System 2 thinking is often characterised as deliberate, effortful and reflective. We use System 2 thinking, for instance, when trying to form sentences in a language we are just learning, following the directions to assemble IKEA furniture, or making the trade-offs associated with opening the company’s new business unit.

Many of the biases that behavioural scientists have uncovered over the last few decades stem from using System 1 thinking in situations that require more thought and reflection – that is, in situations where System 2 might serve us better. In many cases, System 1 thinking narrows our perspective in a way that makes us focus only on one particular aspect of a situation, ignoring all others. Here we’re going to explore the ways in which biases can limit our ability to effectively gather information – and how to do better.

Evolution helps – and hurts

As the state of cognitive science, social psychology and research into judgement and decision making expands, we learn ever more about the limitations of human beings as information processors and decision makers. To date, researchers have documented more than a hundred cognitive biases that can impede our judgement. Ironically, it is precisely some of these biases that have allowed human beings to survive as long as we have. The world is a complex place, and there is a great deal of data coming at us all at once. You can think of our biases almost like blinders that help us sort friend from foe, food from poison, threat from opportunity. In the modern world, our information-processing demands are frequently much more complex.

Our goal here is to identify some categories of blinders that are likely to affect you – and all of us – as we make decisions. By definition, this list cannot be complete, but we hope that you will take three things away from this:

  • Information-gathering and processing are rooted in our animalistic selves; truth-seeking is secondary to rapid information processing and self-preservation.
  • There are a number of predictable biases that will surface again and again as you and your teams attempt to gather information systematically and make sense of it.
  • You can nevertheless take steps to recognise, reduce and counteract these biases. We will share some of those tools here.

Of the more than a hundred cognitive biases that researchers have now documented,27 we will introduce those that we consider to be the most important to the evidence-gathering process.

THE 3Ss: SIMPLIFYING, SENSE-MAKING AND STICKING

Here we share a short introduction to several of the biases that you are most likely to see at play in your life and work. Since Part 1 is about collecting evidence, we’re going to focus only on the biases that limit your observation at the evidence-gathering stage. Don’t worry, some of the others will make an appearance later as we get into the deciding and taking-action phases.

As we begin to process information in preparation for making a decision, we typically encounter three types of distortions. We simplify, we try to make sense of what we are observing and then we stick to the view we have formed. Without awareness and training to see through these biases, it’s almost impossible to catch them in the moment.

We simplify and stereotype too fast

In some ways, our brains are like sharks powering through the water in search of prey. Human beings are gifted information-seekers, but like a hungry shark we will often bite down on the first piece of data to cross our path, without pausing to examine it closely. Latching on to a single point of data isn’t a problem in itself. In fact, it can often be very efficient. The issue is that it tends to colour all other information that we receive: a phenomenon known as anchoring. These single points of evidence are the things we remember and use to make decisions, often at the expense of more robust averages or representative examples.

The broad category of stereotyping includes a number of biases relevant to observation and data collection, such as the ‘conjunction fallacy’. Take the original example that Tversky and Kahneman offer in their seminal 1983 paper:

“Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.”

They then go on to ask: “Which of the following two statements is more likely to be true?

  1. Linda is a bank teller.
  2. Linda is a bank teller and active in the feminist movement[…]”28

It turns out that the overwhelming majority of people, when presented with this case, answer #2. This is even true for many statisticians. If you look closer, however, you notice that the likelihood of two statements being true together (A is true and B is true) can never be higher than the likelihood of just one of them (A is true). But because Linda’s description fits so neatly into our preconception of a feminist, we immediately jump at explanation number 2.
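In the language of probability theory, this is the conjunction rule. For any two events A and B:

P(A and B) = P(A) × P(B | A) ≤ P(A)

Since the conditional probability P(B | A) can never exceed 1, a combined statement can never be more likely than either of its parts. Whatever we believe about Linda, the probability that she is a bank teller and a feminist cannot exceed the probability that she is a bank teller.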

Take a look at the following graphic. The overlap (shaded area) visualises the probability of Linda being a bank teller and a feminist. The overlapping area will always be smaller than (or equal to) the area of the circle ‘bank teller’. The story ‘Linda is a bank teller and active in the feminist movement’ is, from a standpoint of narrative plausibility, much more salient to us. But statistically, the statement ‘Linda is a bank teller’ is more likely to be true.

[Figure: a Venn diagram of two overlapping circles, ‘A bank teller’ on the left and ‘Active in the feminist movement’ on the right, with the overlapping region highlighted.]

We seek sense and stories even where they don’t exist

Just as we have a desire to clutch the first piece of information we see, we also have a tendency to try and make sense of the information we take in as rapidly as possible. As human beings, we have an instinctive desire to tell stories that make sense of the world around us. And we like the stories we tell to be simple, with a strong preference for single-cause explanations. In reality, events almost never have a single cause, but the stories we tell ourselves and each other often do.

Our brains also hold on to these stories much more tightly than we hold on to data. In an informal study, Professor Jennifer Aaker at Stanford University asked her students to recall everything they could about presentations delivered by their fellow students. Just 5% of students could recall the data presented, but a huge 63% of students could recall the stories that were told during the presentations. Aaker told The Guardian in a subsequent interview: “Research shows our brains are not hard-wired to understand logic or retain facts for very long. Our brains are wired to understand and retain stories.”29

This effect can be even stronger when we have a nugget of information to shape our story around. For instance, we often rely on the outcomes of a decision to evaluate its quality or to inform future decisions. A good example of this is when we catch ourselves saying something like: “Sheila turned out to be a really good marketing manager, so our recruiting processes must be good.” Using what we know with hindsight to evaluate how we should have decided initially is risky. It does not allow us to correctly weigh random chance or calculated risks.

To see how strongly outcomes shape our judgements, let’s look at an experiment by decision scientists Jonathan Baron and John Hershey, who tried to quantify outcome bias. They presented subjects with the case of a 55-year-old man with a heart condition that could be relieved by a type of bypass operation. But that operation, they told the subjects, came with a risk: 8% of those undergoing the surgery died. The doctor decided to go ahead with the operation. They told half the subjects that the patient survived the surgery and half that he died. Subjects were much more likely to declare that the doctor had made the wrong decision when they thought the patient had died.30
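A little expected-value arithmetic shows why this is a bias (the utility numbers here are ours, purely for illustration; they are not from the study). Suppose that on some quality-of-life scale, surviving the operation is worth 100, dying is worth 0, and declining the operation and living with the condition is worth 40. The expected value of operating is then:

0.92 × 100 + 0.08 × 0 = 92

Since 92 is well above 40, operating is the right decision before the fact – and it remains the right decision even in the 8% of cases where the patient dies. The quality of the decision and the quality of the outcome are two different things.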

We stick to the explanations we generate, and it’s hard to change our minds

Once we’ve got hold of some evidence and generated a story around it, it’s very difficult for us to move away from it. We love sticking to our stories so much that our brain will hunt for almost any excuse to do so. The term for this is ‘confirmation bias’ – our tendency to interpret new evidence in light of our existing beliefs or stories.

Confirmation bias affects the way we observe and process a huge range of inputs. Imagine, for instance, that your team has just taken on a new employee, Avery. In the first team meeting, you notice that Avery doesn’t say anything and you make a note of this data point. You tell yourself that Avery is a shy, introverted person. In every subsequent encounter, you subconsciously look for evidence that confirms your story: meetings where Avery doesn’t speak up, disagreements where she seems reticent, conversations where you feel like she has little to contribute. So powerful is this confirmation bias that you’re likely to forget the second meeting, where Avery spoke up more than five times. You may not even register Avery chatting boisterously with colleagues or hosting an energetic discussion with new suppliers.

You’ve already formed your point of view and your brain is like a heat-seeking missile, looking for information to confirm its conclusions.

DE-BIAS YOURSELF

Once you understand these games that our brains inevitably play, it’s natural to want to play too – to see if you can find a way to overcome your own biases. As you embark on that journey, first recognise that System 1 is valuable in many ways: we simply could not function if we had to carefully weigh and process all of the information coming at us at all times. But System 1 can also get in the way of thoughtful, objective decision making, so it is a valuable goal to be able to interrupt yourself at important moments. We want to share here the approaches to doing this that have been valuable for us.

Be suspicious about yourself

First of all, it is important to realise when a bias could be at work. As we will say often in this book, mindful awareness of how you’re making a decision, and of the forces acting on you, is the first step to deciding differently. This kind of situational awareness allows you to earmark or flag moments that could be affected by bias. How certain are you about a fact? How firm are you in your belief?

The higher your confidence level, the less likely you are to notice or actively look for evidence that could counter your existing beliefs. For instance, if you’re particularly enthusiastic about a company’s new product (maybe you even had a hand in creating it), it will be harder for you to listen to criticisms from other divisions, or rationally review market research that suggests people do not like the product.

When we meet new people, we instinctively form a judgement of them. Inevitably, our brains will weight that first impression more heavily than the second, third or tenth – so heavily that it colours all subsequent impressions. As a starting point, as you go through your day tomorrow, just notice situations where you could be vulnerable to one of the 3Ss.

Look for ways to learn more about your own tendencies and the situations where you might be prone to bias without being consciously aware of it. One very valuable starting point is Harvard’s Implicit Association Test.31 It will highlight the connections that your brain naturally makes between categories (e.g. are you more likely to associate the word scientist with man or with woman? Are you more likely to associate the word parent with man or with woman?). Taking the tests won’t de-bias you, but it will help you notice where to focus.32

Deliberately build ways to gather objective evidence

Once you’ve identified a situation where you might be vulnerable to bias, you need a way to beat your brain at its own game. You need to put in place a way to gather new data and tell different stories. One of the best ways to do this is to deliberately gather objective, comprehensive evidence.

The two of us met during graduate school at Harvard University, where we observed an institution working hard to replace knee-jerk System 1 thinking with rational System 2 thinking. In the Harvard Business School MBA programme, class participation can count for half of a student’s final grade.33 You can easily imagine that in the first week, a professor forms a snap judgement of how often members of the class speak up. You can also envisage that a professor might be very influenced by stereotypes – perhaps by the assumption that men might be more confident or assertive than women.

This is historically what has happened, with men receiving consistently higher class-participation grades than women. As a way to make the class discussion process fairer for all, the business school introduced a tracking system: a scribe sat in the room recording notes on the contribution (or not) of each student. The professor could then refer to this objective record while assigning grades, rather than relying on their own biased impressions and memory.

Deliberately diversify

As you collect evidence, deliberately look for opposing views. You can do this in almost any domain – seeking out media you wouldn’t typically consume (perhaps a different newspaper or news website), or keeping a deliberate tally of how often your timid team member speaks up in weekly meetings. In our personal lives, we ask friends who hold very different political views to suggest reading materials or commentators that we might like to follow on Twitter. We do the same with our friends from different parts of the world. It’s easy for Julia to be well-informed about the latest developments in Australian politics, for example, but she needs some nudges to pay attention to news from Southern Europe.

CHECKLIST

How to de-bias yourself

✓ CHECK YOURSELF BEFORE YOU WRECK YOURSELF

As we encouraged you to before, adopt a sceptic’s mindset – even towards yourself. It is healthy to start from the assumption that your frame of reference is limited and to look for ways to expand it.

✓ BUILD AN OBJECTIVE EVIDENCE BASE

When the decision really matters, start with the goal of systematically gathering evidence to answer the question, rather than relying on your own impressions. Be able to answer the question: “What would make you change your mind?”

✓ MANAGE INCOMING MESSAGES

Seek ways to diversify your inputs right now. Don’t wait until you are trying to solve a really difficult problem to do this; open yourself up to new perspectives as a matter of course.

THE BOTTOM LINE

Our minds use a lot of shortcuts to help us navigate the world. The problem is that most of these shortcuts are adapted to an environment that is long gone, and they lead to biases in modern social settings. As far as gathering data and evidence is concerned, three types of bias are particularly relevant: first, simplifying and stereotyping; second, accepting stories that seem to make sense too quickly; and third, the inherent stickiness of the beliefs we hold. Actively de-biasing yourself takes time, but it can be learned. It all starts with acknowledging the various distortions, being mindful and aware, and practising ways to deliberately shift down into System 2.
