Appendix B. Cognitive Biases: Common Mind Traps

Mind traps are like optical illusions: they fool you into thinking you’re right when you’re not. It is important to get a proper introduction to these mental mistakes because you will run into them as you practice a new way of thinking. Social scientists have identified many kinds of these errors associated with the way human beings process information. The pioneering work of Amos Tversky and Daniel Kahneman, which culminated in Kahneman receiving the 2002 Nobel Prize in Economic Sciences, sheds light on the human tendency to make systematic errors in certain situations.

Basically, humans use simple rules, called heuristics, to make judgments. Heuristics are quite useful shortcuts that save us time, but they also create systematic errors. A few of the more prevalent ones are described here. They are grouped so that you can see which of the five steps of critical thinking they are most likely to impact.

Grabbing Glory and Pushing Blame (Step 2: Recognize Assumptions)

The fundamental attribution error is an error in attributing cause. If someone makes a mistake, there is a tendency to attribute the mistake to the individual’s personality rather than to the situation. For example, if someone makes a mistake at work, the cause will more likely be attributed to a personal shortcoming than to work overload or time pressure. Not really fair, is it? This trap leads to faulty assumptions and poorly defined problems that skew a situation in the wrong direction from the start. To minimize this error, analyze the situation by asking questions about the environment and its impact.

Self-serving bias is the tendency to make assumptions about what is fair or right in a way that favors our own self-interest. For example, if you ask four people what percentage of a project each of them contributed, the sum of their answers will exceed 100 percent. We tend to take more than our fair share of credit. When information is ambiguous, we tend to interpret it in a way that benefits our self-interest. To minimize this error, pay particular attention to the contributions of others, and recognize that you might be underestimating their time and value relative to your own. An open-minded style will help you become more aware of the contributions of others.

Asking the Wrong Questions (Step 3: Evaluate Information)

Confirmation bias is the tendency to search for information that confirms your existing beliefs. If you are responsible for making important decisions, underline this mind trap and keep it squarely in your view, because it snaps shut on a regular basis and you don’t want to make a crucial decision with lopsided information. To minimize this bias, ask yourself, “Am I being objective?” and actively seek out people who will articulate a contrary view. Look for people with inquisitive and truth-seeking styles who can help you explore all sides of a position.

Anchoring is the tendency to give undue weight to the first information you receive. Hammond, Keeney, and Raiffa¹ asked people two questions that we invite you to answer:

• Is the population of Turkey greater than 35 million?

• What is your best estimate of Turkey’s population?

They found that the information in the first question, specifically the figure 35 million, influenced the answers to the second question. When they used 100 million in the first question, people’s estimates in response to the second question were much larger. The initial figure anchored how they thought about the question. As you can imagine, anchoring is used as a negotiation technique, so be aware of how initial numbers and information can influence the way you evaluate subsequent information.

The framing effect occurs when a person’s response changes based on the way a question is framed. Consider the example we mentioned previously: You need surgery and your doctor says to you, “92% of the patients survive surgery.” That sounds positive. Now suppose the doctor says, “8% of the patients die in surgery.” That doesn’t sound as good, and people are more likely to reject surgery when it is presented this way. The odds of survival are identical, but acceptance differs because of the way the information is framed. When you gather information, look at the frame, because it could unduly influence your decision. You want to focus on the information (e.g., the odds of survival), not the way it is framed.
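To see why the two statements carry identical information, here is a minimal sketch in Python. The function name and values are our own illustration, not anything from the original study; it simply normalizes both frames to the same underlying survival figure:

```python
# Illustrative sketch (hypothetical function and values): both frames
# reduce to the same survival percentage once the wording is stripped away.

def survival_percent(frame: str, percent: int) -> int:
    """Normalize a framed statistic to a plain survival percentage."""
    if frame == "survive":  # "92% of the patients survive surgery"
        return percent
    if frame == "die":      # "8% of the patients die in surgery"
        return 100 - percent
    raise ValueError(f"unknown frame: {frame!r}")

positive = survival_percent("survive", 92)
negative = survival_percent("die", 8)

assert positive == negative == 92  # identical information, different wording
print(f"Both statements imply a {positive}% chance of survival.")
```

Reducing each statement to the underlying number in this way is one practical habit for stripping off the frame before you compare options.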

Groupthink occurs when members of a tightly knit group try to minimize conflict and reach consensus without critically testing, analyzing, and evaluating ideas. The Kennedy administration’s decision to invade Cuba (the Bay of Pigs) and the George W. Bush administration’s decision to invade Iraq have both been described as examples of groupthink. The ingredients for groupthink include an inner circle of advisors who are closely aligned and the absence of someone who holds an alternative viewpoint or plays devil’s advocate. A truth-seeking style is particularly valuable for surfacing the tough but necessary questions.

Curious Conclusions (Step 4: Draw Conclusions)

Optimism bias is the tendency to overestimate positive outcomes and underestimate negative outcomes. This bias is a double-edged sword: optimism is an admirable quality associated with resilience, but underestimating risk is dangerous. The best safeguard against this mind trap is good planning. Everyone in “Amenah’s Story” maintained a positive attitude, but their plans were meticulous, and they recognized and accepted that a single glitch could stop the project.

Planning fallacy is the tendency to underestimate the time, costs, and risks of future actions while overestimating the benefits of those same actions. Think about the last project at work that was late, ran over budget, and fell short of expectations. It probably didn’t take you long to come up with an example, because this mental mistake occurs frequently. To counter this fallacy, leverage timely and analytical thinking styles as you prepare to make a decision.

Sunk cost fallacy is also common and occurs when we make a decision in a way that justifies a past decision. It is reminiscent of the catchphrase “throwing good money after bad.” The term comes from economics, where a past investment (of time or money) can’t be recovered and therefore should be irrelevant to the present decision; research shows, however, that it is not. If you have already invested in a project or relationship, you are likely to hang on and want to make it work because of your past involvement rather than because of an analysis of its future success. To minimize this bias, do an emotional temperature check and bring feelings (e.g., regret, fear of failure) to the surface so you can more readily assess their influence. Then shift your attention to an analysis of the current and future investments required and the likely return on investment.
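As a concrete illustration of that last step, here is a minimal sketch in Python with made-up figures (ours, not the book’s) showing that the continue-or-stop decision should rest only on future costs and future returns:

```python
# Hypothetical numbers: a project has already consumed 500,000 that cannot
# be recovered, and finishing it requires more money than it will return.

sunk_cost = 500_000               # already spent; gone either way
future_cost = 200_000             # additional investment needed to finish
expected_future_return = 150_000  # expected payoff from finishing

# Rational rule: compare only what is still to come; the sunk cost
# does not appear in the comparison at all.
should_continue = expected_future_return > future_cost
print(should_continue)  # False: finishing loses another 50,000

# The fallacy folds the sunk cost into the reasoning ("we've already spent
# 500,000, so we have to see it through"), but the 500,000 is lost whether
# you continue or stop, so it cannot change which choice is better.
```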

Endnote

1. Hammond, John S., Ralph L. Keeney, and Howard Raiffa. 1998. Smart Choices: A Practical Guide to Making Better Decisions. Boston: Harvard Business School Press.
