Chapter 4

What Color Are Your Glasses?

Chapter Key Benefits

• Learn about the dangerous judgment errors that result from our gut reaction of seeing the world through filters that match our beliefs, as opposed to seeing reality clearly.

• Understand the business risks to ourselves and our organizations of failing to overcome our predilection for the comfort of seeing the world through rose-colored glasses.

• Secure for yourself and your team the most effective tools to overcome the dangerous judgment errors resulting from failing to perceive uncomfortable truths.

I was struck by a sentence buried deep in a 2018 Reuters article about the bankruptcy of the number two US nursing home chain, HCR ManorCare, that accrued more than $7 billion of debt. ManorCare, based in Toledo, Ohio, transferred ownership of its assets, valued at $4.3 billion, to its landlord, Quality Care Properties, with which it signed a master lease for 289 facilities in 2011.

Here's the sentence that struck me (see if you can spot what made me do a double-take): “ManorCare said revenues have failed to cover monthly rent obligations since 2012, a year after the master lease was signed.”1 That's right, only a year after the lease was signed, ManorCare couldn't make its rent.

It kept sliding deeper into debt for the next five years until it declared bankruptcy. ManorCare blamed a range of problems, such as decreased government reimbursement rates, low occupancy in its nursing homes, and a shift to alternative nursing care services such as home health care and retirement communities.

The question that popped into my head was: Why didn't ManorCare's executive team foresee these problems down the road? They didn't know that the government reimbursement rates would decrease? Didn't they have statistics on the occupancy in their nursing homes? Wasn't the shift to home health care and retirement communities obvious as well?

I was paying particular attention to ManorCare because I had a speech lined up later that year at the Ohio Health Care Association. As I do for all of my speeches, I read up on the region's industry to customize my content and make it highly relevant to the audience. After my presentation, I talked with a number of long-term health care executives about the ManorCare fiasco. They all told me that the trends on which ManorCare blamed its problems were clearly visible long before 2011, the year it signed the contract that doomed it.

Perhaps you are wondering whether ManorCare had new leadership that, because of the change, couldn't have predicted this problem. Nope. CEO Paul Ormond was at the company's helm for thirty-two years before he left (or was forced out of) his position in September 2017. It was under his leadership that ManorCare became one of the two largest nursing home operators in the nation. Incidentally, he got a payment of more than $116 million as part of the bankruptcy proceedings.

So how could this long-time executive and his well-experienced team drive a billion-dollar giant over a clearly visible fiscal cliff? More importantly, if it happened to them, could it happen to you?

Seeing Is Believing? Not Really!

Research shows it can happen to all of us—yes, including you and me. No one is immune. If you think you might be, go back and reread the first chapter about gut reactions.

Remember what I mentioned in an earlier chapter about the four-year study by LeadershipIQ.com, which interviewed 1,087 board members from 286 organizations that forced out their chief executive officers?2 It found that almost one quarter of CEOs—23 percent—got fired for denying reality. In other words, they refused to recognize clearly visible negative facts about the organization's performance. Plenty of top executives join Paul Ormond in failing to see very obvious and very unpleasant facts about their businesses.

In September 2015, the German car giant Volkswagen acknowledged that it used cheating software in its VW and Audi cars to give false readings when the cars underwent emission tests. Known as Dieselgate, the revelation shook up the car industry and led to the resignation of CEO Martin Winterkorn, along with several other top leaders. VW's stock fell more than 40 percent throughout the next few days, and the overall cost of the scandal to the company has been estimated at more than $20 billion. Of course, the discovery of this falsehood was inevitable, just as Enron, WorldCom, and Tyco's accounting frauds were.

It is mind-boggling that top CEOs can ignore facts, but they are not the only ones guilty of doing so. Indeed, a Harvard Business School professor, Richard Tedlow, wrote a book dedicated to the topic of denial in business settings. He found that such denialism is a fundamental component of many business disasters, calling it the greatest obstacle business leaders face.3 I wouldn't go as far as he does when he says “greatest obstacle.” Still, my experience as a consultant and coach agrees with his and others' research findings: failing to see a business reality that's in front of your nose is a huge problem at all levels in all organizations, as well as for solopreneurs.

Whereas ManorCare, Enron, and Volkswagen are billion-dollar disasters, smaller versions of the same problem occur every day. Why do you think research shows that most restaurants fail within three years of opening their doors?4 It's not like their owners set out to fail. It's simply that they didn't (or didn't want to) see the truth about the marketplace.

After all, staring unpleasant truth in the face challenges our self-identity as successful. Many leaders work very hard to convey an appearance of success to themselves and others, and reject any sign they might have made a mistake. This unwillingness to acknowledge mistakes is one of the worst—and unfortunately all too common—qualities of leaders who are otherwise excellent.

Perhaps you're neither a giant like Volkswagen nor a small business like an independent restaurant, but somewhere in the middle market, with a revenue of $10 million to $1 billion. Nope, still not safe.

As a small example, many midsize businesses lose—sometimes dramatically—when pursuing what they see as winning synergy through mergers and acquisitions. Their leadership doesn't pay attention to extensive research that shows mergers and acquisitions fail to increase value for shareholders 70 to 90 percent of the time.5 These failures happened not at companies run by dummies, but at ones run by experienced, smart people who had a great deal of success in the past. If you're pursuing a merger or acquisition, you had better be very confident that you are much, much better than the people who ran those companies, and understand thoroughly everything that made their merger or acquisition a failure before you pursue your own. The takeaway from this is that the old phrase “seeing is believing” simply doesn't apply to uncomfortable business realities.

Confirming Our Biases

So what's going on here? Why do so many business leaders who are generally perceived as highly competent and successful wear rose-colored glasses that prevent them from seeing obvious points of failure—everything from minor bumps to fiscal cliffs?

They are brought down by a series of related mental errors, the most prominent and well known of which is confirmation bias.6 It involves two parts. First, we look only for information that confirms preexisting beliefs, as opposed to disproving them. Second, we actively ignore any information that contradicts these beliefs, rather than putting a high value on this information.

You can hear echoes of the second part of the confirmation bias in Upton Sinclair's famous phrase: “It is difficult to get a man to understand something, when his salary depends on his not understanding it.” Paul Ormond still received a nice salary while driving ManorCare deeper into debt between 2011 and 2018, instead of admitting his grave failure of signing a terrible lease and trying hard to change the situation while ManorCare still had the financial resources to do so.7 According to investigators who charged Martin Winterkorn with fraud and conspiracy in May 2018, the former Volkswagen CEO apparently approved the use of the “defeat device” to falsify emissions standards, despite the obvious fact that eventually word would leak and the company, as well as his personal reputation, would be devastated.8

When you look for examples of information that confirms preexisting beliefs, you find leaders of large or midsize businesses who launch mergers and acquisitions, as well as entrepreneurs who start up restaurants, without first examining thoroughly the base rates and typical causes of failure for both types of endeavors. It's very typical for business leaders at all levels to look only for information that justifies their business case. I've sat in on more than a dozen meetings for clients during which senior executives waxed enthusiastically about a proposed acquisition or merger. Yet, not a word was uttered about the all-too-typical failures of such ventures. Fortunately, I was able to provide the needed (even if not very popular) service of throwing some cold water on these hyped-up plans. What about the numerous meetings where I—or someone else who could provide this dose of reality—wasn't present?

It takes a lot of guts for someone from inside an organization to break the atmosphere of “make nice” if the organization doesn't have a culture of healthy disagreement and searching for potential problems. Moreover, besides the confirmation bias, they have to face the related problem of belief bias, a mental failure mode where our desire to believe the conclusion warps our evaluation of the evidence.9 Combined with the confirmation bias, belief bias makes it very hard to oppose strategies when high-level executives explicitly endorse them.

Although these biases are obviously very dangerous for the health of our bottom lines in the modern context, they helped facilitate our survival in the savanna. Back then, it was much less important for us to figure out what was true than to align our perceptions about reality with those of our tribe. We are the descendants of those early humans who succeeded in doing so. As a result, our gut reaction is to be very uncomfortable when we face information that goes against the beliefs of others in our tribe, especially authority figures such as the CEOs of ManorCare, Volkswagen, or Enron, or a top executive dead set on a foolhardy acquisition.

EXERCISE

I know it can be really uncomfortable to face the cold, hard truth of reality, and I believe in your ability to stretch your comfort zone and avoid the harsh fate of many leaders and professionals who fell into denialism and ruined their careers. You can advance that outcome by doing all the exercises in this chapter. Take the time to reflect on the following questions for a few minutes, and write down your answers in your professional journal:

• Where have you fallen for confirmation bias in your professional activities, and how has doing so harmed you? Where have you seen other people in your organization and professional network fall for this bias in their professional activities, and how has doing so harmed them?

• Where have you fallen for belief bias in your professional activities, and how has doing so harmed you? Where have you seen other people in your organization and professional network fall for this bias in their professional activities, and how has doing so harmed them?

Stick My Head Where?

Scholars have a specific term for what happens when we actively deny negative reality that's staring us in the face. You won't be surprised that it's called the ostrich effect, a cognitive bias named after the (ironically mythical) notion that ostriches stick their heads in the sand whenever they see a threat.10

I fell for it during the economic downturn following the 2008 fiscal crisis, when a number of clients stopped returning my calls. I didn't want to face the negative economic reality and failed to reorient as quickly as I should have in pursuing a more appropriate business strategy. In the midst of recovering from the tough times they faced, clients didn't have the energy or focus to invest in my services, even if that was when they could have used them most to avoid problems down the road. I eventually had to cut expenses much more drastically than would otherwise have been the case, and I still regret making that mistake.

What exacerbated the problem for me in 2008 was the normalcy bias, our tendency to underestimate both the probability and the impact of a major disaster.11 I did not realize the devastating extent of the Great Recession; I believed it would be a much shorter and quicker crisis than it proved to be.

The normalcy bias applies to individual companies and people as well as major global disasters. Consider the 1995 collapse of the Barings Bank in London. Nick Leeson, its head derivatives trader in Singapore, made a series of unauthorized bets on the Japanese markets from 1992 to 1995. He was able to hide more than $1.3 billion (yes, that's a “b,” not an “m”) in losses, due to what a later investigation called “a failure of management and other internal controls of the most basic kind.”12 The bank went bankrupt, all because its leadership could not imagine—and did not institute appropriate controls on—the kind of disaster that Leeson brought about.

Let's consider another example at a real estate management company for which I consulted. A manager refused to acknowledge that a person hired directly by her was a bad fit, despite everyone else in the department telling me that the employee was holding back the team. The other members dropped hints to the manager but didn't want to bring up this matter directly: she was known to express anger at those who brought her bad news, or, more colloquially, to shoot the messenger. Their resulting reluctance to deliver bad news is a cognitive bias known as the MUM effect.13 Not a healthy tendency for avoiding confirming our biases, as you can well imagine.

EXERCISE

Take the time to reflect on the following questions for a few minutes, and write down your answers in your professional journal:

• Where have you fallen for the ostrich effect in your professional activities, and how has doing so harmed you? Where have you seen other people in your organization and professional network fall for this bias in their professional activities, and how has doing so harmed them?

• Where have you fallen for normalcy bias in your professional activities, and how has doing so harmed you? Where have you seen other people in your organization and professional network fall for this bias in their professional activities, and how has doing so harmed them?

• Where have you fallen for the MUM effect in your professional activities, and how has doing so harmed you? Where have you seen other people in your organization and professional network fall for this bias in their professional activities, and how has doing so harmed them?

I Can Do Anything Better Than You!

Let's move on to a different failure mode affiliated with confirmation bias, a problem I often see undermining teamwork and collaboration: people's tendency to claim credit for success and deflect blame for failure. You might call this human nature, but behavioral science scholars call it the self-serving bias.14

When I conducted a needs analysis on improving employee engagement and teamwork for a US factory of a large international car manufacturer, I noticed that the teams the organization tried to build experienced substantial internal tensions. The existing culture favored individualism and competition over teamwork and collaboration, an atmosphere in which self-serving bias thrives. We had to address this problem as part of the broader effort to increase teamwork in the factory.

A related problem at a biotechnology company for which I consulted stemmed from the rumor mill, which passed along gossip that included many outright lies. Unfortunately, the more frequently people hear a claim, the more they believe it, regardless of whether it's true, a phenomenon known as the illusory truth effect.15 In other words, hearing the same falsehood over and over again, whether from the same person or not, makes us more likely to believe it's accurate. See what I did there? I had two sentences that meant exactly the same thing, but you believed me more after reading the second sentence. That's a perfect illustration of the illusory truth effect.

This cognitive bias is a specific case of a broader phenomenon known as the mere exposure effect, where simply being exposed to some external stimulus reduces the perception of novelty and potential threat, and makes us more comfortable with it.16 Hearing the same rumor many times makes people more comfortable with the rumor. Our gut reactions mistake the feeling of comfort for the feeling of truth, and employees believe the rumors.

EXERCISE

Take the time to reflect on the following questions for a few minutes and write down your answers in your professional journal:

• Where have you fallen for self-serving bias in your professional activities, and how has doing so harmed you? Where have you seen other people in your organization and professional network fall for this bias in their professional activities, and how has doing so harmed them?

• Where have you fallen for the illusory truth effect and the mere exposure effect in your professional activities, and how has doing so harmed you? Where have you seen other people in your organization and professional network fall for these biases in their professional activities, and how has doing so harmed them?

Halos and Horns

Our tribal nature causes us to ignore negative information about people we perceive as part of our tribe, and vice versa for those we don't, which leads to two linked cognitive biases. The halo effect describes a mental error we make when we like one important characteristic of a person; we then subconsciously raise our estimates of that person's other characteristics. Conversely, the horns effect reflects the mistake of subconsciously lowering our estimates of a person when we don't like one salient characteristic.17

These biases usually start with our perception of tribal affiliation, meaning whether that person belongs to a group with which we identify. If you ever walked into an office and were struck by the similarities between the personalities, physical appearance, and background of the staff, then you know what I mean. The halo effect and the horns effect are especially dangerous in promotion and assessment.

These biases critically undermine diversity and inclusion efforts, and the resulting failures not only lead to calamitous legal action and terrible PR crises, but are simply bad business. Much research suggests that visible diversity, for example racial and gender diversity, improves a company's bottom line. Likewise, invisible diversity, such as differences in personality and perspectives, facilitates better decisions, which also improve profits.

EXERCISE

Take the time to reflect on the following questions for a few minutes, and write down your answers in your professional journal:

• Where have you fallen for the halo effect in your professional activities, and how has doing so harmed you? Where have you seen other people in your organization and professional network fall for this bias in their professional activities, and how has doing so harmed them?

• Where have you fallen for the horns effect in your professional activities, and how has doing so harmed you? Where have you seen other people in your organization and professional network fall for this bias in their professional activities, and how has doing so harmed them?

Taking Off the Rose-Colored Glasses

Taking off the rose-colored glasses of confirmation bias and similar biases is easier said than done! Doing so goes directly against our intuitions, even more so than most other cognitive biases, as it may mean sacrificing our sacred cows. It's especially important to train ourselves to turn on our intentional system and avoid relying on the autopilot system to protect ourselves from confirmation bias.

Fortunately, there's extensive research on debiasing confirmation bias; scholars focus on it because of how dangerous the problem tends to be. The strategy research shows to have the strongest impact involves considering alternative options and explanations. For instance, if you hear consistent rumors through the grapevine about proposed layoffs, before polishing your resume, look for disconfirming evidence to fight the illusory truth effect. Is the economic situation in your industry or your company looking up or down? Does your supervisor look worried or relaxed?

Take a similar approach to shaping the strategy of a company. As I talked several clients out of mergers and acquisitions (M&A) initiatives, and encouraged others to pay a much lower price than they intended, I focused on getting them to consider what would happen if their envisioned synergies didn't materialize and what would happen if they uncovered unexpected problems. After all, investment banks that facilitate mergers and perform “due diligence” are highly incentivized to make a sale to get a high fee. It's essential to have someone who doesn't have a stake in the deal defend the company's money by arguing thoroughly for the alternative perspective and finding evidence that casts doubt on an acquisition. Consider getting a retired company executive, an outside consultant, a member of the board of directors who wasn't involved in the acquisition planning, or someone else who can be maximally impartial.

As an example, one midsize law firm of several hundred lawyers was considering the acquisition of a smaller firm (just under a hundred employees). However, the larger firm's area of expertise did not line up well with the smaller one's, and the price was pretty steep because the smaller firm had other suitors. Likewise, the initial acquisition conversations revealed some clashes between the two firms' internal cultures.

Another thing that helped convince the client to avoid the acquisition was the strategy of probabilistic thinking, particularly considering the base rates. I showed them—and every other client considering acquisitions and mergers—the astoundingly high failure rate of such endeavors. From a probabilistic perspective, it was even more likely for such failure to occur in cases where the leadership did not have extensive M&A experience. The law firm's leadership did not.

I was hired to argue for the “no” side and encouraged the client to compare the acquisition of this smaller firm to the next best alternatives, which in this case included saving money and time and focusing on their own business, or finding another firm to acquire. Eventually, my client decided to let go of this opportunity and undertake due diligence for future acquisitions that considered more thoroughly both expertise and cultural alignment. They did end up acquiring a smaller firm just over a year later after performing a much more thorough due diligence. The firm was much more aligned both culturally and in expertise, and the merger was quite successful.

We know that we tend to overvalue other people who are like us, and undervalue those who are not—the halo effect and the horns effect. So how do we go against our intuitions and address these problems to hire a diverse work force? That was the question posed to me by the regional manager of a New York City clothing store chain who oversaw about 4,000 staff members. She saw a presentation I did on diversity and inclusion at an HR conference and brought me in to consult on the lack of diversity in the store sales staff, which was causing a problem with selling to the diverse customers who entered the stores. Based on testing, the clothing store chain found that stores with more diverse staff (and everything else being equal) had higher sales volume.

The manager adopted the standard approach of using a structured interview process with points for each question to incentivize diversity hires, along with training in cultural competency to facilitate effective recruiting and interviewing. However, she still did not have nearly as much diversity as she wanted.

After I examined her hiring process, I helped her recognize the problem. The traditional approach to incentivizing diversity hires, while crucial, only helps address the horns effect, the tendency to avoid hiring people who are different. Unfortunately, it does not address the halo effect, the tendency to hire people who are like you.

So we worked to revise the structured hiring process. We specifically focused on the structured interview process, and combined probabilistic thinking and the use of numbers with the strategy of setting a policy to guide the organization. To address the horns effect in a more thorough manner, we had interviewers give interviewees positive points for all characteristics in which the interviewee and interviewer differed. These characteristics included traditional diversity categories but also less visible ones, such as socioeconomic backgrounds, accents, cultural preferences, and so on. In turn, to address the halo effect, we had the interviewer give the interviewee negative points for any characteristics, visible and invisible ones, in which the interviewer and interviewee were similar. We also gave additional positive points for the specific areas of diversity that the manager felt were lacking in each store.
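To make the mechanics concrete, here is a minimal sketch of a scoring adjustment along the lines described above. The trait names, point values, and function name are illustrative assumptions, not the actual rubric we built for the client:

```python
# Hypothetical sketch of a similarity-adjusted interview score.
# Trait names and point values below are illustrative, not the client's rubric.

def diversity_adjusted_score(base_score, interviewer_traits, interviewee_traits,
                             store_gaps, gap_bonus=2):
    """Adjust a structured-interview score for similarity bias.

    Adds a point for each trait where interviewer and interviewee differ
    (countering the horns effect), subtracts a point for each trait they
    share (countering the halo effect), and adds a bonus when a trait fills
    a diversity gap the manager flagged for that store.
    """
    score = base_score
    for trait, value in interviewee_traits.items():
        if interviewer_traits.get(trait) == value:
            score -= 1  # similarity: guard against the halo effect
        else:
            score += 1  # difference: guard against the horns effect
        if store_gaps.get(trait) == value:
            score += gap_bonus  # area of diversity the store lacks
    return score
```

For instance, a candidate who differs from the interviewer in accent (a flagged gap) but shares a socioeconomic background would gain points for the difference and the gap, and lose a point for the similarity.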

It took some time to change the hiring process. We faced some resistance from the hiring staff at first, especially about giving negative points for similarities. They did not feel it was fair to “punish” job candidates just because they were similar to the interviewer. It took substantial training to get them to see that it's a natural human trait to rate people more highly if they liked one aspect of the person. Eventually, we were able to get the hiring staff on board and implement the new process. As a consequence, the new hires grew much more balanced and resulted in the kind of diversity—and eventually the kind of sales revenue—the regional manager wanted.

The probabilistic thinking base rate approach also applies to opening new restaurants. Prior probability suggests it's a very risky idea, so your restaurant business plan had better have some special sauce (insert drum roll here) before you proceed.

Let's say you decide to proceed with your restaurant or merger effort despite the base rate. To help improve your estimates of success, use the probabilistic thinking approach of launching experiments to gain additional valuable information and update your probabilities of success or failure before you go all-in on your bet. Can you rent a food truck to see whether your recipes have sufficient appeal? Can you launch a partnership prior to the merger to see if the envisioned synergies on increased revenue or lowered costs actually exist?
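The updating step above is just Bayes' rule. Here is a minimal sketch; all of the probabilities in it are made-up numbers for illustration, not real restaurant statistics:

```python
# Illustrative Bayes' rule update of a base rate after an experiment.
# All probabilities here are assumptions for the example, not real data.

def update_probability(prior, p_evidence_given_success, p_evidence_given_failure):
    """Return the revised probability of success after observing the evidence."""
    numerator = p_evidence_given_success * prior
    denominator = numerator + p_evidence_given_failure * (1 - prior)
    return numerator / denominator

# Suppose (hypothetically) 40% of comparable new restaurants succeed, and a
# food-truck trial sells out far more often for concepts that go on to
# succeed (80% of the time) than for ones that go on to fail (20%).
prior = 0.40
posterior = update_probability(prior, 0.80, 0.20)  # roughly 0.73
```

A sold-out trial would lift the estimated chance of success from 40 percent to about 73 percent; a flop would push it well below the base rate, telling you to rethink before going all-in.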

To protect yourself from the ostrich effect, consider the long-term future and repeating scenarios, and combine that with another strategy: making predictions about the future in the areas of potential threat and revising your predictions regularly. Doing so can tell you whether your current course is serving you well. If I had made predictions during the 2008 fiscal crisis, I would have had a hard time fooling myself about my client base drying up. If ManorCare's leadership had considered the long term, it would have seen that there were no good options in proceeding with the existing lease, as opposed to admitting it goofed when signing it and renegotiating the agreement. If Volkswagen's executives had made predictions about future threats, they would have had a hard time approving the cheating device, given the catastrophic legal and PR threat it entailed.

Fixing the problem of self-serving bias necessitates the strategy of considering other people's perspectives. If you were in their shoes, how would you decide who deserves credit and who deserves the blame? It helps to make this explicit by talking about this issue in a team, and putting numbers on credit and blame. It might sound weird at first, and it takes some time to integrate into a team, but it works wonders to address hidden resentments and frustrations. That approach helped improve teamwork and collaboration for the car factory I mentioned earlier.

To address the problem of the MUM effect, it greatly helps for those in leadership roles to use the strategy of making a precommitment by very explicitly showing in words and actions that they reward and celebrate those who bring them true but negative information. If you are a leader, make such statements often, and then publicly praise those who bring you such information; integrate doing so into the evaluation process for bonuses, raises, and promotions to show that you're not just paying lip service to this notion. If you're not in a leadership role, underline and show this section of the book to your supervisor (or buy him a copy) and talk with him about how much the organization can gain from this practice.

I know what you might think at this point: That's all very nice, and I am totally committed to avoiding the confirmation bias and affiliated cognitive biases. However, how do I address a situation in which a colleague, especially someone above me in the hierarchy, falls into these problematic modes of thinking?

That's one of the most frequent questions I get asked in the Q&A during my presentations when I bring up the problem of confirmation bias and other similar judgment errors. Fortunately, one of my areas of academic research—as well as my consulting and coaching practice—focuses on how to get people to accept uncomfortable facts.

Doing so involves a technique distinct from the ones you use when addressing cognitive biases within your organization, team, or yourself. It requires that you have a great deal of evidence to support your position, as well as some practice with this technique in low-risk settings. Otherwise, you're liable to use it incorrectly and have it backfire; if it does, don't blame me. Proceed at your own risk.

Remember the manager at the real estate management company who was reluctant to acknowledge she had hired the wrong employee? I was in a somewhat precarious political position with her; although she was the subordinate of the person who hired me, if I pissed her off, she could complain to the big boss who hired me or subtly resist the change efforts I was working on inside the company.

The technique I used with that manager, and in many similar situations, can be summarized under the acronym EGRIP (emotions, goals, rapport, information, positive reinforcement). EGRIP offers a highly useful tool to get professional colleagues to change their minds toward the truth.18

Rather than offer facts, as most of us are tempted to do, start by figuring out the emotions that inhibit your colleague from seeing reality clearly. Use curiosity and subtle questioning to figure out her goals so that you understand the kind of underlying framework that results in false beliefs on the part of your colleague. Once you understand your colleague's perceptions of the situation, build up rapport by showing you care about her goals and empathizing with her emotions. Doing so cultivates both an intellectual and emotional bond, tapping into the mind and heart alike, and places you within her tribe. Now you can work together to address mutual concerns.

Remember the manager with the problematic employee? I had a conversation with her about the role she saw her current and potential future employees playing in the long-term future of her department. I echoed her anxiety about the company's financial performance and concerns about getting funding for future hires, which gave me an additional clue into why she was protecting the incompetent employee.

After placing yourself on the same side by building trust and establishing an emotional connection, move on to the problem at hand: your colleague's emotional block. The key is to show the person, without arousing a defensive or aggressive response, how his or her current truth denialism undermines his or her own goals in the long term. This is the first step where you share uncomfortable information—step four, not step one.

I asked the manager to identify which of her employees contributed most to her goals for the department's long-term performance, which contributed the least, and why. It was crucial for me to have the numbers available without revealing that I had this information. I also asked her to consider who contributed the most to the team spirit and unit cohesion, and who dragged down morale and performance. Knowing that she valued behavioral economics, I brought up research on why we sometimes make mistakes when we evaluate colleagues and how to avoid those mistakes.

After some back and forth, she acknowledged that the employee in question was a poor performer and a drag on the group. Together, we collaborated on a plan of proactive development for the employee; if he did not meet agreed-upon benchmarks, he would be let go.

When a colleague accepts the facts, conclude the conversation with positive reinforcement, without any hint of condescension. This is an effective research-based tactic for changing people's behaviors by getting them to feel positive emotions about new behaviors. The more the person associates positive emotions with the invaluable skill of accepting negative facts, the less likely it is that anyone will need to have the same conversation with her in the future. I praised the manager for the courage it took to make a tough decision about the employee, and she expressed appreciation for my positive words, which she acknowledged she got too seldom in her role.

Does that sound manipulative? Step back and recognize that all of our social interactions with each other are manipulations of some sort or another. Some people are just naturally better at it than others, and we call them “leaders with charisma” or “good salespeople.” Hundreds of my clients have prevented disasters for their organizations' bottom lines by using evidence-based methods like EGRIP, which only works when the person you're trying to convince holds false beliefs at odds with his or her own goals. I encourage you to use it throughout your business career as well. And if you ever see me holding mistaken beliefs, I urge you to use it on me too!

Deploying these strategies will empower you and others around you to avoid business disasters by fixing biases that prevent us from seeing reality clearly. In the next chapter, you'll gain the benefit of knowing when and how to be confident about your judgments.

EXERCISE

To avoid confirming your preexisting beliefs, and to help others in your organization and professional network avoid doing so as well, complete these exercises before going to the next chapter! Take a few minutes to reflect on the following questions, and write down your answers in your professional journal:

Image How will you use considering alternative scenarios and options to fight the biases described in this chapter? How will you help others in your organization and professional network use this strategy? What challenges do you anticipate in implementing this strategy and helping others do so, and what steps will you take to overcome these challenges?

Image How will you use probabilistic thinking to fight the biases described in this chapter? How will you help others in your organization and professional network use this strategy? What challenges do you anticipate in implementing this strategy and helping others do so, and what steps will you take to overcome these challenges?

Image How will you use making predictions about the future to fight the biases described in this chapter? How will you help others in your organization and professional network use this strategy? What challenges do you anticipate in implementing this strategy and helping others do so, and what steps will you take to overcome these challenges?

Image How will you use considering the long-term future and repeating scenarios to fight the biases described in this chapter? How will you help others in your organization and professional network use this strategy? What challenges do you anticipate in implementing this strategy and helping others do so, and what steps will you take to overcome these challenges?

Image How will you use considering other people's perspectives to fight the biases described in this chapter? How will you help others in your organization and professional network use this strategy? What challenges do you anticipate in implementing this strategy and helping others do so, and what steps will you take to overcome these challenges?

Image How will you use making a precommitment to fight the biases described in this chapter? How will you help others in your organization and professional network use this strategy? What challenges do you anticipate in implementing this strategy and helping others do so, and what steps will you take to overcome these challenges?

Image Finally, how will you use EGRIP (emotions, goals, rapport, information, positive reinforcement) to help those in your organization and professional network fight these biases? What challenges do you anticipate in implementing this strategy, and what steps will you take to overcome these challenges?

CHAPTER SUMMARY

Image We usually deny unpleasant business realities when they are uncomfortable to our gut.

Image Our intuitions push us to look for information in making business decisions that matches our existing beliefs, as opposed to evidence that might go against these beliefs.

Image We greatly underestimate the possible business impact of major disasters.

Image Our instinct is to claim credit for ourselves for success and deflect blame for failure.

Image We fall too easily for repeated rumors in business settings.

Image When we like one important characteristic of a person, our gut moves us to overestimate all other positive aspects of that person and downplay any negatives; the reverse happens when we don't like one important characteristic.

Image To address these tendencies to confirm our predispositions and instead see reality clearly to make good decisions, the following techniques are most helpful:

Image considering alternative scenarios and options

Image probabilistic thinking

Image making predictions about the future

Image considering the long-term future and repeating scenarios

Image considering other people's perspectives

Image making a precommitment

Image EGRIP (emotions, goals, rapport, information, positive reinforcement)
