Chapter 2
Simple Ideas, Complex Organizations

One of the most gratifying results of intellectual evolution is the continuous opening up of new and greater prospects.

—Nikola Tesla1

September 11, 2001 brought a crisp and sunny late-summer morning to America's east coast. Perfect weather offered prospects of on-time departures and smooth flights for airline passengers in the Boston-Washington corridor. That promise was shattered for four flights bound for California when terrorists commandeered the aircraft. Two of the hijacked aircraft attacked and destroyed the Twin Towers of New York's World Trade Center. Another slammed into the Pentagon. The fourth was deterred from its mission by the heroic efforts of passengers. It crashed in a vacant field, killing all aboard. Like Pearl Harbor in December 1941, 9/11 was a day that will live in infamy, a tragedy that changed forever America's sense of itself and the world.

Why did no one foresee such a catastrophe? In fact, some had. As far back as 1993, security experts had envisioned an attempt to destroy the World Trade Center using airplanes as weapons. Such fears were reinforced when a suicidal pilot crashed a small private plane onto the White House lawn in 1994. But the mind-set of principals in the national security network was riveted on prior hijackings, which had almost always ended in negotiations. The idea of a suicide mission, using commercial aircraft as missiles, was never incorporated into homeland defense procedures.

In the end, 19 highly motivated young men armed only with box cutters were able to outwit thousands of America's best minds and dozens of organizations that make up the country's homeland defense system. Part of their success came from fanatical determination, meticulous planning, and painstaking preparation. We also find a dramatic version of an old story: human error leading to tragedy. But even the human-error explanation is too simple. In organizational life, there are almost always systemic causes upstream of human failures, and the events of 9/11 are no exception.

The United States had a web of procedures and agencies aimed at detecting and monitoring potential terrorists. Had those systems worked flawlessly, the terrorists would not have made it onto commercial flights. But the procedures failed, as did those designed to respond to aviation crises. Similar failures have marked many other well-publicized disasters: nuclear accidents at Chernobyl and Three Mile Island, the botched response to Hurricane Katrina on the Gulf Coast in 2005, and the deliberate downing of a German jet in 2015 by a pilot who was known to suffer from severe depression. In business, the fall of giants like Enron and WorldCom, the collapse of the global financial system, the Great Recession of 2008–2009, and Volkswagen's emissions cheating scandal of 2015 are among many examples of the same pattern. Each illustrates a chain of misjudgment, error, miscommunication, and misguided action that our best efforts fail to avert.

Events like 9/11 and Katrina make headlines, but similar errors and failures happen every day. They rarely make front-page news, but they are familiar to most people who work in organizations. In the remainder of this chapter, we discuss how organizational complexity intersects with fallacies of human thinking to obscure what's really going on and lead us astray. We describe some of the peculiarities of organizations that make them so difficult to figure out and manage. Finally, we explore how our deeply held and well-guarded mental models cause us to fail—and how to avoid that trap.

Common Fallacies in Explaining Organizational Problems

Albert Einstein once said that a thing should be made as simple as possible, but no simpler. When we ask students and managers to analyze cases like 9/11, they often make things simpler than they really are. They do this by relying on one of three misleading and oversimplified explanations.

The first and most common is blaming people. This approach casts every failure as a product of individual blunders. Problems result from egotism, bad attitudes, abrasive personalities, neurotic tendencies, stupidity, or incompetence. It's an easy way to explain anything that goes wrong. After scandals like the ones that hit Volkswagen in 2015 and Wells Fargo Bank in 2016, the hunt was on for someone to blame, and top executives became the prime targets of reporters, investigators, and talk-show comedians.

As children, we learned it was important to assign blame for every broken toy, stained carpet, or wounded sibling. Pinpointing the culprit is comforting. Assigning blame resolves ambiguity, explains mystery, and makes clear what to do next: punish the guilty. Corporate scandals often have their share of culpable individuals, who may lose their jobs or even go to jail. But there is usually a larger story about the organizational and social context that sets the stage for individual malfeasance. Targeting individuals while ignoring larger system failures oversimplifies the problem and does little to prevent its recurrence.

When it is hard to identify a guilty individual, a second popular option is blaming the bureaucracy. Things go haywire because organizations are stifled by rules and red tape or by the opposite, chaos resulting from a lack of clear goals, roles, and rules. One explanation or the other usually applies. When things aren't working, the system needs either more or fewer rules and procedures, and tighter or looser job descriptions.

By this reasoning, tighter financial controls could have prevented the subprime mortgage meltdown of 2008. The tragedy of 9/11 could have been thwarted if agencies had had better protocols for such a terrorist attack. But piling on rules and regulations is a direct route to bureaucratic rigidity. Rules can inhibit freedom and flexibility, stifle initiative, and generate reams of red tape. The commission probing the causes of 9/11 concluded: “Imagination is not a gift usually associated with bureaucracies.” When things become too tight, the solution is to “free up” the system so red tape and rigid rules don't stifle creativity and bog things down. An enduring storyline in popular films is the free spirit who triumphs in the end over silly rules and mindless bureaucrats (examples include the cult classics Office Space and The Big Lebowski). But many organizations vacillate endlessly between being too loose and too tight.

A third fallacy attributes problems to a thirst for power. Enron collapsed, by this account, because key executives were more interested in getting rich and expanding their turf than in advancing the company's best interests. This view sees organizations as jungles teeming with predators and prey. Victory goes to the more adroit, or the more treacherous. You need to play the game better than your opponents—and watch your back.

Each of these three perspectives contains a kernel of truth but oversimplifies a knottier reality. Blaming people points to the perennial importance of individual responsibility. People who are rigid, lazy, bumbling, or greedy do contribute to some of the problems we see in organizations. But condemning individuals often distracts us from seeing system weaknesses and offers few workable options. If, for example, the problem is someone's abrasive or pathological personality, what do we do? Even psychiatrists find it hard to alter character disorders, and firing everyone with a less-than-ideal personality is rarely a viable option. Training can go only so far in ensuring flawless individual performance.

The blame-the-bureaucracy perspective starts from a reasonable premise: Organizations exist to achieve specific goals. They usually work better when strategies, goals, and policies are clear (but not excessive), jobs are well defined (but not constricting), control systems are in place (but not oppressive), and employees behave prudently (but not callously). If organizations always operated that way, they would presumably work a lot better than most do. In practice, this perspective is better at explaining how organizations should work than why they often don't. Managers who cling to logic and procedures become discouraged and frustrated when confronted by intractable irrational forces. Year after year, we witness the introduction of new control systems, hear of new ways to reorganize, and are dazzled by emerging management strategies, methods, and gurus. Yet old problems persist, seemingly immune to every rational cure we devise. As March and Simon point out, rationality has limits.

The thirst-for-power view highlights enduring, below-the-surface features of organizations. Dog-eat-dog logic offers a plausible analysis of almost anything that goes wrong. People both seek and despise power but find it a convenient way to explain problems and get their way. Within hours of the 9/11 terror attacks, a senior FBI official called Richard Clarke, America's counterterrorism czar, to tell him that many of the terrorists were known members of Al Qaeda.

“How the fuck did they get on board then?” Clarke exploded.

“Hey, don't shoot the messenger. CIA forgot to tell us about them.”

In the context of its chronic battles with the CIA, the FBI was happy to throw the CIA under the bus: “We could have stopped the terrorists if CIA had done their job.”

The tendency to blame what goes wrong on people, the bureaucracy, or the thirst for power is part of our mental wiring. But there's much more to understanding a complex situation than assigning blame. Certain universal peculiarities of organizations make them especially difficult to understand or decipher.

Peculiarities of Organizations

Human organizations can be exciting and challenging places. That's how they are often depicted in management texts, corporate annual reports, and fanciful managerial thinking. But they can also be deceptive, confusing, and demoralizing. It is a big mistake to assume that organizations are either snake pits or rose gardens (Schwartz, 1986). Managers need to recognize characteristics of life at work that create opportunities for the wise as well as hidden traps for the unwary. A case from the public sector provides a typical example:

You might have noticed that Helen Demarco's case is more than a little similar to the scandals at Volkswagen in 2015 and Wells Fargo in 2016. At the Geneva International Motor Show in 2012, VW CEO Martin Winterkorn proclaimed that by 2015 the company would cut its vehicles' carbon dioxide emissions by 30 percent from 2006 levels. It was an ambitious goal that would have beaten the targets set by European regulators to combat global warming.

But just like Paul Osborne, Winterkorn had set the bar too high. The engineers saw no way to meet the boss's goals, but no one wanted to tell him it couldn't be done. So, they cheated instead. There was a precedent because VW's cheating on diesel emissions had started back in 2008, and observers reported that “an ingrained fear of delivering bad news to superiors” (Ewing, 2015, p. B3) was a feature of VW's culture.

Like Helen Demarco and her colleagues, the VW engineers had other options but couldn't see them. Paul Osborne and Martin Winterkorn both thought they were providing bold leadership to vault their organizations forward. They were tripped up in part by human fallibility but also by how hard it can be to know what's really going on in any organization. Managerial wisdom and artistry require a well-honed understanding of four key characteristics of organizations.

First, organizations are complex. The behavior of the people who populate them is notoriously hard to predict. Large organizations in particular include a bewildering array of people, departments, technologies, strategies, and goals. Moreover, organizations are open systems dealing with a changing, challenging, and erratic environment. Things can get even messier across multiple organizations. The 9/11 disaster resulted from a chain of events that involved several separate systems. Almost anything can affect everything else in collective activity, generating causal knots that are hard to untangle. After an exhaustive investigation, our picture of 9/11 is woven from sundry evidence, conflicting testimony, and conjecture.

Second, organizations are surprising. What you expect is often not what you get. Paul Osborne saw his plan as a bold leap forward; Helen and her group considered it an expensive albatross. In their view, Osborne was going to make matters worse by trying to improve them. He might have achieved better results by spending more time with his family and letting his organization take care of itself. Martin Winterkorn was stunned when the hidden cheating blew up in his face, costing him his job and hitting VW with devastating financial and reputational damage.

The solution to yesterday's problems often creates tomorrow's obstacles. A friend of ours headed a retail chain. In the firm's early years, he had a problem with two sisters who worked in the same store. To prevent this from recurring, he established a nepotism policy prohibiting members of the same family from working for the company. Years later, two key employees met at work, fell in love, and began to live together. The president was startled when they asked if they could get married without being fired. Taking action in a cooperative venture is like shooting a wobbly cue ball into a scattered array of self-directed billiard balls. Balls bounce in so many directions that it is impossible to know how things will eventually sort out.

Third, organizations are deceptive. They camouflage mistakes and surprises. After 9/11, America's homeland defense organizations tried to conceal their confusion and lack of preparedness for fear of revealing strategic weaknesses. Volkswagen engineers developed software whose only purpose was to cheat on emissions tests, hoping that no one would ever see through their deception. Helen Demarco and her colleagues disguised obfuscation as technical analysis.

It is tempting to blame deceit on individual weakness. Yet Helen Demarco disliked fraud and regretted cheating—she simply believed it was her best option. Sophisticated managers know that what happened to Paul Osborne happens all the time. When a quality initiative fails or a promising product tanks, subordinates often clam up or cover up. They fear that the boss will not listen or will kill the messenger. Internal naysayers at Volkswagen and Wells Fargo Bank were silenced until outsiders “blew the whistle.” A friend in a senior position in a large government agency put it simply: “Communications in organizations are rarely candid, open, or timely.”

Fourth, organizations are ambiguous. Complexity, unpredictability, and deception generate rampant ambiguity, a dense fog that shrouds what happens from day to day. It is hard to get the facts and even harder to know what they mean or what to do about them. Helen Demarco never knew how Paul Osborne really felt, how receptive he was to other points of view, or how open he was to compromise. She and her peers piled on more mystery by conspiring to keep him in the dark.

Ambiguity has many sources. Sometimes available information is incomplete or vague. Different people may interpret the same information in a variety of ways, depending on mind-sets and organizational doctrines. At other times, ambiguity is intentionally manufactured as a smoke screen to conceal problems or avoid conflict. Much of the time, events and processes are so intricate, scattered, and uncoordinated that no one can fully understand—let alone control—the reality. Exhibit 2.1 lists some of the most important sources of organizational uncertainty.

Exhibit 2.1. Sources of Ambiguity.

  • We are not sure what the problem is.
  • We are not sure what is really happening.
  • We are not sure what we want.
  • We do not have the resources we need.
  • We are not sure who is supposed to do what.
  • We are not sure how to get what we want.
  • We are not sure how to determine if we have succeeded.

Source: Adapted from McCaskey (1982).

Organizational Learning

How can lessons be extracted from surroundings that are complex, surprising, deceptive, and ambiguous? It isn't easy. Decades ago, scholars debated whether the idea of organizational learning made sense: Could organizations actually learn, or was learning inherently individual? That debate faded as experience confirmed instances in which individuals learned and organizations didn't, or vice versa. Complex firms such as Apple, Zappos, and Southwest Airlines have “learned” capabilities far beyond individual knowledge. Lessons are enshrined in established protocols and shared cultural codes and traditions. At the same time, individuals often learn even when systems cannot.

Several perspectives on organizational learning are exemplified in the work of Peter Senge (1990), Barry Oshry (1995), and Chris Argyris and Donald Schön (1978, 1996). Senge sees a core-learning dilemma: “We learn best from our experience, but we never directly experience the consequences of many of our decisions” (p. 23). Learning is relatively easy when the link between cause and effect is clear. But complex systems often sever that connection: causes remote from effects, solutions detached from problems, and feedback absent, delayed, or misleading (Cyert and March, 1963; Senge, 1990). Wells Fargo's aggressive push for cross-selling led to cheating from coast to coast, but that was mostly invisible at headquarters, which kept its eyes on the financial results—until the scandal blew up.

Senge emphasizes the value of “system maps” that clarify how a system works. Consider the system created by Robert Nardelli at Home Depot. Nardelli had expected to win the three-way competition to succeed management legend Jack Welch as CEO of General Electric. He was stunned when he learned he didn't get the job. But within a week, he was hired as Home Depot's new CEO. He was a big change from the company's free-spirited founders, who had built the wildly successful retailer on the foundation of an uninhibited, entrepreneurial “orange” culture. Managers ran their stores using “tribal knowledge,” and customers counted on friendly, knowledgeable staff for helpful advice.

Nardelli revamped Home Depot with a heavy dose of command-and-control, discipline, and metrics. Almost all the top executives and many of the frontline managers were replaced, often by ex-military hires. At first, it seemed to work—profits improved, and management experts hailed Nardelli's success. He was even designated Best Manager of 2004 on the cover of Business Week (Business Week, 2005). But employee morale and customer service went steadily downhill. The founders had successfully promoted a “make love to the customers” ethic, but Nardelli's toe-the-line stance pummeled Home Depot to last place in its industry for consumer satisfaction. A website, HomeDepotSucks.com, gave customers a place to vent their rage. As criticism grew, Nardelli tried to keep naysayers at bay, but his efforts failed to placate customers, shareholders, or his board. Nardelli abruptly left Home Depot at the beginning of 2007.

The story is one of many examples of tactics that look good until long-term costs become apparent. A corresponding systems model might look like Exhibit 2.2. The strategy might be cutting training to improve short-term profitability, drinking martinis to relieve stress, offering rebates to entice customers, or borrowing from a loan shark to cover gambling debts. In each case, the results look good at first, and the costs only emerge much later.

Exhibit 2.2. Systems Model with Delay.

Oshry (1995) agrees that system blindness is widespread but highlights causes rooted in troubled relationships between groups that have little grasp of what's going on outside their own neighborhood. Top managers feel overwhelmed by complexity, responsibility, and overwork. They are chronically dissatisfied with subordinates' lack of initiative and creativity. Middle managers, meanwhile, feel trapped between contradictory signals and pressures. The top tells them to take risks but then punishes mistakes. Their subordinates expect them to intervene with the boss and improve working conditions. Top and bottom tug in opposite directions, causing those in the middle to feel pulled apart, confused, and weak. At the bottom, workers feel helpless, unacknowledged, and demoralized. “They give us bad jobs, lousy pay, and lots of orders but never tell us what's really going on. Then they wonder why we don't love our work.” Unless you can step back and see how system dynamics create these patterns, you muddle along blindly, unaware of better options.

Both Oshry and Senge argue that our failure to read system dynamics traps us in cycles of blaming and self-defense. Problems are always someone else's fault. Unlike Senge, who sees gaps between cause and effect as primary barriers to learning, Argyris and Schön emphasize managers' fears and defenses. As a result, “the actions we take to promote productive organizational learning actually inhibit deeper learning” (1996, p. 281).

According to Argyris and Schön, our behavior obstructs learning because we avoid undiscussable issues and tiptoe around organizational taboos. That often seems to work because we avoid conflict and discomfort in the moment, but we create a double bind: we can't solve problems without dealing with the issues we have tried to hide, yet discussing them would expose our cover-up. Facing that double bind, Volkswagen engineers hid their cheating until outsiders finally caught on. Desperate maneuvers to hide the truth and delay the inevitable made the day of reckoning more catastrophic.

Coping with Ambiguity and Complexity

Organizations try to cope with a complicated and uncertain world by making it simpler. One approach to simplification is to develop better systems and technology to collect and process data. Another is to break complex issues into smaller chunks and assign the slices to specialized individuals or units. Still another is to hire or develop professionals with sophisticated expertise in handling thorny problems. These and other methods are helpful but not always sufficient. Despite our best efforts, as we have seen, surprising—and sometimes appalling—events still happen. We need better ways to anticipate problems and to wrestle with them once they arrive.

Making Sense of What's Going On

Some events are so clear and unambiguous that it is easy for people to agree on what is going on. Determining whether a train is on schedule, a plane has landed safely, or a clock is keeping accurate time is fairly straightforward. But most of the important issues confronting leaders are not so clear-cut. Will a reorganization work? Was a meeting successful? Why did a consensual decision backfire? In trying to make sense of complicated and ambiguous situations, humans are often in over their heads, their brains too taxed to decode all the complexity around them. At best, managers can hope to achieve “bounded rationality,” which Foss and Webber (2016) describe in terms of three dimensions:

  1. Processing capacity: Limits of time, memory, attention, and computing speed mean that the brain can only process a fraction of the information that might be relevant in a given situation.
  2. Cognitive economizing: Cognitive limits force human decision makers to use cognitive short-cuts—rules of thumb, mental models, or frames—in order to cut complexity and messiness down to manageable size.
  3. Cognitive biases: Humans tend to interpret incoming information to confirm their existing beliefs, expectations, and values. They often welcome confirming information while ignoring or rejecting disconfirming signals (Foss and Webber, 2016).

Benson (2016) frames cognitive biases in terms of four broad tendencies that create a self-reinforcing cycle (see Exhibit 2.3). To cope with information overload, we filter out most data and see only what seems important and consistent with our current mind-set. That gives us an incomplete picture, but we fill in the gaps and make everything fit with our current beliefs. Then, in order to act quickly instead of getting lost in thought, we favor the easy and obvious over the complex or difficult. We then code our experience into memory by discarding specifics and retaining generalities or by using a few specifics to represent a larger whole. This reinforces our current mental models, which then shape how we process experience in the future.

Exhibit 2.3. Cognitive Biases.

  • Challenge: too much data to process. Solution: filter out everything except what we see as important and consistent with our current beliefs. Risk: we miss things that are important or could help us learn.
  • Challenge: tough to make sense of a confusing, ambiguous world. Solution: fill in gaps and make things fit with our existing stories and mental models. Risk: we create and perpetuate false beliefs and narratives.
  • Challenge: need to act quickly. Solution: jump to conclusions, favoring the simple and obvious over the messy and complex. Risk: quick decisions and actions lead to mistakes and get us in trouble.
  • Challenge: memory overload. Solution: discard specifics to form generalities, or use a few specifics to represent the whole. Risk: error and bias in memory reinforce current mind-sets and biases in information processing.

Source: Adapted from Benson, 2016.

To a greater or lesser degree, we all use these cognitive short-cuts. In the early days of his presidency, Donald Trump's tweet storms and off-the-cuff communications provided prominent examples. In March 2017, he tweeted that his predecessor, Barack Obama, was a “bad (or sick) guy” for tapping Trump's phones prior to the election. Trump apparently based this claim on an article from the right-wing website Breitbart. Since the charge aligned with his worldview, he figured it must be true, and he continued to insist he was right even after investigators concluded the tapping never happened.

Decisions, whether snap judgments or careful calculations, work only if we have adequately sized up the situation. As one highly placed female executive reported to us, “I thought I'd covered all the bases, but then I suddenly realized that the rest of my team were playing football.”

Managers regularly face an unending barrage of puzzles or “messes.” To act without creating more trouble, they must first grasp an accurate picture of what is happening. Then they must move to a deeper level, asking, “What is really going on here?” When this step is omitted, managers too often form superficial analyses and pounce on the solutions nearest at hand or most in vogue. Market share declining? Try strategic planning. Customer complaints? Put in a quality program. Profits down? Time to reengineer or downsize.

A better alternative is to think, to probe more deeply into what is really going on, and to develop an accurate diagnosis. The process is more intuitive than analytic: “[It] is in fact a cognitive process, faster than we recognize and far different from the step-by-step thinking we rely on so willingly. We think conscious thought is somehow better, when in fact, intuition is soaring flight compared to the plodding of logic” (DeBecker, 1997, p. 28).

The ability to size up a situation quickly is at the heart of leadership. Admiral Carlisle Trost, former Chief of Naval Operations, once remarked, “The first responsibility of a leader is to figure out what is going on…That is never easy to do because situations are rarely black or white, they are a pale shade of gray…they are seldom neatly packaged.”

It all adds up to a simple truth that is easy to overlook. The world we perceive is, for the most part, the image we construct in our minds. Ellen Langer, the author of Mindfulness (1989), captures this viewpoint succinctly: “What we have learned to look for in situations determines mostly what we see” (Langer, 2009, p. 33). The ideas or theories we hold determine whether a given situation is foggy or clear, mildly interesting or momentous, a paralyzing disaster, or a genuine learning experience. Personal theories are essential because of a basic fact about human perception: in any situation, there is simply too much happening for us to attend to everything. To help us understand what is going on and what to do next, well-grounded, deeply ingrained personal theories offer two advantages: they tell us what is important and what is safe to ignore, and they group scattered bits of information into manageable patterns. Mental models shape reality.

Research in neuroscience has called into question the old adage, “Seeing is believing.” It has been challenged by its converse: “Believing is seeing.” The brain constructs its own images of reality and then projects them onto the external world (Eagleman, 2011). “Mental models are deeply held internal images of how the world works, images that limit us to familiar ways of thinking and acting. Very often, we are not consciously aware of our mental models or the effects they have on our behavior” (Senge, 1990, p. 8). Reality is therefore what each of us believes it to be. Shermer (2012) tells us that “beliefs come first, explanations for beliefs follow.” Once we form beliefs, we search for ways to explain and defend them. Today's experience becomes tomorrow's fortified theology.

In November 2014, two police officers in Cleveland received a radio report of a “black male sitting on a swing pulling a gun out of his pants and pointing it at people” in a city park (Holloway, 2015). Arriving at the site, one officer spotted the suspect and saw him reach for his gun. The officer immediately shot and killed the suspect. He might have responded differently if the radio report had included two additional details: the caller who made the initial report had said that the suspect might be a juvenile and that the gun was probably fake. The gun was a toy replica of a Colt semiautomatic pistol. The victim, Tamir Rice, was 12 years old but, at 195 pounds, might have looked like an adult at a quick glance.

Perception and judgment involve matching situational cues with previously learned mental models. In this case, the perceptual data were hard to read, and expectations were prejudiced by a key missing clue—the radio operator had never mentioned the possibility of a child with a toy. The officer was expecting a dangerous gunman, and that is what he saw.

Impact of Mental Models

Changing old patterns and mind-sets is difficult. It is also risky; it can lead to analysis paralysis, confusion, and erosion of confidence. Yet we often see no flaws in our current thinking because our theories are self-sealing: they block us from recognizing our errors. Extensive research documents the many ways in which individuals spin reality to protect existing beliefs (see, for example, Garland, 1990; Kühberger, 1995; Staw and Hoang, 1995). In one corporate disaster after another, executives insist that they were not responsible but were the unfortunate victims of circumstances.

Extensive research on the “framing effect” (Kahneman and Tversky, 1979) shows how powerful subtle cues can be. Relatively modest changes in how a problem or decision is framed can have a dramatic impact on how people respond (Shu and Adams, 1995; Gigerenzer, Hoffrage, and Kleinbölting, 1991). One study found that doctors responded more favorably to a treatment with “a one-month survival rate of 90 percent” than one with “a 10 percent mortality rate in the first month,” even though the two are statistically identical (Kahneman, 2011).

Many of us sometimes recognize that our mental models or maps influence how we interpret the world. It is less widely understood that what we expect often determines what we get. Rosenthal and Jacobson (1968) studied schoolteachers who were told that certain students in their classes were “spurters”—students who were “about to bloom.” The so-called spurters, who had been randomly selected, achieved above-average gains on achievement tests. They really did spurt. Somehow, the teachers' expectations were communicated to and assimilated by the students. Medical science is still probing the placebo effect—the power of sugar pills to make people better (Hróbjartsson and Gøtzsche, 2010). Results are attributed to an unexplained change in the patient's belief system. When patients believe they will get better, they do. Similar effects have been replicated in countless reorganizations, new product launches, and new approaches to performance appraisal. All these examples show how hard it is to disentangle reality from the models in our minds.2

Japan has four major spiritual traditions, each with unique beliefs and assumptions: Buddhism, Confucianism, Shintoism, and Taoism. Though they differ greatly in history, traditions, and basic tenets, many Japanese feel no need to choose only one. They use all four, taking advantage of the strengths of each for suitable purposes or occasions.3 The four frames can play a similar role for managers in modern organizations. Rather than portraying the field of organizational theory as fragmented, we present it as pluralistic. Seen this way, the field offers a rich spectrum of mental models or lenses for viewing organizations. Each theoretical tradition is helpful. Each has blind spots. Each tells its own story about organizations. The ability to shift nimbly from one to another helps redefine situations so they become understandable and manageable. The ability to reframe is one of the most powerful capacities of great artists. It can be equally powerful for managers and leaders.

Conclusion

Because organizations are complex, surprising, deceptive, and ambiguous, they are formidably difficult to comprehend and manage. Our preconceived theories, models, and images determine what we see, what we do, and how we judge what we accomplish. Narrow, oversimplified mental models become fallacies that cloud rather than illuminate managerial action. The world of most managers and administrators is a world of messes: complexity, ambiguity, value dilemmas, political pressures, and multiple constituencies. For managers whose images blind them to important parts of this messy reality, it is a world of frustration and failure. For those with better theories and the intuitive capacity to use them with skill and grace, it is a world of excitement and possibility. A mess can be defined as both a troublesome situation and a group of people who eat together. The core challenge of leadership is to move an organization from the former to something more like the latter.

In succeeding chapters, we look at four perspectives, or frames, that have helped managers and leaders find clarity and meaning amid the confusion of organizational life. The frames are grounded in both the cool rationality of management science and the hot fire of actual practice. You can enhance your chances of success with an artful appreciation of how to use the four lenses to understand and influence what's really going on.

Notes
