CHAPTER TWO
Decision-Making Challenges
Two Perspectives on Decision Making
Nothing is more difficult, and therefore more precious, than to be able to decide.
—Napoleon, “Maxims,” 1804
Nothing good ever came from a management decision. Avoid making decisions whenever possible. They can only get you in trouble.
—Dogbert, Dogbert’s Top Secret Management Handbook, 1996
In this chapter, we describe decision-making challenges and introduce some reasons why decision analysis is valuable to decision makers, but also why it may be difficult to apply. The axioms of decision analysis (see Chapter 3) assume rational decision makers operating in efficient organizations, but that may be the exception rather than the rule. Although decision analysis is mathematically sound, it is applied in the context of human decision making and organizational decision processes. As decision analysts, we build models by interacting with decision makers, stakeholders, and subject matter experts (SMEs) in environments where objective data may be scarce, so we must rely on eliciting knowledge from these individuals, who are prone to many cognitive biases. We must therefore develop our soft skills as well as our technical skills. The challenges introduced in this chapter include understanding organizational decision processes, decision traps, and cognitive and motivational biases. Soft skills, such as strategic thinking, leading teams, managing projects, researching, interviewing, and facilitating group meetings, are covered in Chapter 4. Even if we do a superb technical analysis, the most important part of our job remains—communicating the results of the analysis to decision makers and stakeholders, as discussed in Chapter 13.
The rest of the chapter is organized as follows. Section 2.2 discusses the decision-making processes that humans typically employ. Section 2.3 introduces the factors that make decision making a challenge. Section 2.4 introduces the social and organizational factors that impact how a decision analysis can be conducted, including the organizational culture, the impact of stakeholders, and the level at which decisions are made. Section 2.5 discusses issues involved in obtaining credible domain knowledge for making the decision and the role of experts in providing these data. Section 2.6 focuses on the behavioral aspects of decision making, including decision traps and barriers and the cognitive biases that affect the decision-making process. Section 2.7 provides two anecdotes, one of success and one of failure, in supporting the human decision-making process. Section 2.8 sets the stage for the illustrative decision problems used throughout the handbook by establishing their decision-making contexts.
Decision analysis practitioners sometimes take it for granted that people have little difficulty in making decisions. We often assume that thinking about alternatives, preferences, and uncertainty comes naturally to those we are trying to help, and that rational thought is the norm. The reality is that regardless of how many well-documented methodologies with thoroughly proven theorems are provided to them, human decision makers are inconsistent. Roy Gulick, a decision analyst with Decisions and Designs, Inc., coined the term “anatomical decision making” to describe the answers he had gathered from decision makers over the years to the question, “How do you make your decisions?” Why “anatomical decision making”? The responses he received typically included things such as “seat of the pants,” “gut reaction,” “rule of thumb,” “top of the head,” “knee-jerk reaction,” “pulled it out of my … ”—almost every part of the body was mentioned except “the brain”! These responses serve to remind us that no matter how well-honed our analytical methods are, and no matter how much objective data we have, human decision making cannot be overlooked. Humans are sometimes inconsistent, irrational, and subject to cognitive biases. Nonetheless, their subjective judgments, tenuous though they may be, must be included in the decision analysis.
All that said, decision analysts believe that the human decision-making process can be studied systematically, and that coherent, structured, and formal processes are better than purely “anatomical” decision-making processes.
To achieve effective decision making, it is desirable to bring rational decision makers together with high-quality information about alternatives, preferences, and uncertainty. Unfortunately, the information is not always the high quality we would like. While we want to have factual information, very often, erroneous and biased data slip in as well. While we would like to have objective information based upon observed data, sometimes, the best we can obtain is opinion, advice, and conjecture. Similarly, the rational decision makers are not always as rational as we would like. Often, the goals of an organization are ambiguous and conflicting. In many cases, the decision-making environment is characterized by time pressures that impose additional constraints. As a result, the effective decision making that we seek is often less attainable than we desire. One key role of decision analysis thus becomes providing an effective link between the decision makers and the best available information.
Decision problems are complex, and this complexity can be characterized in three dimensions, as shown in Figure 2.1 (Stanford Strategic Decision and Risk Management Decision Leadership Course, 2008). Content complexity ranges from few scenarios with little data and a relatively stable decision-making setting, to many scenarios with data overload, many SMEs involved, and a dynamic decision context. Analytic complexity ranges from deterministic problems with little uncertainty and few fundamental and means objectives to problems with a high degree of uncertainty, many alternatives, and a complicated value hierarchy with many dependencies (see Chapter 7). Organizational complexity ranges from a single decision maker with a homogeneous set of stakeholders to multiple decision makers requiring consensus and a diverse set of stakeholders with conflicting perspectives. The best time to address the organizational complexity is when we are setting up the project structure by engaging the right people in the right way (see Chapter 4).
(Adapted from Stanford Strategic Decision and Risk Management Decision Leadership Course, 2008, used with permission.)
As we discuss in Chapters 9, 10, and 11, when we are eliciting expertise about a decision situation, modeling its consequences, and analyzing the results, it can be helpful to further decompose the analytical and content complexity into five more specific dimensions: value components, uncertainty, strategy, business units, and time.
As decision analysis practitioners, we are asked to go into organizations, whether in the public or private sector, and to work within the existing decision processes. The key thing to remember is that one size does not fit all. Each decision opportunity has its own unique characteristics, and we must be willing and able to adapt to the individuals and the organizational decision-making environment. There are many examples of analytically sound studies that sit on bookshelves or in trashcans because the processes used and the conclusions reached did not “fit” with the existing organizational decision processes. All too often, analysts tend to think of the processes used to “solve” the client problems that they face as purely “technical” processes. Properly applying decision trees, influence diagrams, or Monte Carlo simulations may provide a superb technical solution to the problem, but, by itself, such work can miss what may be the most important part of the solution—its social aspects. Larry Phillips of the London School of Economics describes what decision analysts do as a “socio-technical process” (Phillips, 2007). The way that the technical solution fits into the organizational culture, structure, decision-making style, and other factors may determine its acceptability. These factors are discussed in the next section.
Decision analysis cannot be performed in isolation in any organization. The approach must be tailored to the context of the problem and the culture and environment of the organization. Culture can include many aspects that must be considered. Some of the major factors to consider include:
The most important thing to remember is that as decision analysts, we must be prepared to adapt our techniques and processes to the culture of the organization, especially to the style of the decision maker. Keep in mind the “golden rule” of consulting—“the one with the gold makes the rules.” As we develop our analytical solutions, we must offer a process that matches the organizational culture.
A stakeholder is a person, group, or organization that has direct or indirect stake in an organization because it can affect or be affected by the organization’s actions, objectives, and policies. Key stakeholders in a business organization include creditors, customers, directors, employees, government (and its agencies), owners (shareholders), suppliers, unions, and the community from which the business draws its resources.
(BusinessDictionary.com, 2011)
Stakeholders comprise the set of individuals and organizations that have a vested interest in the problem and its solution (Sage & Armstrong, 2000). Understanding who is affected by a solution to a decision problem provides the foundation for developing a complete definition of the problem. Stakeholders collectively perform many functions. They help frame the problem and specify constraints; they participate in the alternative generation and solution process to include evaluation and scoring; they provide data and subject matter expertise; and they identify and often execute tasks for implementing recommended solutions (Parnell et al., 2011).
A straightforward taxonomy of stakeholders is offered in Decision Making in Systems Engineering and Management (Parnell et al., 2011). Stakeholders can choose to be active or passive when it comes to participating in the decision process. Stakeholders are listed in typical order of relative importance:
Stakeholder analysis is a key technique to ensure that the problem has been fully described before we attempt to obtain a solution. The three most common techniques for stakeholder analysis are interviews, focus groups, and surveys. Several techniques are available for soliciting input from diverse stakeholders, as shown in Table 2.1 (Trainor & Parnell, 2007). The techniques are characterized and compared on five attributes—time commitment, ideal stakeholder group, preparation, execution, and analysis.
Stakeholder analysis is critical, since the fundamental and means objectives (see Chapter 7) are built upon the needs of the stakeholders. Without a clear understanding of the different perspectives and different success criteria upon which alternatives will be judged, the analysis can easily be built upon a shaky foundation that will not withstand the pressures of intense scrutiny and organizational implementation. The best practices for the use of these techniques are presented in Chapter 4.
Decision analysis can be applied at a variety of decision levels in an organization. A common characterization of decision levels includes strategic, operational, and tactical. A good analysis must balance concerns across all three and must consider the dependencies across levels.
This level is focused on the long-term goals and directions of the organization, which are often expressed in the organization’s strategic plan. Strategic decision making is oriented around the organization’s mission and vision for where it wants to be in the future. It addresses very fundamental issues: What business is the organization in, and what business should it be in? What are its core values? What products and services should it deliver? Who are its customer sets? What is management’s intent about how the organization will evolve and grow? From the decision analyst’s perspective, this level of decision making typically involves the fewest viable alternatives, the greatest degree of uncertainty since it is future oriented, and the greatest need for fleshing out the fundamental objectives since statements of strategic goals are often broad and vague. In order to help an organization, the decision analyst must be a strategic thinker. We identify this as one of the soft skills required for decision analysts (Chapter 4).
This level focuses on turning the broad strategic goals into achievable, measurable objectives (Eyes Wide Open: Tips on Strategic, Tactical and Operational Decision Making, 2011). It requires developing actions and allocating resources that will accomplish the objectives. It includes the set of procedures that connects the strategic goals with the day-to-day operational activities of the organization, and its primary purpose is to enable the organization to be successful as a whole rather than as independent parts (Tactical Decision Making in Organizations, 2011). From the decision analyst’s perspective, it is important to identify redundancies and synergies in the alternatives, to conduct value of information analysis (see Chapter 11) to avoid modeling uncertainties that do not affect decisions, to fully understand how to decompose fundamental objectives of the organization into manageable means objectives, and to avoid suboptimization (see Chapter 7).
This level focuses on day-to-day operational decisions, particularly on how the organization allocates scarce resources. Decisions are short term, and the decision context can change rapidly (Eyes Wide Open: Tips on Strategic, Tactical and Operational Decision Making, 2011). In a business context, it can be highly reactive since much depends upon the competitive environment. From the decision analyst’s perspective, it frequently involves rapid-response, “quick turn” analyses with little data other than that of SMEs. Benefit/cost analysis, including net present value (NPV) analysis, and high-level multiple-objective decision analysis (MODA) are frequently used tools that are appropriate for longer time horizons.
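The benefit/cost and NPV tools mentioned above reduce to simple discounting arithmetic. The sketch below is a minimal illustration of both; the cash flows and the 8% discount rate are hypothetical assumptions, not figures from any example in this handbook.

```python
# Minimal sketch of NPV and benefit/cost ratio calculations.
# All cash flows and the discount rate are illustrative assumptions.

def npv(rate, cash_flows):
    """Net present value of cash flows occurring at the end of years 0, 1, 2, ..."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Year 0: a 100-unit investment; years 1-4: operating costs and benefits.
costs = [-100.0, -10.0, -10.0, -10.0, -10.0]
benefits = [0.0, 40.0, 45.0, 50.0, 50.0]
rate = 0.08

pv_costs = -npv(rate, costs)        # present value of all costs, as a positive number
pv_benefits = npv(rate, benefits)   # present value of all benefits

net = npv(rate, [c + b for c, b in zip(costs, benefits)])
print(f"NPV = {net:.1f}")
print(f"Benefit/cost ratio = {pv_benefits / pv_costs:.2f}")
```

An alternative with a positive NPV (or a benefit/cost ratio above 1.0) adds value at the assumed discount rate; ranking alternatives by these figures is the usual starting point for the resource allocation decisions described above.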
Identifying the decisions to be made in a decision analysis is a nontrivial task. Knowing the decision level is one useful technique. In Chapter 6, we introduce the decision hierarchy, which is a decision framing tool to help define the decisions in sufficient detail to perform a decision analysis.
In theory, it is easy to think of the decision analyst as working directly with the decision maker to build models and solve problems. In practice, it is rarely that straightforward. Most decision analyses rely on knowledge that is dispersed among many experts and stakeholders. The views of individual decision makers are subject to both cognitive and motivational biases, and often, these views must be interpreted and implemented by groups of stakeholders in multiple organizations. The decision analyst is often asked to take on a complicated role other than model builder—the analyst must be the facilitator who translates management perspective to others, who balances conflicting perspectives of stakeholders, who elicits knowledge from dispersed SMEs, and who combines these varied ingredients into a composite model that organizes and integrates the knowledge (see Chapter 9). While unanimity among participants is a noble goal, it is exceptionally rare. Consensus is a more achievable goal—if it is defined as developing a solution that everyone can “live with” rather than as any form of unanimity. But even to achieve consensus, the decision analyst must use modeling and facilitation skills to bring together technical knowledge (often the purview of scientists and engineers) with business knowledge (often the purview of managers and financial personnel) in the particular domain at hand. This is no easy task as sources of such knowledge can be varied and uneven in quality.
For a decision analysis to be credible, it must be based upon sound technical knowledge in the problem domain. In some cases, the decision maker may have such knowledge from coming up through the ranks. In others, the decision maker is more focused on the business side of the organization and relies upon the technical staff of scientists, engineers, and others to provide such knowledge. In some ways, it is easier to reconcile conflicting opinions on technical matters than on business matters since technical matters tend to be more factually based and objective. That said, sometimes it is difficult to establish which “facts” to believe, particularly on controversial issues, such as global warming!
There can be a huge difference in the level of domain technical knowledge required of the decision analyst. Specific technical knowledge may be demanded of a decision analyst who is internal to an organization. This is typical, for example, in the oil and gas industry and in the pharmaceutical industry. For a decision analyst external to an organization, there may be less expectation of technical knowledge, but rather the expectation is that the decision analyst can work with a group of technical experts to identify and model the key concerns. In fact, in many consulting firms, it is considered to be an advantage for the decision analyst to not be burdened by having to be an expert in the technical aspects of the organization; this allows the decision analyst to focus on the decision process.
For either the internal or the external decision analyst, it is essential to help the client develop a clear set of objectives, a range of possible outcomes, and probability distributions, to flesh out the key assumptions and constraints, to understand the factors that could create extreme outcomes, and to document how the technical knowledge obtained from others is used.
While a firm grasp on domain technical knowledge is essential for credibility, a firm grasp on business knowledge is essential for success. Such knowledge includes analysis of the competition, the economic environment, the legislative environment, and required rates of return, among other business environment areas. As with technical knowledge, it would not be unusual for the decision analyst not to be an SME in these areas, but rather to obtain the knowledge required for decisions from business experts internal or external to the organization. Familiarity with corporate financial reports, benefit/cost analysis, costing approaches, net present value calculations, and portfolio theory may be essential for success.
The role of experts in decision analyses is not always as straightforward as one might think. Clearly, they provide factual, objective information in the areas of their expertise. Whether it is actual hard technical data, actual performance data, or projected performance data on proposed systems, most technical experts are comfortable providing both point estimates and uncertainty ranges on such data. Where it becomes murkier is when SMEs or technical experts are asked to provide “preference” or “value judgment” estimates that may require access to stakeholders and decision makers.
As one would expect, not all experts are created equal. Some have more technical or business knowledge than others, and it is not always easy for the decision analyst to determine their limitations. Even experts are subject to motivational and cognitive biases—in fact, as will be pointed out in Section 2.6.2, experts may be even more subject to biases, such as failing to spread probability distributions widely enough. As indicated above, some experts are very uncomfortable doing anything other than “reporting” on what they know and may be unwilling to provide value judgments. Some experts will attempt to dominate the group by citing that their knowledge is more recent or more authoritative, and this may have the effect of “shutting down” other experts who are present. Some will “stretch” far beyond their actual areas of expertise, often providing a mix of very good quality information and less credible information. One of the greatest challenges for the decision analyst is to determine the bona fides of both the experts and the expertise they provide and to determine what can and cannot be used.
This section provides insights into the decision traps and barriers that get in the way of good decision analysis practice, as well as into the cognitive and motivational biases that can impact the quality of the knowledge we elicit from decision makers and SMEs.
An excellent summary of behavioral decision insights and barriers to good decision making can be found in Decision Traps (Russo & Schoemaker, 1989). The authors describe the ten most dangerous decision traps as follows:
The first author of this chapter has compiled a similar list of barriers to good decision analysis based upon his experiences over the last 35 years as follows:
“Cognitive biases are mental errors caused by our simplified information processing strategies. A cognitive bias is a mental error that is consistent and predictable” (Heuer, 1999). As analysts attempt to elicit value judgments and probabilities from decision makers and SMEs, they must confront many of the cognitive biases that are well documented in the behavioral decision analysis literature. Some of the biases are related to decision making, some to probability or value assessment, and some to personal motivation (motivation to have positive attitudes toward oneself). This section provides a quick overview of the most common biases. The letters DM after the name of the bias refer to decision-making and behavioral biases, P to probability or belief biases, and M to motivational biases.
These and other biases can lead to common assessment mistakes to include the following:
The first step in overcoming these biases is to be aware of and to recognize them when they occur. Additional ways to counter biases and faulty assessment heuristics include the following:
We highlight the key points of this chapter with two anecdotes that show the importance that human decision making plays in determining the success or failure of a decision analysis.
In 1978, decision analysts from a decision analysis consulting firm began working with the U.S. Marine Corps (USMC) to develop a new approach for prioritizing items to be funded in the annual budget cycle. They had a “champion” in the form of a young Colonel who was fighting a “that’s not how we do it here” attitude in trying to implement an innovative, decision analysis approach. By developing a sound prioritization method based on benefit/cost analysis, tailoring it to the organizational culture and demands, and evolving it as the organizational considerations changed, the decision analysts developed a facilitated process that is still being used today. It is still being facilitated by some of the same decision analysts more than 30 years later. Over the years, the Marines have tried other approaches to prioritizing budgetary items, but they continue to return to the decision analysis framework. In a presentation at INFORMS, the following reasons were given for why the decision process has been so successful (Leitch et al., 1999):
As a point of interest, the young Colonel who took the risk of innovating and who made it all happen was P.X. Kelly, who later became a four-star general and served as Commandant of the Marine Corps.
In 2000, decision analysts from a consulting firm were asked by the Chief Systems Engineer of a major intelligence agency to develop a methodology and process for putting together the annual budget. The existing process was stove-piped and highly parochial, and there was little collaboration among those fighting for shares of the budget, thus leading to suboptimization. The guidance from the decision maker was to put in place a process that would allocate resources efficiently, but more importantly, that would break down internal barriers, encourage cross-discipline discussion, and foster shared purpose. A facilitated process was established and evolved over seven years; it included detailed stakeholder analysis (both internal and external), prioritization of fundamental goals and objectives, a detailed multiple-objective value model with scenario-specific value curves, and clear communication tools. The process worked well over the first few years, and collaborative analysis was greatly enhanced. Few argued with the technical correctness of the analytical model. However, the more the process did, the more it was asked to do. Instead of addressing the strategic and tactical decisions for which it was designed, it was being used for day-to-day operational decisions for which it did not have the right degree of sensitivity and granularity to discriminate value among the alternatives. The data demands grew beyond what the stakeholders could tolerate and accommodate, stakeholder support waned, and the process began to collapse under its own weight. A subsequent decision maker determined that a simpler process was needed, and mandated that no value curves or weights be used and that facilitated processes were unnecessary since SMEs internal to the organization could prioritize initiatives on their own.
The organization is currently engaged in developing a new process “from scratch” that is “based on only objective data” and better fits the evolved culture and constraints of the organization and the decision-making style of the decision maker. After all, that is the golden rule of management!
This chapter discusses decision-making challenges in general. We now describe those challenges further in the context of the three illustrative example problems introduced in Chapter 1. See Table 1.3 for further information on the illustrative examples.
Roughneck Oil and Gas is a global oil and gas company with operations in North America. They had an advocacy-based approach to decision making that led to narrowly framed decisions and only incremental change from one year to the next. Decision makers were not familiar with creating distinctly different alternatives, and analysts had no experience accounting for the actual range of outcomes that might be encountered. The organization had many silos, with little communication between business areas. Within this situation, there was a desire to take a broad strategic look at the assets in North America.
Compared with traditional pharmaceuticals, personalized medicine decision-making creates additional complexity and challenges for a drug development team, particularly for a young biotech company like DNA Biologics.
To make the Geneptin decision, DNA Biologics had to wrestle with a number of cultural and organizational issues. How formal or informal should the decision process be? Should the approach be driven more by data and analysis or more by intuition and experience? To what extent should the senior management be involved in the decision process—fully engaged or a “just show me your final recommendation” approach? Which of the different organizational “habits” or cultures for building consensus should be employed —a top-down, hierarchical approach or a collaborative approach based on preagreed upon criteria? To what extent should the company’s decision be driven by science and innovation versus by commercial value?
These organizational and cultural aspects are not specific to personalized medicine, but because of inertia, lack of sufficient knowledge, and higher level of uncertainty associated with personalized medicine, these challenges were amplified at DNA Biologics.
Additional complexity and uncertainty associated with personalized medicine can make the decision analysis more challenging. There are additional decisions to make regarding the design, pricing, and positioning of diagnostic biomarker tests. There are additional variables to consider, including biomarker prevalence, addressable patient population, market share within the addressable patient population, and probability of success of both the drug and the companion diagnostic test. Personalized medicine brings different tradeoffs; for example, a biomarker reduces the addressable patient population but can increase the drug’s market share within the stratified patient segment. Some variables have a different impact on value; for example, personalized medicine R&D costs may be higher or lower than traditional costs. Other variables serve as new drivers of value; for example, a personalized medicine can offer a better benefit/risk ratio, allowing patients to potentially take it for a longer duration and allowing drug manufacturers to provide a more compelling value proposition to payers for reimbursement and to physicians and patients for clinical adoption, increasing the commercial value to the drug company.
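The tradeoff described above, a smaller addressable population against higher share and higher probability of success, can be made concrete with a back-of-the-envelope risk-adjusted sales calculation. All numbers in the sketch below are hypothetical assumptions for illustration, not figures from the Geneptin case.

```python
# Illustrative sketch of the personalized-medicine tradeoff.
# All inputs are hypothetical assumptions, not data from the Geneptin case.

def expected_peak_sales(population, share, price, p_success):
    """Risk-adjusted peak annual sales under a simple multiplicative model."""
    return population * share * price * p_success

total_population = 100_000  # patients eligible for a traditional drug
price = 20_000              # assumed annual price per patient

# Traditional drug: whole population, modest share, one technical risk.
traditional = expected_peak_sales(total_population, 0.15, price, 0.50)

# Personalized drug: a 40%-prevalence biomarker shrinks the addressable
# population, but share and probability of technical success rise, while
# the companion diagnostic adds its own success risk (0.65 * 0.90).
biomarker_prevalence = 0.40
personalized = expected_peak_sales(
    total_population * biomarker_prevalence, 0.45, price, 0.65 * 0.90)

print(f"Traditional:  ${traditional / 1e6:.0f}M risk-adjusted peak sales")
print(f"Personalized: ${personalized / 1e6:.0f}M risk-adjusted peak sales")
```

With these assumed inputs the stratified strategy comes out ahead despite addressing only 40% of the patients, which is exactly the kind of tradeoff the DNA Biologics team had to weigh; small changes in prevalence or share assumptions can reverse the ranking, so sensitivity analysis on these variables is essential.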
A major government agency had an expanding mission that required significant increases in data analysis. The agency’s headquarters data centers were already operating at capacity, and the existing data centers lacked additional floor space, power, and cooling. The agency had identified the need for a large new data center and had already ordered a significant number of new servers, but had no process to verify that sufficient data center capacity would be provided to support mission demands. They also had no process for determining the best location of the data center.
There were several complicating factors in the decision. Technology advances had resulted in large servers becoming smaller, consuming more power, and requiring more cooling. Multiple stakeholder organizations within the agency were involved in the decision, including the mission-oriented operational users, the information technology office responsible for procuring and operating, and the logistics office responsible for facilities, power, and cooling. Some of their objectives conflicted with each other. Each believed that it should be the final decision authority, and each believed it had the expertise to make the decisions without using SMEs from the others. All of the organizations appeared to be biased toward solutions with which they were already familiar and comfortable. Life cycle costs were a major factor in the selection of the best data center alternative, and budgets for IT had been shrinking. Multiple approval levels were necessary to obtain the funds from both within and outside the agency. This included the requirement to communicate the need for funds to Congress. The agency had a history of getting what it asked for, but it had been challenged more and more to justify requests for additional funds.
For a decision analysis to be useful, it must be theoretically sound, performed with accepted analytical techniques, and methodologically defensible. But that is not enough. It must work within the organizational culture and be compatible with the decision-making style of the decision makers. It must be based on sound objective data when they are available and on sound subjective information from credible experts when objective data are not available. It must be as free as possible from the cognitive and motivational biases of the participants. It frequently must reconcile stakeholders’ positions based on conflicting perspectives and, often, conflicting information. It often must reach a single conclusion that can be agreed upon by participants who at times have little or no motivation to reach consensus. And finally, it must be communicated clearly and effectively.
These challenges demand that decision analysts be far more than technical experts. We must be knowledge elicitors to gain the information required. We must be facilitators to help overcome group decision-making barriers. We must be effective communicators to prepare presentations and reports that will be read, understood, and accepted. Most of all, we must recognize that decision analysis is not just a science but an art, and we must be fluent in both.
REFERENCES
Adams, J. (1979). Conceptual Blockbusting. Stanford, CA: Stanford Alumni Association.
Baron, J. (2000). Thinking and Deciding, 3rd ed. New York: Cambridge University Press.
BusinessDictionary.com. Stakeholder Definition. 2011.
Eyes Wide Open. Tips on Strategic, Tactical and Operational Decision Making (2011). http://www.smallbusinesshq.com.au/factsheet/20305-tips-on-strategic-tactical-and-operational-decision-making.htm.
Goodie, F. (2011). Cognitive distortion as a component and treatment focus of pathological gambling: A review. Psychology of Addictive Behaviors, 26(2), 298–310.
Heuer, R. Jr. (1999). Psychology of Intelligence Analysis. McLean, VA: Central Intelligence Agency.
Kruger, J. (1999). Lake Wobegon be gone! The “below-average effect” and the egocentric nature of comparative ability judgments. Journal of Personality and Social Psychology, 77(2), 221–232.
Leitch, S., Kuskey, K., Buede, D., & Bresnick, T. Of princes, frogs, and marine corps’ budgets: Institutionalizing decision analysis over 23 years. INFORMS. Philadelphia, 1999.
Mellers, B. & Locke, C. (2007). What have we learned from our mistakes? In W. Edwards, R. Miles, & D. von Winterfeldt (eds.), Advances in Decision Analysis, pp. 351–371. New York: Cambridge University Press.
Miller, D.T. & Ross, M. (1975). Self-serving biases in the attribution of causality: Fact or fiction? Psychological Bulletin, 82(2), 213–225.
Myers, D. (1994). Did You Know It All Along? Exploring Social Psychology. New York: McGraw-Hill, 15–19.
O’Boyle, J.G. The culture of decision-making. R&D Innovator, 1996.
Oswald, M. & Stefan, G. (2004). Confirmation bias. In R.F. Pohl (ed.), Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory. Hove, UK: Psychology Press.
Parnell, G.S., Driscoll, P., & Henderson, D. (eds.) (2011). Decision Making in Systems Engineering and Management. Hoboken, NJ: John Wiley & Sons.
Phillips, L. (2007). Decision conferencing. In W. Edwards, R. Miles, & D. von Winterfeldt (eds.), Advances in Decision Analysis, pp. 375–398. New York: Cambridge University Press.
Pohl, R.F. (2004). Hindsight bias. In R.F. Pohl (ed.), Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory. Hove, UK: Psychology Press.
Sage, A. & Armstrong, J. (2000). Introduction to Systems Engineering. New York: John Wiley & Sons.
Russo, J.E. & Schoemaker, P.J.H. (1989). Decision Traps. New York: Bantam/Doubleday/Dell Publishing Group.
Silverman, B.G. (1992). Modeling and critiquing the confirmation bias in human reasoning. IEEE Transactions on Systems, Man, and Cybernetics, Sept/Oct: 972–982.
Society for Decision Professionals. (2012). SDP Home Page. http://www.decisionprofessionals.com, accessed April 2012.
Stanford Strategic Decision and Risk Management Decision Leadership Course (2008).
Tactical decision making in organizations. (2011). http://main.vanthinking.com/index.php/20080910125/Tactical-Decision-Making-in-Organizations.html, accessed 2011.
Trainor, T. & Parnell, G. Using stakeholder analysis to define the problem in systems engineering. INCOSE, 2007.
Tversky, A. & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124–1131.
Tversky, A. & Kahneman, D. (1982). Evidential impact of base rates. In P. Slovic, A. Tversky, & D. Kahneman (eds.), Judgment under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press.