Chapter 5
Understanding Decision Management

Matthew Cilli

U.S. Army, Armament Research Development and Engineering Center (ARDEC), Systems Analysis Division, Picatinny, NJ, USA

Gregory S. Parnell

Department of Industrial Engineering, University of Arkansas, Fayetteville, AR, USA

 

Decide what you want, decide what you are willing to exchange for it. Establish your priorities and go to work.

(H. L. Hunt)

5.1 Introduction1

Successful systems engineering requires good decision-making. Many systems engineering decisions are difficult in that they include multiple competing objectives, numerous stakeholders, substantial uncertainty, significant consequences, and high accountability. In these cases, good decision-making requires a formal decision management process. The purpose of the decision management process, as defined by ISO/IEC 15288:2015, is “…to provide a structured, analytical framework for identifying, characterizing and evaluating a set of alternatives for a decision at any point in the life-cycle and select the most beneficial course of action.” This chapter aligns with the structure and principles of the Decision Management Process section of the INCOSE Systems Engineering Handbook v4.0 (INCOSE SE Handbook Working Group, 2015) and presents the decision management process steps as described therein (written permission from INCOSE Handbook Working Group pending), and it expands on the SEBoK section on Decision Management (http://sebokwiki.org/wiki/Decision_Management). Building upon that foundation, this chapter adds a significant amount of text and introduces several illustrations to provide richer discussion and finer clarity.

5.2 Decision Process Context

A formal decision management process is the transformation of a broadly stated decision situation (see Chapter 4) into a recommended course of action and associated implementation plan. The process is executed by a resourced decision team that consists of a decision-maker with full responsibility, authority, and accountability for the decision at hand, a decision analyst with a suite of reasoning tools, subject matter experts with performance models, and a representative set of end users and other stakeholders (Parnell et al., 2013). The decision process is executed within the policy and guidelines established by the sponsoring agent. The formal decision management process realizes this transformation through a structured set of activities described later in this chapter. Note that the process presented here does not replace the engineering models, performance models, operational models, cost models, and expert opinion prevalent in many enterprises; rather, it complements such tools by synthesizing their outputs in a way that helps decision-makers thoroughly compare the relative merits of each alternative in the presence of competing objectives and uncertainty (Buede, 2009; Parnell et al., 2011).

Models are central to systems analysis and trade-off analysis. A decision support model is a composite model that integrates outputs of otherwise separate models into a holistic system view mapping critical design choices to consequences relevant to stakeholders. A decision support model helps decision-maker(s) overcome cognitive limits without oversimplifying the problem.

Early in the life cycle, inputs to the decision management process are often little more than broad statements of the decision situation. As such, systems engineers should not expect to receive a well-structured problem statement as input to the decision management process. In later stages of the system life cycle, the inputs usually include models and simulations, test results, and operational data. Opportunities to use a decision management process as part of a systems engineering trade-off analysis throughout the system analysis life cycle are illustrated in Figure 5.3.

The ultimate output of the decision management process should be a recommended course of action and associated implementation plan provided in the form of a high-quality decision report. The decision report should communicate key findings through effective tradespace visualizations underpinned by defendable rationale grounded in analysis results that are repeatable and traceable. As decision-makers seek to understand root causes of top-level observations and build their own understanding of the trade-offs, the ability to rapidly drill down from top-level tradespace visualizations into lower level analyses and data supporting the synthesized view is often beneficial.

5.3 Decision Process Activities

The decision analysis process as described in Parnell et al. (2013) and Parnell et al. (2011) can be summarized in 10 process steps: (i) frame decision and tailor process, (ii) develop objectives and measures, (iii) generate creative alternatives, (iv) assess alternatives via deterministic analysis, (v) synthesize results, (vi) identify uncertainty and conduct probabilistic analysis, (vii) assess impact of uncertainty, (viii) improve alternatives, (ix) communicate trade-offs, and (x) present recommendation and implementation plan. An illustration of this 10-step decision process interpretation is provided in Figure 5.1 (http://sebokwiki.org/wiki/Decision_Management, 2015).


Figure 5.1 Decision analysis process (Courtesy of Matthew Cilli)

Applying this decision process to a new product development context calls for the integration of this process with the systems engineering process. The systems engineering process provides the holistic, structured thinking perspective required for the design and development of complex systems, while the analytics-based decision process provides the mathematical rigor needed to properly represent and communicate reasoning and produce meaningful visualization of the tradespace. Figure 5.2 provides a process map of this analytical decision process integrated with six of the systems engineering technical processes, one cross-cutting method, and three technical management processes as described in the INCOSE Systems Engineering Handbook V4. This integrated process will be referred to as the Integrated Systems Engineering Decision Management (ISEDM) Process throughout the rest of this chapter.


Figure 5.2 Integrated Systems Engineering Decision Management (ISEDM) Process Map


Figure 5.3 Trade-off studies throughout the system's development life cycle

The white text within the outer green ring identifies elements of the systems engineering processes, while the 10 arrows forming the inner ring represent the 10 steps of the decision management process. Interactions between the systems engineering processes and the decision process are represented by the small, dotted green or blue arrows. These interactions are discussed briefly in the subsequent sections of this chapter. (The reader is referred to the online version of this book for color indication.)

The focus of the process is to find system solutions that best balance competing objectives in the presence of uncertainty, as shown in the center of Figure 5.2. This single focus is important, as it can be argued that all systems engineering activities should be conducted within the context of supporting good decision-making. If a systems engineering activity cannot be traced to at least one of the many decisions embedded in a system's life cycle, one must wonder why the activity is being conducted at all. Positioning decision management as central to systems engineering activity ensures that the efforts are rightfully interpreted as relevant and meaningful and thus maximizes the discipline's value proposition to new product developers and stakeholders.

The decision management process is iterative, remaining open to change and adapting as understanding of the decision and the tradespace emerges with each activity. The circular shape of the process map is meant to convey the notion of an iterative process with significant interaction between the process steps. The feedback loops seek to capture new information regarding the decision task at any point in the decision process and make appropriate adjustments.

Table 5.1 provides a crosswalk between the systems engineering terms used in Figure 5.2, the corresponding section of ISO/IEC/IEEE 15288:2015 Systems and Software Engineering – System Life Cycle Processes, and the section of the INCOSE Systems Engineering Handbook V4 devoted to each term.

Table 5.1 Crosswalk Between SE Terms in Figure 5.2 and INCOSE Systems Engineering Handbook V4 and ISO/IEC/IEEE 15288:2015

Systems Engineering Terms in Figure 5.2 ISO/IEC/IEEE 15288:2015 Section INCOSE Systems Engineering Handbook V4 Section
Business or mission analysis process 6.4.1 4.1
Stakeholder needs and requirements definition process 6.4.2 4.2
System requirements definition process 6.4.3 4.3
Architecture definition process 6.4.4 4.4
Design definition process 6.4.5 4.5
Measurement process 6.3.7 5.7
System analysis process 6.4.6 4.6
Modeling and simulation - 9.1
Risk management process 6.3.4 5.4
Project planning process 6.3.1 5.1

The ISEDM process can be used for trade-off analyses encountered across the system's development life cycle – tailored to the particulars of the decision situation. Figure 5.3 adds the ISEDM process icon several times to the generic life cycle model put forth in the INCOSE Systems Engineering Handbook V4 to illustrate key opportunities to use the process to execute systems engineering trade-off analyses throughout the systems development life cycle.

5.3.1 Frame Decision

The first step of the decision management process is to frame the decision and to tailor the decision process. To help ensure that the decision-makers and stakeholders fully understand the decision context and to enhance the overall traceability of the decision, the systems engineer should capture a description of the system baseline as well as a notion of how the envisioned system will be used (concept of operations), along with system boundaries and anticipated interfaces. Decision context includes such details as the timeframe allotted for the decisions, an explicit list of decision-makers and stakeholders, available resources, and expectations regarding the type of action to be taken as a result of the decision at hand as well as decisions anticipated in the future (Edwards et al. 2007). The best practice is to identify a decision problem statement that defines the decision in terms of the system life cycle. Next, three categories of decisions should be listed: decisions that have been made, decisions to be made now, and subsequent decisions that can be made later in the life cycle. Effort is then focused on the decisions to be made now.

Once the decision at hand is sufficiently framed, systems engineers must select the analytical approach that best fits the frame and structure of the decision problem at hand. For deterministic problems, optimization models can explore the decision space. However, when there are “… clear, important, and discrete events that stand between the implementation of the alternatives and the eventual consequences…” (Edwards et al., 2007), a decision tree is a well-suited analytical approach, especially when the decision structure has only a few decision nodes and chance nodes. As the number of decision nodes and chance nodes grows, the decision tree quickly becomes unwieldy and loses some of its communicative power. Moreover, decision trees and many optimization models require consequences to be expressed in terms of a single number. This is readily accomplished for decision situations where the potential consequences of alternatives can be monetized and end state consequences can be expressed in dollars, euros, yen, and so on. When the potential consequences of alternatives within a decision problem cannot be easily monetized, an objective function can often be formulated to synthesize an alternative's response across multiple, often competing, objectives. A best practice for this type of problem is the multiple objective decision analysis (MODA) approach (Chapter 2).
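To make the single-number requirement concrete, the short sketch below rolls back a two-alternative decision tree: each chance node is reduced to its probability-weighted expected value, and the decision node selects the branch with the highest expected value. The alternatives, probabilities, and monetized consequences are hypothetical illustrations, not taken from any referenced study.

```python
# Minimal sketch of decision-tree rollback with monetized consequences.
# The alternatives, probabilities, and dollar values are hypothetical.

def expected_value(chance_node):
    """Roll back a chance node: probability-weighted sum of its outcomes."""
    return sum(p * consequence for p, consequence in chance_node)

# Each alternative leads to one chance node: a list of (probability, $ consequence).
alternatives = {
    "Develop in-house": [(0.6, 12.0e6), (0.4, -3.0e6)],   # succeed / fail
    "Buy commercial":   [(0.9,  7.5e6), (0.1, -1.0e6)],
}

# At the decision node, choose the branch with the highest expected value.
best = max(alternatives, key=lambda a: expected_value(alternatives[a]))
for name, node in alternatives.items():
    print(f"{name}: expected value = ${expected_value(node):,.0f}")
print("Preferred branch:", best)
```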

The decision management method most commonly employed by systems engineers is the trade study, which more often than not employs some form of MODA. The aim is to define, measure, and assess shareholder and stakeholder values and then synthesize this information to facilitate the decision-maker's search for an alternative that represents the optimally balanced response to often competing objectives. Major system projects often generate large amounts of data from many separate analyses performed at the system, subsystem, component, or technology level by different organizations. Each analysis, however, only delivers one dimension of the decision at hand, one piece of the puzzle that the decision-makers are trying to assemble. These analyses may have varying assumptions and may be reported as standalone documents, from which decision-makers must somehow aggregate system-level data for all alternatives across all dimensions of the tradespace in their heads. This is an ill-fated task, as all decision-makers and stakeholders have cognitive limits that preclude them from successfully processing this amount of information in their short-term memory (Miller 1956). When faced with a deluge of information that exceeds human cognitive limits, decision-makers may be tempted to oversimplify the tradespace by drastically truncating objectives and/or reducing the set of alternatives under consideration, but such oversimplification runs a high risk of generating decisions that lead to poor outcomes.

By providing techniques to decompose a trade-off decision into logical segments and then synthesize the parts into a coherent whole, a formal decision management process offers an approach that allows the decision-makers to work within human cognitive limits without oversimplifying the problem. In addition, by decomposing the overall decision problem into smaller elements, experts can provide assessments of alternatives as they perform within the objective associated with their area of expertise. Buede and Choisser put it this way,

These component parts can be subdivided as finely as needed so that the total expertise of the system design team can be focused, in turn, on specific, well-defined issues. The analyses on the component parts can then be combined appropriately to achieve overall results that the decision makers can use confidently. The benefits to the decision maker of using this approach include increased objectivity, less risk of overlooking significant factors and, perhaps most importantly, the ability to reconstruct the selection process in explaining the system recommendation to others. Intuition is not easily reproducible.

(Buede & Choisser 1992)

MODA approaches generally differ in the techniques used to elicit values from stakeholders, the use of screening techniques, the degree to which an alternative's responses to objectives (and subobjectives) are aggregated, the mathematics used to aggregate such responses, the treatment of uncertainty, the robustness of sensitivity analyses, the search for improved alternatives, and the versatility and quality of tradespace visualization outputs. If time and funding allow, systems engineers may want to conduct trade-off studies using several techniques, compare and contrast results, and reconcile any differences to ensure that the findings are robust. Although there are many possible ways to specifically implement MODA, the discussion contained in the rest of this chapter represents a short summary of best practices.

5.3.1.1 Example of Framing the Decision

As an example of decision management process execution, consider the hypothetical small unmanned aerial system (sUAS) case study introduced in the following paragraphs. Note that the lead author of this chapter created a plausible and sufficiently rich example by distilling the technical ideas presented in the 805-page textbook by Dr. Jay Gundlach, Designing Unmanned Aircraft Systems: A Comprehensive Approach. The lead author used Gundlach's textbook to inform the physical architecture descriptions of the notional UAVs and the stakeholder requirements created for the case study that follows, but no attempt was made to use the mathematical relationships provided in the textbook to generate cost, schedule, and performance estimates for the sUAV concepts within the case study of this chapter. All estimates should be considered illustrative.

We assume that the military is contemplating the start of a new effort to develop the next-generation sUAS and that a lead systems engineer has been tasked to conduct a systems engineering trade-off analysis to identify system concepts in order to inform requirements generation. The lead systems engineer is told that the future system will be used primarily in an Intelligence, Surveillance, and Reconnaissance (ISR) mission context, as current systems are, but that instead of operating at altitudes of 500–1000 ft, the sUAS will be expected to operate at an altitude of 3000 ft in order to avoid airspace conflicts with military helicopters and to reduce the likelihood of being detected by enemy forces. The lead systems engineer is also told that the new capability should be operational within 7 years and that the life cycle costs must be affordable.

The lead systems engineer is excited about the opportunity but is a bit anxious about the ambiguity surrounding the problem statement. To curb some of the anxiety, he begins by asking clarification questions regarding the decision timeframe, system boundaries, and expectations regarding affordability. He learns that the final report is due in 12 months, with executive-level reviews scheduled every quarter and preliminary study findings expected by the third review. He also learns that the system boundaries include the air vehicle, the ground elements, and the communication links between them. With regard to affordability, he is told not to initially discard concepts on a cost basis but rather to collect rough order of magnitude life cycle cost estimates for each concept and show the cost versus performance versus schedule relationship for each. With this information and the information from similar trades being conducted elsewhere in the portfolio, the executive decision board will determine appropriate affordability goals for the next phase of the future sUAS development effort.

Armed with the initial framing of the trade at hand, the lead systems engineer begins wondering how the goodness of system alternatives should be defined. The next section of this chapter addresses the best practices associated with developing objectives and measures and is immediately followed by the continuation of the sUAV case study.

5.3.2 Develop Objectives and Measures

Defining how a decision will be made may seem straightforward, but it often becomes an arduous task of seeking clarity amidst a large number of ambiguous stakeholder need statements. The first step is to use the information obtained from the Stakeholder Requirements Definition Process, Requirements Analysis Process, and Requirements Management Process to develop objectives and measures. If these processes have not been started, then stakeholder analysis is required. Often, this begins with reading documentation on the decision topic, followed by visits to as many decision-makers and stakeholders as is reasonable to facilitate discussion about the decision problem. This is best done with interviews and focus groups with subject matter experts and stakeholders.

For systems engineering trade-off analyses, top-level stakeholder value often includes competing objectives of performance, development schedule, life cycle costs, and long-term viability. For corporate decisions, shareholder value would be added to this list. With the top-level objectives set, lower levels of the objectives hierarchy should be discovered. For performance-related objectives, it is often helpful to work through a functional decomposition (usually done as part of the requirements and architectural design processes) of the system of interest to generate a thorough set of potential objectives. Start by identifying inputs and outputs of the system of interest and craft a succinct top-level functional statement about what the system of interest does, identifying the action performed by the system of interest to transform the inputs into outputs. Test this initial list of fundamental objectives for key properties by checking that each fundamental objective is essential and controllable and that the set of fundamental objectives is complete, nonredundant, concise, specific, and understandable (Edwards et al. 2007). See Figure 5.4 for a list of the key properties of a set of fundamental objectives.


Figure 5.4 Key properties of a high-quality set of fundamental objectives

Beyond these best practices, the creation of fundamental objectives is as much an art as it is a science. This part of the decision process clearly involves subjectivity. It is important to note, however, that a subjective process is not synonymous with an arbitrary or a capricious process. As Keeney points out,

Subjective aspects are a critical part of decisions. Defining what the decision is and coming up with a list of objectives, based on one's values, and a set of alternatives are by nature subjective processes. You cannot think about a decision, let alone analyze one, without addressing these elements. Hence, one cannot even think about a decision without incorporating subjective aspects

(Keeney 2004)

The output of this process step takes on the form of a fundamental objectives hierarchy as illustrated in Figure 5.5.


Figure 5.5 Example of an objectives hierarchy

For completeness, it is often helpful to build a crosswalk between the objectives hierarchy and any stakeholder need statements or capability gap lists, as illustrated in Table 5.2. This activity helps ensure that all stakeholder need statements or capability gaps have been covered by at least one objective. It also aids in identifying objectives that do not directly trace to an expressed need. Such objectives will need additional explanation to justify their inclusion in the hierarchy. It should be noted, however, that it is common to include such objectives in a hierarchy because stakeholders are often silent about needs that are currently satisfied by the incumbent system, yet they would not be happy if, in an effort to fill a perceived need, the new system created a new gap: for example, up-armored vehicles that become unreliable or insufficiently mobile.

Table 5.2 Crosswalk Between Fundamental Objectives and Stakeholder Need Statements

Objective 1 Objective 2 Objective 3 Objective 4
OBJ 1.1 OBJ 1.2 OBJ 1.3
OBJ 1.1 OBJ 1.2 OBJ 1.3 OBJ 2.1 OBJ 2.2 OBJ 3.1 OBJ 3.2 OBJ 3.3
Capability Gap 1 x
Capability Gap 2 x
Capability Gap 3 x
Capability Gap 4 x
Capability Gap 5 x x
Business Need 1 x
Business Need 2 x
Business Need 3 x

For each fundamental objective, a measure (also known as an attribute, criterion, or metric) must be established so that alternatives that more fully satisfy the objective receive a better score on the measure than alternatives that satisfy the objective to a lesser degree. Table 5.3 illustrates this one-to-one mapping of objective and measure.

Table 5.3 Illustrating the One-to-One Mapping of Objective and Measure

Stakeholder value Objective 1 Objective 1.1 Objective 1.1.1 Measure 1.1.1
Objective 1.1.2 Measure 1.1.2
Objective 1.1.3 Measure 1.1.3
Objective 1.2 Objective 1.2.1 Measure 1.2.1
Objective 1.2.2 Measure 1.2.2
Objective 1.3 Objective 1.3.1 Measure 1.3.1
Objective 1.3.2 Measure 1.3.2
Objective 1.3.3 Measure 1.3.3
Objective 2 Measure 2
Objective 3 Measure 3
Objective 4 Measure 4

A measure should be unambiguous, comprehensive, direct, operational, and understandable (Keeney & Gregory 2005). Table 5.4 defines these properties of a high-quality measure.

Table 5.4 Properties of a High-Quality Measure

Property Definition
Unambiguous A clear relationship exists between consequences and descriptions of consequences using the measure.
Comprehensive The attribute levels cover the range of possible consequences for the corresponding objective, and value judgments implicit in the attribute are reasonable.
Direct The measure levels directly describe the consequences of interest.
Operational In practice, information to describe consequences can be obtained and value trade-offs can reasonably be made.
Understandable Consequences and value trade-offs made using the measure can readily be understood and clearly communicated.

Source: Data from Keeney & Gregory 2005.

Keeney has identified three types of measures – natural measures (kilometers, degrees, probability, seconds, etc.), constructed measures (Dow Jones Industrial Average, Heat Index, Consumer Price Index, etc.), and proxy measures (usually a natural measure of a consequence that is thought to be correlated with the consequence of interest). Keeney recommends natural measures whenever possible since they tend to be commonly used and readily understandable. When a natural measure is not available to describe a consequence, he recommends a constructed measure that directly describes the consequence of interest. If neither a natural measure nor a constructed measure is practical, then a proxy measure using a natural scale is often workable, although by definition it is an indirect measure as it does not directly describe the consequence of interest.

A defining feature of Multiobjective Decision Analysis (also called multiattribute value theory) is the transformation from measure space to value space that enables mathematical representation of a composite value score across multiple measures. This transformation is performed through the use of a value function. Value functions describe returns to scale on the measure. In other words, value functions describe the degree of satisfaction stakeholders perceive at each point along the measure scale.

There are several techniques available to elicit value functions and priority weightings from stakeholders. One of the more popular techniques used in marketing circles is Conjoint Analysis (Green et al., 2001) where stakeholders are asked to make a series of pairwise comparisons between hypothetical products. For decisions that involve fewer than eight competing objectives, Conjoint Analysis is a compelling technique for generating representative value schemes. As an objectives hierarchy grows beyond eight objectives, the number of pairwise comparisons required to formulate the representative value schemes balloons to an unreasonable number. Since many complex systems engineering decision tasks tend to involve the balancing of 25 to 35 objectives, the value scheme elicitation approach described in this chapter is a direct interview value function formulation technique coupled with a swing weight matrix methodology for determining priority weightings.

When creating a value function, one first ascertains whether stakeholders believe there is a walk-away point on the objective measure scale (x-axis) and, if so, maps it to a value of 0 on the value scale (y-axis). A walk-away point is defined as the measure score at which, regardless of how well an alternative performs on other measures, the decision-maker will walk away from the alternative. Working with the stakeholder, find the measure score beyond which an alternative provides no additional value, label it the “meaningful limit” (also called the ideal), and map it to 100 (1 and 10 are also common scales) on the value scale (y-axis). If the returns to scale are linear, connect the walk-away value point to the meaningful-limit value point with a straight line. If there is reason to believe stakeholder value behaves with nonlinear returns to scale, pick appropriate inflection points and draw the curve. The rationale for the shape of the value functions should be documented for traceability and defensibility (Parnell et al., 2011). Figure 5.6 provides examples of some common value function shapes.


Figure 5.6 Value function examples

Practice suggests that eliciting two end points and three inflection points provides informative value functions without overtaxing the systems engineer or stakeholders; the five points are listed below, followed by a sketch of a value function built from them.

  • Walk Away: Stakeholder will dismiss an alternative if it fails to meet at least this level regardless of how it performs on other value measures (1 point for meeting, 0 points if missed).
  • Marginally Acceptable: Stakeholder begins to become interested, and beyond this point the perceived value increases rapidly (10 points).
  • Target: Desired level (50 points).
  • Stretch Goal: Improving beyond this point is considered gold plating, so there is very little available value between this point and meaningful limit (90 points).
  • Meaningful Limit: Theoretical limit or known practical limit beyond which would be considered unrealistic (100 points).
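As an illustrative aside, the following minimal sketch shows one way to encode such a value function as piecewise-linear interpolation between the five elicited points. The measure, the breakpoints, and the 0–100 value scale are assumptions for illustration only; they loosely follow the "reach distant areas of interest" column of Table 5.6 rather than any authoritative elicitation.

```python
import numpy as np

# Minimal sketch of a piecewise-linear value function built from the five
# elicited points described above.  The measure (maximum operational range
# in km, more is better) and its breakpoints are illustrative.
breakpoints = {          # measure score -> value (0-100 scale)
    10: 1,               # walk-away
    15: 10,              # marginally acceptable
    30: 50,              # target
    45: 90,              # stretch goal
    50: 100,             # meaningful limit
}

def value(score):
    """Interpolate value linearly between elicited points; 0 below the
    walk-away point, no added value beyond the meaningful limit."""
    xs, ys = zip(*sorted(breakpoints.items()))
    if score < xs[0]:
        return 0.0
    return float(np.interp(score, xs, ys))

print(value(8), value(20), value(38), value(60))   # 0.0, ~23.3, ~71.3, 100.0
```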

In an effort to capture the voice of the customer, systems engineers will often ask a stakeholder focus group to prioritize their requirements. As Keeney puts it,

Most important decisions involve multiple objectives, and usually with multiple-objective decisions, you can't have it all. You will have to accept less achievement in terms of some objectives in order to achieve more on other objectives. But how much less would you accept to achieve how much more?

(Keeney 2002)

The mathematics of Multiobjective Decision Analysis (MODA) requires that the weights depend on both the importance of the preferentially independent measure and the range of the measure (walk-away to stretch goal or ideal). A useful tool for determining weightings is the swing weight matrix (Parnell et al., 2011). For each measure, consider its importance by determining whether the measure corresponds to a defining capability, a critical capability, or an enabling capability; consider the variation of the measure range in terms of the gap between the current capability and the desired capability; and place the name of the measure in the appropriate cell of the matrix. Swing weights are then assigned to each measure according to the required relationship rules described in Figure 5.7 and converted to measure weights by normalizing such that the set sums to 1. For the purposes of swing weight matrix use, consider a defining capability to be one that directly traces to a verb/noun pair identified at the top-level (level 0) functional definition of the system of interest – the reason why the system exists. Consider enabling capabilities to trace to functions that are clearly not the reason why the system exists but allow the core functions to be executed more fully. Let critical capabilities be those that are more than enabling but not quite defining.


Figure 5.7 Swing weight matrix
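The short sketch below illustrates the final normalization step: raw swing weights assigned from the matrix are divided by their sum so that the resulting measure weights sum to 1. The measures and raw swing weight values are hypothetical placeholders, not those of Figure 5.7.

```python
# Minimal sketch of converting swing weights to normalized measure weights.
# The measures and raw swing weights below are hypothetical.
swing_weights = {
    "Dwell at Area of Interest":        100,   # defining capability, large gap
    "Collect Imagery at Night":          80,
    "Max Flight Speed":                  45,
    "Avoid Impeding Soldier Sprint":     20,   # enabling capability, small gap
}

total = sum(swing_weights.values())
weights = {measure: w / total for measure, w in swing_weights.items()}  # sums to 1.0

for measure, w in weights.items():
    print(f"{measure}: {w:.3f}")
```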

All decisions involve elements of subjectivity; the distinctive feature of a formal decision management process is that these subjective elements are rigorously documented so that their consequences can be identified and assessed. Toward this end, it is considered good practice to document the measure, the value function, and the priority weighting, along with the associated rationale, for each fundamental objective.

5.3.2.1 Example of Developing Objectives and Measures

Returning to the sUAV case study example, we find that after some quality time with many of the stakeholders and hours digging through white papers, memos, and presentations relevant to the problem statement, the lead systems engineer spearheaded a requirements analysis effort that included a functional decomposition. This exercise helped the systems engineering trade-off study team understand and articulate what the system of interest is expected to “do,” which in turn enabled them to construct the functional performance objectives in the objectives hierarchy shown in Figure 5.8. Note that stakeholder value is measured in terms of not only functional performance but also life cycle costs and development schedule. (Although not addressed here due to space considerations, the notion of long-term viability fits well within an objectives hierarchy such as this.)


Figure 5.8 Objectives hierarchy for sUAV example

With the objectives hierarchy in hand, the study team identified measures for each of the functional performance objectives as shown in Table 5.5.

Table 5.5 Measures for sUAV Example

1.1 Be Soldier Transportable 1.1.1 Avoid Impeding Soldier Endurance Measure: % decrease in sustainable march speed
1.1.2 Avoid Impeding Soldier Sprint Measure: % increase in soldier sprint time
1.1.3 Avoid Impeding Soldier Jump Measure: % degradation in soldier jump height
1.2 Maneuver to and Dwell at Area of Interest 1.2.1 Reach Areas of Interest Quickly Measure: Max flight speed (km/hour)
1.2.2 Reach Distant Areas of Interest Measure: Maximum operational range (km)
1.2.3 Dwell @ Area of Interest for Extended Periods Measure: Operational Endurance (hours)
1.3 Collect ISR Info 1.3.1 Be Responsive to a Variety of ISR Data Requests Measure: ISR Data Request Responsiveness Index
1.3.2 Collect High-Quality Imagery During Daytime Measure: TTP rating per NV-IPM @ 3000m full light
1.3.3. Collect High-Quality Imagery at Night Measure: TTP rating per NV-IPM @ 3000m low light
1.3.4 Collect High-Quality Imagery in Obscured Env. Measure: TTP rating per NV-IPM @ 3000m w/ smoke
1.4 Securely Exchange Info w/ Command Station 1.4.1 Exchange Info Across Terrains & Geometries Measure: BLOS comms capable (yes/no)
1.4.2 Send Large Volumes of Data Quickly & Reliably Measure: High data rate payload comm link? (Y/N)
1.4.3 Avoid Spoofing, Jamming, Intercept Measure: Digital C2 link? (Y/N) Digital Payload Com link? (Y/N)
1.5 Be Recoverable & Tamper Resistant 1.5.1 Enable High Probability of Recovery Measure: Subjective assessment of landing scheme
1.5.2 Render System Useless Upon Enemy Capture Measure: Command self destruct feature? (Y/N)

Not shown in Table 5.5 are the measures for the Life Cycle Cost objective and the Development Schedule objective; these two measures are discussed here. The life cycle cost measure for this hypothetical case study is the sum of the rough order of magnitude estimates for development costs, procurement costs, training costs, maintenance costs, and wartime costs. Schedule duration for this exercise is measured as the number of years the development effort requires, estimated at the 80% confidence level after considering the uncertainty in the duration estimates for each configuration item to mature from its current state to form, fit, and function tested across temperatures, plus the estimated time required for system-level integration and test.
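As a hedged illustration of the 80% confidence schedule estimate described above, the sketch below runs a simple Monte Carlo simulation over triangular duration distributions. The configuration items, their (low, mode, high) estimates, and the assumption that items mature in parallel (so the critical path is the longest item followed by system-level integration and test) are all illustrative and not taken from the case study.

```python
import random

# Hedged Monte Carlo sketch of an 80%-confidence schedule estimate.
# Item names and (low, mode, high) durations in years are hypothetical, and
# the parallel-maturation assumption is an illustration, not the study's model.
config_items = {
    "ISR payload": (1.0, 1.5, 3.0),
    "Comm link":   (0.5, 1.0, 2.5),
    "Airframe":    (1.0, 2.0, 3.5),
}
integration_and_test = (0.5, 1.0, 2.0)

def draw(tri):
    low, mode, high = tri
    return random.triangular(low, high, mode)

samples = []
for _ in range(10_000):
    maturation = max(draw(tri) for tri in config_items.values())  # parallel items
    samples.append(maturation + draw(integration_and_test))       # then I&T in series

samples.sort()
p80 = samples[int(0.8 * len(samples))]
print(f"Schedule duration at 80% confidence: {p80:.1f} years")
```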

With a good understanding of how each objective is to be measured, the lead systems engineer worked to understand the degree of satisfaction that each stakeholder perceives at each point along a particular measure scale and then expressed these relationships as a set of value functions. To accomplish this, the lead systems engineer of the sUAV effort worked with a small group of stakeholders to document a walk-away point, a marginally acceptable point, a target point, a stretch goal, and a meaningful limit. The lead systems engineer repeated this process for several stakeholder groups in order to capture any differences among value schemes. The lead systems engineer maintained a record of each set of value functions so that he may use them as part of the sensitivity analysis later in the process. Table 5.6 describes the value functions associated with the 17 measures of this hypothetical sUAV case study as constructed by one of the stakeholder groups. Notice that life cycle cost measures and schedule duration measures are not included in Table 5.6 due to space considerations. Figure 5.9 provides a graphical view of 3 of the 17 value functions.

Table 5.6 End and Inflection Points of sUAV Value Functions

Name Value Functional Performance
Be Transported Fly Collect Communicate End
Avoid Impeding Soldier Endurance Avoid Impeding Soldier Sprint Avoid Impeding Soldier Jump Reach Area of Interest (10 km) Quickly Reach Distant Areas of Interest Dwell at Area of Interest Be Responsive to a Variety of ISR Data Requests Collect High Quality Imagery During Day Collect High Quality Imagery During Night Collect High Quality Imagery In Obscured Environments Exchange Info Across Various Terrains & Geometries Send ISR Imagery Quickly and Reliably Avoid Spoofing, Jamming, or Communicate Intercept Enable High Probability of Recovery Render System Useless Upon Enemy Capture
% % % min km hrs VI Pd Pd Pd L/B sec A/D % y/n
Walk-away 1 10 15 20 15 10 1 1 0.1 0.1 0.1 90 90
Marginally Acceptable 10 8 12 16 12 15 2 2 0.2 0.2 0.2 L 30 A 92 N
Target 50 5 7.5 10 8 30 4 5 0.7 0.7 0.7 10 95
Stretch goal 90 2 3 4 4 45 8 8 0.9 0.9 0.9 B 5 D 98 Y
Meaningful limit 100 0 0 0 1 50 16 10 1.0 1.0 1.0 1 100

Figure 5.9 Graphical representations of value function graphs for sUAV example

To complete his understanding of stakeholder value, the lead systems engineer set out to identify the objectives for which the stakeholders were willing to accept lower returns in order to achieve higher returns on others. By working through a swing weight matrix with each stakeholder group, the lead systems engineer identified weightings for each measure. The normalized weights for one of the stakeholder groups are depicted in Figure 5.10. Weights developed by other stakeholder groups were also documented and maintained for use in the sensitivity analysis of later steps.


Figure 5.10 Weights for sUAV example

With the value schemes of stakeholders having been captured, the lead systems engineer knows how goodness will be measured for each alternative considered within the systems engineering trade-off analysis. The lead systems engineer can now turn his attention to generating sUAV system alternatives. The next section of this chapter describes best practices for generating creative alternatives and is followed by the continuation of the sUAV case study.

5.3.3 Generate Creative Alternatives

For many trade studies, the alternatives will be systems composed of many interrelated subsystems. It is important to establish a meaningful product structure for the system of interest and to apply this product structure consistently throughout the decision process effort in order to aid effectiveness and efficiency of communications about alternatives. The product structure should be a useful decomposition of the physical elements of the system of interest.

Each alternative is composed of specific design choices for each generic product structure element. The ability to quickly and accurately communicate the differentiating design features of given alternatives is a core element of the decision-making exercise. Tables 5.8–5.10 provide a template for succinct yet complete system-level alternative descriptions. These subsystem design choices have system-level consequences across the objectives hierarchy: every subsystem design choice will impact system-level cost, system-level development schedule, and system-level performance. It is important to emphasize that these design choices are not fundamental objectives; they are means objectives, important only to the degree that they assist in achieving fundamental objectives. It may be useful to think of design choices as the levers used by the system architect to steer the system design toward a solution that best satisfies all elements of stakeholder value – the full fundamental objectives hierarchy. These levers are very important, and care should be given in this step of the process to clearly and completely identify specific design choices for each generic product structure element for every alternative being considered. Incomplete or ambiguous alternative descriptions can lead to incorrect or inconsistent alternative assessments later in the process.
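One lightweight way to keep alternative descriptions complete and consistent is to record each alternative as a mapping from generic product structure element to the specific design choice selected, and to check that no element is left unspecified. The sketch below uses a handful of Buzzard I's design choices from Table 5.8; the abbreviated product structure is illustrative.

```python
# Minimal sketch of recording an alternative as a specific design choice for
# each generic product-structure element (the format of Tables 5.8-5.10).
# The abbreviated product structure shown is illustrative.
product_structure = ["Propulsion System", "Energy Source", "Wing Span",
                     "EO Imager", "Payload Data Link"]

buzzard_i = {
    "Propulsion System": "Electric 300W",
    "Energy Source":     "Li-Ion Battery",
    "Wing Span":         "5 ft",
    "EO Imager":         "4 MP",
    "Payload Data Link": "Fixed VHF",
}

# Guard against incomplete or ambiguous alternative descriptions: every
# element of the product structure must have exactly one design choice.
missing = [element for element in product_structure if element not in buzzard_i]
assert not missing, f"Incomplete alternative description: {missing}"
```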

System characteristics are interesting consequences of subsystem design choices measured at the system level, but they are not themselves fundamental objectives. For example, the weight of a system alternative is classified as a system characteristic in that weight is neither a design decision nor a fundamental objective. Weight is not a design decision but rather a consequence of all the subsystem design choices made throughout the product structure. Weight is not a fundamental objective because it is not inherently good or bad, although it does factor into many fundamental objective measures. Documenting characteristics is considered a best practice for two reasons:

  1. if senior stakeholders are known to ask about certain aspects of various alternatives, the lead systems engineer should be prepared to provide an immediate answer, and
  2. some system characteristics are part of so many objective measures that it is useful to report them at the same level as system design choices for the sake of clarity and explanatory power.

5.3.3.1 Example of Generating Creative Alternatives

Returning to our sUAV example, the lead systems engineer has identified the following four top-level elements of the sUAV physical architecture: the Air Vehicle, the ISR Collecting Payload, the Communication Links, and the Ground Elements. The sUAV Physical Architecture Description in Table 5.7 decomposes the four top-level physical elements into generic subelements and also provides a list of specific design choices available for each generic subelement of the physical architecture.

Table 5.7 sUAV Physical Architecture Description

Air Vehicle
Propulsion System Energy Source Prop Size & Location Wing Span Wing Config. Fin Config. Actuators Airframe Material Autopilot Launch Land
Electric 300W Li-Ion Battery 18” Rear 4 ft Conv. Twin Boom Conv. Electro-magnetic Graphite Epoxy Preprogram, Auto Hand Skid and Belly
Electric 600W Li-S Battery 22” Rear 5 ft Canard Inverted V Hydraulic Aramid Epoxy Semiauto Tensioned Line Net
Piston Engine 2.5HP Fuel Cell 26” Rear 6 ft Tandem Wing V Tail MEMS Boron Epoxy Remotely Piloted Gun Launch Parachute
Piston Engine 4.0HP Solar 18” Front 7 ft Three Surface H Tail Fiberglass Epoxy Deep Stall
JP-8 Fuel 22” Front 8 ft Cruciform
26” Front 9 ft
ISR Collecting Payload Communication Links Ground Elements
Sensor Actuation EO Imager IR Imager Command and Control Link Payload Data Link Antenna Computer User Input Device Power
Fixed None None Small Fixed Antenna transmits analog data direct to GCS (VHF or UHF) Small Fixed Antenna transmits analog data direct to GCS (VHF or UHF) Dipole Ruggedized Laptop Keyboard Generator
Pan–tilt 4 Megapixel Daylight Camera Cooled 320 × 240 MWIR Small, Fixed, Nonpointing Antenna transmits digital data to LEO Satellite (L Band) Small, Fixed, Nonpointing Antenna transmits digital data to LEO Satellite (L Band) Parabolic Reflector Wearable Computer Joystick Battery + Generator
Roll–tilt 8 Megapixel Daylight Camera Cooled, 640 × 480 MWIR Mech. Steerable parabolic dish transmits digital data to GEO Satellite (Ka or Ku Band) Smartphone Touchscreen Battery + Backup Batteries
Pan–tilt–roll Cooled 1280 × 720 MWIR & LWIR Electronically Steered Phased Array Antenna transmits digital data to GEO Satellite (Ka or Ku Band) Stylus
Uncooled 1024 × 768 MWIR & LWIR

Using the Physical Architecture Description in Table 5.7, the lead systems engineer and his study team developed the 12 system-level sUAV concepts described in Tables 5.8–5.10. The team used the table format for describing the alternatives to ensure that each system would be described completely, succinctly, and consistently to aid in the efficiency and effectiveness of communications throughout the trade study.

Table 5.8 Descriptions for Buzzard I, Buzzard II, Cardinal I, and Cardinal II

1 2 3 4
Buzzard I Buzzard II Cardinal I Cardinal II
image image image image
Subsystem/Component Design Choice Design Choice Design Choice Design Choice
Air vehicle
Propulsion System Electric 300W Electric 300W Electric 300W Electric 300W
Energy Source Li-Ion Battery Li-Ion Battery Li-S Battery Li-S Battery
Prop Size and Location 18” Rear 18” Rear 20” Rear 20” Rear
Wing Span 5' 5' 6' 6'
Wing Configuration Canard Canard Conventional Conventional
Fin Configuration Inverted V Inverted V Twin Boom Twin Boom
Actuators Electromagnetic Electromagnetic Electromagnetic Electromagnetic
Airframe Material Graphite Epoxy Graphite Epoxy Graphite Epoxy Graphite Epoxy
Autopilot Semiauto Semiauto Remotely Piloted Remotely Piloted
Launch Mechanism Hand Hand Hand Hand
Landing Mechanism Belly Belly Belly Belly
ISR Collecting Payload
Sensor Actuation Fixed Fixed Fixed Fixed
EO Imager 4 MP 4 MP 4 MP 4 MP
IR Imager 320 × 240 MWIR 640 × 480 MWIR 320 × 240 MWIR 640 × 480 MWIR
Communication Links
Command and Control Link Fixed VHF Fixed VHF Fixed VHF Fixed VHF
Payload Data Link Fixed VHF Fixed VHF Fixed VHF Fixed VHF
Ground Elements
Antenna Dipole Dipole Dipole Dipole
Computer Laptop Laptop Smartphone Smartphone
User Input Device Keyboard Keyboard Joystick Joystick
Power Battery + Spare Battery + Spare Battery + Spare Battery + Spare

Table 5.9 Descriptions for Crow I, Crow II, Pigeon I, and Pigeon II

5 6 7 8
Crow I Crow II Pigeon I Pigeon II
image image image image
Subsystem/Component Design Choice Design Choice Design Choice Design Choice
Air Vehicle
Propulsion System Electric 600W Electric 600W Electric 600W Electric 600W
Energy Source Li-Ion Battery Li-Ion Battery Li-S Battery Li-S Battery
Prop Size and Location 22” Rear 22” Rear 20” Rear 20” Rear
Wing Span 6' 6' 6' 6'
Wing Configuration Tandem Wing Tandem Wing Conventional Conventional
Fin Configuration V Tail V Tail Twin Boom Twin Boom
Actuators MEMS MEMS Electromagnetic Electromagnetic
Airframe Material Graphite Epoxy Graphite Epoxy Graphite Epoxy Graphite Epoxy
Autopilot Semiauto Semiauto Remotely Piloted Remotely Piloted
Launch Mechanism Hand Hand Hand Hand
Landing Mechanism Belly Belly Belly Belly
ISR Collecting Payload
Sensor Actuation Pan–tilt Pan–tilt Pan–tilt Pan–tilt
EO Imager 8 MP 8 MP 8 MP 8 MP
IR Imager 1280 × 720 MWIR & LWIR cooled 1280 × 720 MWIR & LWIR uncooled 1280 × 720 MWIR & LWIR cooled 1280 × 720 MWIR & LWIR uncooled
Communication Links
Command and Control Link Fixed VHF Fixed VHF Fixed VHF Fixed VHF
Payload Data Link Fixed VHF Fixed VHF Phased Array Ka Phased Array Ka
Ground Elements
Antenna Dipole Dipole Dipole & Dish Dipole & Dish
Computer Laptop Laptop Laptop Laptop
User Input Device Keyboard Keyboard Joystick Joystick
Power Battery + Spare Battery + Spare Battery + Spare Battery + Spare

Table 5.10 Descriptions for Robin I, Robin II, Dove I, and Dove II

9 10 11 12
Robin I Robin II Dove I Dove II
image image image image
Subsystem/Component Design Choice Design Choice Design Choice Design Choice
Air Vehicle
Propulsion System Piston 2.5 HP Piston 2.5 HP Piston 4.0 HP Piston 4.0 HP
Energy Source JP-8 JP-8 JP-8 JP-8
Prop Size and Location 26” Front 26” Front 28” Front 28” Front
Wing Span 8' 8' 9' 9'
Wing Configuration Conventional Conventional Conventional Conventional
Fin Configuration H Tail H Tail Cruciform Cruciform
Actuators Hydraulic Hydraulic Hydraulic Hydraulic
Airframe Material Fiberglass Epoxy Fiberglass Epoxy Fiberglass Epoxy Fiberglass Epoxy
Autopilot Remotely Piloted Remotely Piloted Remotely Piloted Remotely Piloted
Launch Mechanism Tensioned Line Tensioned Line Tensioned Line Tensioned Line
Landing Mechanism Net Net Net Net
ISR Collecting Payload
Sensor Actuation Pan–tilt Pan–tilt Pan–tilt Pan–tilt
EO Imager 8 MP 8 MP 8 MP 8 MP
IR Imager 1280 × 720 MWIR & LWIR cooled 1280 × 720 MWIR & LWIR uncooled 1280 × 720 MWIR & LWIR cooled 1280 × 720 MWIR & LWIR uncooled
Communication Links
Command and Control Link Fixed VHF Fixed VHF Fixed VHF Fixed VHF
Payload Data Link Elect. Steered Phased Array Ka Elect. Steered Phased Array Ka Mech. Steered Dish Ka Mech. Steered Dish Ka
Ground Elements
Antenna Dipole Dipole Dipole and Dish Dipole and Dish
Computer Laptop Laptop Laptop Laptop
User Input Device Keyboard Keyboard Joystick Joystick
Power Battery + Gen. Battery + Gen. Battery + Gen. Battery + Gen.

The lead systems engineer knows that with the alternatives defined, he is ready to start collecting data regarding each system's response to the measures established earlier in the process. The next section of this chapter walks through some best practices with regard to assessing alternatives via deterministic analysis followed by the continuation of this sUAV example.

5.3.4 Assess Alternatives via Deterministic Analysis

With objectives and measures established and alternatives identified and defined, the decision team should engage subject matter experts, ideally equipped with operational data, test data, models, simulations, and expert knowledge. Often, a mapping of physical architecture elements to fundamental objectives may help identify the types of subject matter expertise needed to fully assess each alternative against a particular objective (Table 5.11 – Physical Architecture to Fundamental Objective Mapping). These simple maps often grow into more complex flow diagrams showing the interrelationships between physical architecture choices, different levels of intermediate measures, and, finally, the fundamental objectives.

Table 5.11 Physical Architecture to Fundamental Objective Mapping

Objective 1 Objective 2 Objective 3 Objective 4
OBJ 1.1 OBJ 1.2 OBJ 1.3
OBJ 1.1.1 OBJ 1.1.2 OBJ 1.1.3 OBJ 1.2.1 OBJ 1.2.2 OBJ 1.3.1 OBJ 1.3.2 OBJ 1.3.3
Subsystem A x x x x
Subsystem B x x x x
Subsystem C x x x x
Subsystem D x x x x
Subsystem E x x x x x
Subsystem F x x x
Subsystem G x x x
Subsystem H x x x
Subsystem I x x x
Subsystem J x x x x

It may be helpful to expand these simple maps into more informative Assessment Flow Diagrams (AFDs) that trace the relationships between physical means, intermediate measures, and fundamental objectives. As an example of such a diagram, consider the sample provided in Figure 5.11. An Assessment Flow Diagram helps individual subject matter experts understand how their area of expertise fits into the larger assessment picture, where the inputs that feed their particular model will come from, and where their outputs will be consumed. An AFD can be used by the lead systems engineer to organize, manage, and track assessment activities, especially when used in conjunction with the structured scoring sheet and consequence scorecard shown in Table 5.12 and Table 5.13.


Figure 5.11 Assessment flow diagram (AFD) for a hypothetical gun design choice activity (lead author's original graphic)

Table 5.12 Structured Scoring Sheet for a Given Measure

Detailed Description of Measure:
Alternative Description Assessment
ID Name Image Subsystem A Subsystem B Subsystem C Subsystem D Estimate (Low, Expected, High) Rationale
1 Descriptive Name for Alt #1 Illustration for Alt #1
2 Descriptive Name for Alt #2 Illustration for Alt #2
3 Descriptive Name for Alt #3 Illustration for Alt #3
4 Descriptive Name for Alt #4 Illustration for Alt #4
5 Descriptive Name for Alt #5 Illustration for Alt #5
6 Descriptive Name for Alt #6 Illustration for Alt #6

Table 5.13 Consequence Scorecard Structure

ID Name Image Objective 1 Objective 2 Objective 3 Objective 4
OBJ 1.1 OBJ 1.2 OBJ 1.3
OBJ 1.1.1 OBJ 1.1.2 OBJ 1.1.3 OBJ 1.2.1 OBJ 1.2.2 OBJ 1.3.1 OBJ 1.3.2 OBJ 1.3.3
1 Descriptive Name for Alt #1 Illustration for Alt #1 x1,1.1.1 x1,1.1.2 x1,1.1.3 x1,1.2.1 x1,1.2.2 x1,1.3.1 x1,1.3.2 x1,1.3.3 x1,2 x1,3 x1,4
2 Descriptive Name for Alt #2 Illustration for Alt #2 x2,1.1.1 x2,1.1.2 x2,1.1.3 x2,1.2.1 x2,1.2.2 x2,1.3.1 x2,1.3.2 x2,1.3.3 x2,2 x2,3 x2,4
3 Descriptive Name for Alt #3 Illustration for Alt #3 x3,1.1.1 x3,1.1.2 x3,1.1.3 x3,1.2.1 x3,1.2.2 x3,1.3.1 x3,1.3.2 x3,1.3.3 x3,2 x3,3 x3,4
4 Descriptive Name for Alt #4 Illustration for Alt #4 x4,1.1.1 x4,1.1.2 x4,1.1.3 x4,1.2.1 x4,1.2.2 x4,1.3.1 x4,1.3.2 x4,1.3.3 x4,2 x4,3 x4,4
5 Descriptive Name for Alt #5 Illustration for Alt #5 x5,1.1.1 x5,1.1.2 x5,1.1.3 x5,1.2.1 x5,1.2.2 x5,1.3.1 x5,1.3.2 x5,1.3.3 x5,2 x5,3 x5,4
6 Descriptive Name for Alt #6 Illustration for Alt #6 x6,1.1.1 x6,1.1.2 x6,1.1.3 x6,1.2.1 x6,1.2.2 x6,1.3.1 x6,1.3.2 x6,1.3.3 x6,2 x6,3 x6,4

In addition to the organization and communication benefits, an AFD seems to provide some psychological benefits to the subject matter experts (SMEs) conducting the assessments and to the stakeholders hoping to make use of the results. An AFD gives an SME confidence that their analysis will not be ignored; it sends the message that their expertise is important and needed and that their results will find their way to the decision table in proper context, as a piece of the whole assessed in terms meaningful to the stakeholder. Similarly, by showing the pedigree of the data feeding the decision support model, an AFD gives the stakeholders confidence that the trade-off analysis rests on a solid foundation and is not the product of generalists sitting around the table voting.

The decision team can prepare for subject matter expert engagement by creating structured scoring sheets. Assessments of each concept against each criterion can be captured on separate structured scoring sheets for each alternative/measure combination. Each score sheet contains a summary description of the alternative under examination and a summary of the scoring criteria against which it is being measured. The structured scoring sheet should contain ample room for the evaluator to document the assessed score for the particular concept against the measure, followed by a clear discussion providing the rationale for the score, noting how design features of the concept under evaluation led to the score as described in the rating criteria. Whenever possible, references to operational data, test data, calculations, models, simulations, analogies, or experience that led to a particular score should be documented.

Creating separate structured scoring sheets for each alternative/measure combination may become somewhat cumbersome for large studies. In practice, a separate structured scoring sheet is often constructed for each measure only and alternatives are identified as separate rows within each measure sheet. Table 5.12 provides a sample format for such a scoring sheet. This approach has the added benefit of reinforcing the notion that each alternative is assessed using the same measure and reducing the risk of inconsistent assessments.

After all the structured scoring sheets have been completed for each alternative/measure combination, it is useful to summarize all the data in tabular form. Each column in such a table would represent a measure and each row would represent a particular alternative. Table 5.13 provides a sample structure identified here as a consequences scorecard.
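A minimal sketch of that roll-up follows: completed measure-keyed scoring sheets are pivoted into an alternative-keyed consequence scorecard with one row per alternative and one column per measure. The two measures and the expected scores are copied from Tables 5.5 and 5.14 for illustration; a real study would also carry the low/high estimates and rationale from each sheet.

```python
# Minimal sketch of rolling completed scoring sheets up into a consequence
# scorecard: one row per alternative, one column per measure.  Alternative
# names and expected scores are copied from Table 5.14 for illustration.
scoring_sheets = {
    "Operational Endurance (hrs)":                 {"Buzzard I": 1.0, "Cardinal I": 1.5, "Crow I": 5.0},
    "Time to Reach Area of Interest at 10 km (min)": {"Buzzard I": 15, "Cardinal I": 12, "Crow I": 10},
}

# Pivot measure-keyed sheets into an alternative-keyed scorecard.
consequence_scorecard = {}
for measure, scores in scoring_sheets.items():
    for alternative, score in scores.items():
        consequence_scorecard.setdefault(alternative, {})[measure] = score

for alternative, row in consequence_scorecard.items():
    print(alternative, row)
```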

5.3.4.1 Example of Assessing Alternatives via Deterministic Analysis

Continuing the sUAV case study example, the lead systems engineer recruited a team of subject matter experts to assess each alternative against the measures that align with their skill sets. For instance, human factors experts assessed the impact to soldier mobility and endurance, aerospace engineers assessed the measures pertaining to flight, electrical engineers specializing in sensors assessed the ISR data collection measures, communication engineers scored the information transmit and receive measures, and mechanical engineers rated each alternative against the recovery measures. Each subject matter expert was provided with a scoring sheet for recording their findings. The lead systems engineer took the findings from the scoring sheets and created the consequence scorecard shown in Table 5.14.

Table 5.14 Consequence Scorecard Example for sUAV Case Study

Measures (columns), grouped by function and listed with units:

Functional Performance
  Be Transported: M1 Avoid Impeding Soldier Endurance (%); M2 Avoid Impeding Soldier Sprint (%); M3 Avoid Impeding Soldier Jump (%)
  Fly: M4 Reach Area of Interest (10 km) Quickly (min); M5 Reach Distant Areas of Interest (km); M6 Dwell at Area of Interest (hrs)
  Collect: M7 Be Responsive to a Variety of ISR Data Requests (VI); M8 Collect High Quality Imagery During Day (Pd); M9 Collect High Quality Imagery During Night (Pd); M10 Collect High Quality Imagery in Obscured Environments (Pd)
  Communicate: M11 Exchange Info Across Various Terrains & Geometries (LB); M12 Send ISR Imagery Quickly and Reliably (sec); M13 Avoid Spoofing, Jamming, or Communication Intercept (AD)
  End: M14 Enable High Probability of Recovery (%); M15 Render System Useless Upon Enemy Capture (y/n)
In addition: M16 Life Cycle Costs ($B); M17 Development Schedule Duration (yrs)

Consequence scores (ID, Name, then M1-M17):
  1 Buzzard I: 1, 2, 3, 15, 10, 1, 2, 0.3, 0.2, 0.1, L, 30, A, 90, 90, 1, 2
  2 Buzzard II: 1, 2, 3, 15, 10, 1, 4, 0.4, 0.3, 0.2, L, 30, A, 90, 90, 1.5, 2
  3 Cardinal I: 2, 4, 5, 12, 12, 1.5, 5, 0.5, 0.4, 0.3, L, 90, A, 90, 90, 2.3, 3
  4 Cardinal II: 2, 4, 5, 12, 12, 1.5, 10, 0.5, 0.4, 0.3, L, 90, A, 90, 90, 4, 3
  5 Crow I: 4, 5, 8, 10, 15, 5, 10, 0.9, 0.8, 0.7, B, 20, D, 95, 90, 5, 9
  6 Crow II: 3, 4, 7, 9, 15, 6, 10, 0.8, 0.7, 0.6, B, 30, D, 95, 90, 6, 9
  7 Pigeon I: 5, 7, 10, 8, 18, 8, 10, 0.9, 0.8, 0.7, B, 20, D, 95, 90, 6.5, 7
  8 Pigeon II: 4, 6, 9, 7, 18, 9, 10, 0.8, 0.7, 0.6, B, 30, D, 95, 90, 7.5, 7
  9 Robin I: 7, 12, 17, 6, 22, 10, 10, 0.9, 0.8, 0.7, B, 20, D, 98, 90, 8.3, 6
  10 Robin II: 6, 11, 16, 6, 22, 11, 10, 0.8, 0.7, 0.6, B, 30, D, 98, 90, 8.8, 6
  11 Dove I: 10, 15, 20, 4, 30, 23, 10, 0.9, 0.8, 0.7, B, 20, D, 98, 90, 9.3, 5
  12 Dove II: 9, 14, 19, 4, 30, 24, 10, 0.8, 0.7, 0.6, B, 30, D, 98, 90, 9.9, 5

With 204 measurements (12 alternatives scored against 17 measures) taken, some would be tempted to claim success and call it a day, but the lead systems engineer for the sUAV trade-off analysis knew there was much to be done in order to fully mine this data for understanding and to communicate the findings and recommendations to the study sponsor in a way that would lead to action. Of course, making 204 measurements and recording them in a well-structured data store is no small task, and it is better than some of the trade-study products he had seen over his career, but the consequence scorecard alone is certainly not conducive to confident decision-making. The next section of this chapter discusses some of the best practices for synthesizing results for rapid and thorough understanding of the trade at hand followed by an application of these best practices to the sUAV case study.

5.3.5 Synthesize Results

At this point in the process, the decision team has generated a large amount of data as summarized in the consequences scorecard. Now it is time to explore the data and display results in a way that facilitates understanding. Transforming the data in the consequences scorecard into a value scorecard is accomplished through the use of the value functions developed in the decision analysis process step described earlier. Table 5.15 shows the structure of a value scorecard. In an effort to enhance speed and depth of comprehension of the value scorecard, consider associating increments on the value scale with a color according to heat map conventions. This view can be useful when trying to determine which objectives are causing a particular alternative trouble. In addition, one can use this view to quickly see if there are objectives for which no alternative scores well. From this view, the systems engineer can also see if there is at least one alternative that scores above the walk-away point for all objectives. If not, the solution set is empty and the decision team needs to generate additional alternatives or adjust objective measures.

Table 5.15 Value Scorecard Structure

Columns: ID; Name; Image; and, under Objective 1, the measures grouped by sub-objective:
  OBJ 1.1: OBJ 1.1.1, OBJ 1.1.2, OBJ 1.1.3
  OBJ 1.2: OBJ 1.2.1, OBJ 1.2.2
  OBJ 1.3: OBJ 1.3.1, OBJ 1.3.2, OBJ 1.3.3

Rows: one row per alternative k = 1, ..., 6, containing a descriptive name for Alt #k, an illustration for Alt #k, and the cell entries v1.1.1(xk,1.1.1), v1.1.2(xk,1.1.2), v1.1.3(xk,1.1.3), v1.2.1(xk,1.2.1), v1.2.2(xk,1.2.2), v1.3.1(xk,1.3.1), v1.3.2(xk,1.3.2), and v1.3.3(xk,1.3.3), where xk,m is alternative k's consequence score on measure m and vm(xk,m) is the corresponding single-dimensional value.
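
Each cell of the value scorecard is produced by passing the corresponding consequence score through that measure's value function. The Python sketch below shows one way to implement a piecewise-linear single-dimensional value function with a walk-away point and a stretch goal, plus a simple heat-map bucketing for the scorecard coloring; the anchor points, thresholds, and color cutoffs are illustrative assumptions, not values from the chapter.

```python
# A minimal sketch (Python, illustrative numbers) of a single-dimensional value
# function that transforms a consequence score into value on a 0-100 scale.
def piecewise_linear_value(score, anchors):
    """Interpolate value from (score, value) anchor points sorted by score."""
    xs = [a[0] for a in anchors]
    vs = [a[1] for a in anchors]
    if score <= xs[0]:
        return vs[0]
    if score >= xs[-1]:
        return vs[-1]
    for (x0, v0), (x1, v1) in zip(anchors, anchors[1:]):
        if x0 <= score <= x1:
            return v0 + (v1 - v0) * (score - x0) / (x1 - x0)

# Hypothetical "Dwell at Area of Interest (hrs)" value function: 1 hr is the
# assumed walk-away point (value 0), 24 hrs the assumed stretch goal (value 100).
dwell_anchors = [(1, 0), (5, 60), (10, 80), (24, 100)]

def heat_index(value):
    """Map a 0-100 value to a coarse heat-map bucket for the value scorecard."""
    return "red" if value < 33 else "yellow" if value < 67 else "green"

for hours in (1, 5, 8, 24):
    v = piecewise_linear_value(hours, dwell_anchors)
    print(hours, round(v, 1), heat_index(v))
```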

Radar graphs and tornado graphs (Figures 5.12 and 5.13) are popular visualization techniques for showing the same value data captured in the heat-indexed value scorecard discussed earlier, but usually for only two alternatives at a time.

Figure 5.12 Radar value graph structure

Figure 5.13 Tornado graph structure

5.3.5.1 Example of Synthesizing Results

Returning to the sUAV example, the lead systems engineer has transformed the consequence table into a value scorecard through the use of the value functions created in the second step of this process. Notice how the heat map conditional formatting of the scorecard makes the strengths and weaknesses of each alternative very apparent. It also quickly highlights objectives that are nondiscriminating and objectives that are difficult to achieve for any alternative.

The lead systems engineer was pleased with the value scorecard and rushed to show the emerging results to the study sponsor and several other stakeholders. The feedback he received was very positive and encouraging, but all asked for the cost/schedule/performance trade to be more explicitly shown. The next section in this chapter covers the development of multidimensional value models to create aggregated value visualizations followed by an application of these techniques to the sUAV case study.

5.3.6 Develop Multidimensional Value Model

Beyond the consequence scores for each alternative on each measure, all that was needed to construct the visualizations covered thus far (through the value scorecard in Table 5.16) were the value functions associated with each objective measure. By introducing a weighting scheme, the systems engineer can create aggregated value visualizations. The first step in assessing an alternative's aggregated value is a prescreen: any alternative that fails to meet the walk-away point on any objective measure has its aggregated value set to zero, regardless of how it performs on the other objective measures. For those alternatives that pass the walk-away prescreen, the additive value model2 uses the following equation to calculate each alternative's aggregated value:

v(x) = \sum_{i=1}^{n} w_i v_i(x_i)   (5.1)

where v(x) is the alternative's aggregated value, n is the number of measures, x_i is the alternative's score on the ith measure, v_i(x_i) is the single-dimensional value of the score x_i, and w_i is the weight of the ith measure, subject to

\sum_{i=1}^{n} w_i = 1   (5.2)

(all weights sum to 1).
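
A minimal Python sketch of Equations 5.1 and 5.2 with the walk-away prescreen might look like the following. The dictionaries are assumed inputs (value functions from the earlier step, weights from the swing weight matrix), and the walk-away comparison is written for "larger is better" measures; none of these structures are prescribed by the chapter.

```python
# A minimal sketch (Python) of the additive value model with a walk-away prescreen.
def aggregated_value(scores, value_functions, weights, walk_away):
    """Aggregated value per Equations 5.1 and 5.2.

    scores:          {measure: raw consequence score}
    value_functions: {measure: callable mapping a raw score to value}
    weights:         {measure: weight}, assumed to sum to 1 (Equation 5.2)
    walk_away:       {measure: minimum acceptable raw score}
    """
    # Prescreen: an alternative that misses any walk-away point gets zero value,
    # regardless of how well it performs on the other measures.
    # (For "smaller is better" measures the comparison direction would flip.)
    for measure, threshold in walk_away.items():
        if scores[measure] < threshold:
            return 0.0
    # Equation 5.1: weighted sum of single-dimensional values.
    return sum(w * value_functions[m](scores[m]) for m, w in weights.items())
```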

Table 5.16 Value Scorecard for sUAV example

This chapter is devoted to the pragmatic application of the aggregation technique, but a thorough treatment of the mathematical foundation for the additive value model is provided in Chapter 2 and in Keeney (1981), Stewart (1996), and Von Winterfeldt and Edwards (1986).

With the weights in hand, one can construct aggregated visualizations such as the value component graph shown in Figure 5.14. In a value component graph, each alternative's total value is represented by the total length of a segmented bar, and each bar segment represents a given measure's contribution, that is, the alternative's single-dimensional value on that measure multiplied by the measure's weight (Parnell et al., 2013). As discussed in Section 5.3.2 and illustrated in Table 5.3, every objective has one and only one measure.

Figure 5.14 Value component graph structure
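
A value component graph can be rendered directly from the per-measure weighted values. The matplotlib sketch below stacks each measure's contribution (weight times single-dimensional value) into one bar per alternative; the alternative names, measure names, and numbers are placeholders, not data from the sUAV study.

```python
# A minimal sketch (Python/matplotlib) of a value component graph: each
# alternative's bar is built from per-measure segments equal to the measure's
# weight times the alternative's single-dimensional value on that measure.
import matplotlib.pyplot as plt

alternatives = ["Alt 1", "Alt 2", "Alt 3"]
measures = ["Measure A", "Measure B", "Measure C"]
# weighted_value[alt][j] = weight_j * v_j(x_alt,j), illustrative numbers only
weighted_value = {
    "Alt 1": [20, 15, 10],
    "Alt 2": [10, 30, 15],
    "Alt 3": [25, 10, 30],
}

fig, ax = plt.subplots()
left = {a: 0.0 for a in alternatives}          # running bar length per alternative
for j, m in enumerate(measures):
    widths = [weighted_value[a][j] for a in alternatives]
    ax.barh(alternatives, widths, left=[left[a] for a in alternatives], label=m)
    for a, w in zip(alternatives, widths):
        left[a] += w

ax.set_xlabel("Total value (sum of weighted measure values)")
ax.legend(title="Contribution by measure")
plt.tight_layout()
plt.show()
```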

The heart of a decision support process for systems engineering trade analysis is the ability to integrate otherwise separate analyses into a coherent, system-level view that traces consequences of design decisions across all dimensions of stakeholder value. The stakeholder value scatterplot illustrated in Figure 5.15 shows in one chart how all system-level alternatives respond in multiple dimensions of stakeholder value.

Figure 5.15 Stakeholder value scatterplot structure

Figure 5.15 illustrates the structure of a stakeholder value scatterplot showing how the six hypothetical alternatives respond to four dimensions of stakeholder value – performance value, life cycle cost, development schedule, and long-term viability. Each system alternative is represented by a scatterplot marker. An alternative's life cycle cost and performance value are indicated by a marker's x and y positions, respectively. An alternative's development duration is indicated by the color of the marker per heat map conventions shown in the legend, while the long-term viability of a particular alternative is indicated by the shape of the marker as described in the legend.
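As a sketch of how such a four-dimensional view can be produced, the following matplotlib snippet encodes life cycle cost and performance value as marker position, development duration as marker color, and long-term viability as marker shape; the alternatives and their values are invented placeholders.

```python
# A minimal sketch (Python/matplotlib) of a stakeholder value scatterplot:
# x = life cycle cost, y = performance value, color = development duration,
# marker shape = long-term viability. All data below are illustrative.
import matplotlib.pyplot as plt

# (name, life_cycle_cost_$B, performance_value, dev_years, viability)
alternatives = [
    ("Alt 1", 2.0, 55, 3, "high"),
    ("Alt 2", 4.5, 70, 6, "low"),
    ("Alt 3", 6.0, 85, 8, "high"),
]
marker_for_viability = {"high": "o", "low": "s"}   # shape conveys viability

fig, ax = plt.subplots()
for name, cost, value, years, viability in alternatives:
    sc = ax.scatter(cost, value, c=[years], cmap="RdYlGn_r", vmin=2, vmax=10,
                    marker=marker_for_viability[viability], s=120)
    ax.annotate(name, (cost, value), xytext=(5, 5), textcoords="offset points")

fig.colorbar(sc, ax=ax, label="Development duration (yrs)")
ax.set_xlabel("Life cycle cost ($B)")
ax.set_ylabel("Performance value")
plt.show()
```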

5.3.6.1 Example of Developing a Multidimensional Value Model

Resuming the sUAV case study example, the lead systems engineer was eager to respond to the study sponsor's feedback and provide a more explicit representation of the cost, schedule, and performance trade between the alternatives under consideration. Making use of the additive value model and the weighting scheme described in Figure 5.10, along with the value scorecard in Table 5.16, the lead systems engineer was able to create the aggregated value visualizations shown in Figures 5.16 and 5.17.

Figure 5.16 Value component chart for sUAV

Figure 5.17 Value scatterplot for the sUAV example

The lead systems engineer was excited about the value scatterplot and once again hurried to show this visualization to the study sponsor and several other stakeholders. The feedback he received was again glowing; all agreed that this particular visualization clearly showed the cost/schedule/performance trade. However, this time the study sponsor asked to understand how variations in priority weightings would impact the results. The next section in this chapter covers the best practices associated with identifying uncertainty and conducting probabilistic analysis followed by an application of these techniques to the sUAV case study.

5.3.7 Identify Uncertainty and Conduct Probabilistic Analysis

As part of the assessment, it is important for the subject matter expert to explicitly discuss potential uncertainty surrounding the assessed score and the variables that could impact one or more scores. One source of uncertainty that is common within systems engineering trade-off analyses exploring various system architectures is technology immaturity. System design concepts are generally described as a collection of subsystem design choices, but if some of the design choices include technologies that are immature, there may be a lack of detail associated with component-level design decisions that will eventually be made downstream during detailed design. Many times the subject matter expert can assess an upper, nominal, and lower bound measure response by making three separate assessments: (i) assuming low performance, (ii) assuming moderate performance, and (iii) assuming high performance.

Another source of uncertainty has to do with the subjective nature of the value schemes elicited from the stakeholders. Considering that a stakeholder's value scheme is often tied to their forecast of future scenarios, and acknowledging that the stakeholder is probably not clairvoyant, leads to the conclusion that people can reasonably disagree about things such as priority weightings. One of the common pitfalls of systems engineering trade-off analyses is to collect value scheme information from a small set of like-minded stakeholders and ignore the fact that value schemes likely vary across the full population of stakeholders. The best practice is to collect value scheme information from many different stakeholders and then run a battery of sensitivity analyses on the priority weightings to ensure that a meaningful decision can be made in the presence of such uncertainty. The next section of this chapter applies some of the techniques for identifying uncertainty to the sUAV example, and the subsequent section covers techniques used to assess the impact of uncertainty.

5.3.7.1 Example of Identifying Uncertainty and Conducting Probabilistic Analysis

Recall the sUAV case study and how the study sponsor asked the lead systems engineer to show how variations in priority weightings would impact the results. Toward this end, the lead systems engineer discussed with the study sponsor the composition of the focus group used to develop the priority weightings and asked him to recommend stakeholder representatives who might have a different view on how weightings for this trade should be set. The study sponsor provided the systems engineering lead with a list of contacts that he suspected would offer a somewhat different take on the weights associated with this trade. Focus group number 2 was formed from this list, and the lead systems engineer developed Figure 5.18 to highlight the differences in the priority weightings generated by the two groups.

Figure 5.18 Weightings as generated by focus group 1 and focus group 2

The lead systems engineer noticed the clear differences between the two groups' areas of emphasis: group 1 emphasized ISR data collection quality, while group 2 emphasized soldier mobility while transporting the sUAV system. He realized that capturing this source of uncertainty is the first step toward assessing its impact on the overall decision. The next section of this chapter describes some sensitivity analyses that can be used to gain this understanding, followed by a return to the sUAV case study.

5.3.8 Assess Impact of Uncertainty

Decision analysis uses many forms of sensitivity analysis including line diagrams, tornado diagrams, and waterfall diagrams and several uncertainty analyses including Monte Carlo Simulation, decision trees, and influence diagrams (Parnell et al., 2013). Due to space limits, only line diagrams of sensitivity to weighting and Monte Carlo Simulation are discussed in this section.

Many decision-makers will want to understand how sensitive a particular recommendation is to the weightings and will ask how much a particular weighting would need to change in order to change the recommended alternative. A common approach to visualizing the impact of measure weighting on overall value is to sweep each measure's weighting from its absolute minimum to its absolute maximum while holding the relative relationship between the other measure weightings constant and noting changes to the overall score. The output of this type of sensitivity analysis takes the form of a line graph (Parnell et al., 2011). An example of such a graph is provided in Figure 5.19. Note that this particular example shows how sweeping the weight associated with Objective 1.1.2 impacts performance value. The graph shows that the alternative with the highest performance value is alternative 2 for all cases where the priority weighting associated with Objective 1.1.2 is relatively low, but as the weight of Objective 1.1.2 increases to 0.8 and above, alternative 4 emerges as the high performer.

Figure 5.19 Weight sensitivity line graph structure
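
One way to produce the data behind such a line graph is to renormalize the remaining weights as the weight of interest is swept, as in the Python sketch below; the objective names, weights, and single-dimensional values are illustrative assumptions, not the values behind Figure 5.19.

```python
# A minimal sketch (Python) of a weight sensitivity sweep: one measure's weight
# is swept from 0 to 1 while the remaining weights keep their relative
# proportions, and the aggregated value of each alternative is recomputed.
def renormalized_weights(base_weights, swept_measure, swept_weight):
    """Assign swept_weight to one measure; scale the others proportionally."""
    rest = {m: w for m, w in base_weights.items() if m != swept_measure}
    rest_total = sum(rest.values())
    scaled = {m: (1.0 - swept_weight) * w / rest_total for m, w in rest.items()}
    scaled[swept_measure] = swept_weight
    return scaled

def sweep(values_by_alt, base_weights, swept_measure, steps=11):
    """Return {swept_weight: alternative with the highest aggregated value}."""
    best = {}
    for k in range(steps):
        wt = k / (steps - 1)
        weights = renormalized_weights(base_weights, swept_measure, wt)
        totals = {alt: sum(weights[m] * v[m] for m in weights)
                  for alt, v in values_by_alt.items()}
        best[round(wt, 2)] = max(totals, key=totals.get)
    return best

values_by_alt = {   # single-dimensional values v_m(x) per alternative (illustrative)
    "Alt 2": {"OBJ 1.1.1": 90, "OBJ 1.1.2": 40, "OBJ 1.1.3": 80},
    "Alt 4": {"OBJ 1.1.1": 60, "OBJ 1.1.2": 95, "OBJ 1.1.3": 50},
}
base_weights = {"OBJ 1.1.1": 0.4, "OBJ 1.1.2": 0.2, "OBJ 1.1.3": 0.4}
print(sweep(values_by_alt, base_weights, "OBJ 1.1.2"))
```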

Sometimes, it is useful to consider all uncertainties at once instead of investigating the impact of one particular uncertainty at a time. For this type of view, consider the tradespace visualization in Figure 5.20. Once all the uncertainties have been assessed, Monte Carlo simulations (Chapter 3) can be executed to identify the uncertainties that impact the decision findings and those that are inconsequential. For example, Figure 5.20 shows that after considering all sources of uncertainty, alternative 1 is less susceptible to changes in stakeholder value than alternative 3. Note, however, that although the stakeholder value of alternative 3 is a bit more volatile in the presence of uncertainty, its stakeholder value never falls below the highest level of alternative 1's stakeholder value. This graph also indicates that alternative 3 has the edge on long-term viability, whereas the development duration differentiation is nil. Consequently, decision-makers may be inclined to pursue alternative 3 over alternative 1 if the difference in life cycle cost is deemed affordable.

Figure 5.20 Stakeholder value scatterplot with uncertainty
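
A Monte Carlo treatment can reuse the same additive model, drawing each uncertain single-dimensional value from the SME's low/nominal/high assessments. The Python sketch below uses a triangular distribution as a modeling assumption and summarizes the resulting spread of aggregated value for one hypothetical alternative; all measure names and numbers are illustrative placeholders.

```python
# A minimal sketch (Python) of Monte Carlo analysis over the additive value
# model: uncertain single-dimensional values are sampled between SME-assessed
# low/nominal/high bounds and the spread of aggregated value is summarized.
import random
import statistics

def simulate(value_bounds, weights, trials=10_000, seed=1):
    """value_bounds: {measure: (low, nominal, high)} single-dimensional values."""
    rng = random.Random(seed)
    draws = []
    for _ in range(trials):
        total = 0.0
        for m, (low, nominal, high) in value_bounds.items():
            total += weights[m] * rng.triangular(low, high, nominal)
        draws.append(total)
    cuts = statistics.quantiles(draws, n=20)           # 5th ... 95th percentiles
    return {"mean": statistics.mean(draws), "p05": cuts[0], "p95": cuts[-1]}

weights = {"mobility": 0.3, "dwell": 0.4, "imagery": 0.3}
value_bounds = {   # (low, nominal, high) values for one hypothetical alternative
    "mobility": (40, 60, 70),
    "dwell": (50, 70, 90),
    "imagery": (60, 80, 95),
}
print(simulate(value_bounds, weights))
```

The resulting mean and percentile spread for each alternative are the kind of summary statistics that feed the box-and-whisker markers of a stakeholder value scatterplot with uncertainty.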

The takeaway of this section may be that good decisions can often be made in the presence of high uncertainty. Systems engineers should not let what is not known sabotage a decision-making opportunity if what is known is sufficient.

5.3.8.1 Example of Assessing the Impact of Uncertainty

Picking up the sUAV case study, the lead systems engineer applied two sensitivity analysis techniques to assess the impact of the uncertainty surrounding priority weightings. As a first step toward understanding the degree of decision volatility introduced by changes in weighting schemes, he generated a full set of performance value sensitivity line graphs, one graph per measure. Figure 5.21 shows one of these line graphs, the one that shows changes in functional performance value as the priority weight associated with the “avoid impeding soldier sprint” objective is swept from 0 to 1. Notice that the top performing alternative only changes twice throughout the entire sweep: from Crow I to Crow II at 0.45 and from Crow II to Buzzard II at 0.75.

Figure 5.21 sUAV performance value sensitivity to changes in priority weight of “avoid impeding soldier sprint” objectives

Although the set of line graphs was interesting and shed some light on the degree of volatility pertaining to this specific decision, the lead systems engineer feared that such graphs would not directly address the question raised by the study sponsor. For this, he decided to generate a stakeholder value scatterplot with uncertainty, shown in Figure 5.22. Notice how this graph clearly shows that Crow I maintains the highest functional performance value under either weight set considered.

Figure 5.22 sUAV stakeholder value scatterplot with uncertainty

The study sponsor was thrilled when the lead systems engineer presented the visualization of Figure 5.22, and the graph formed the focal point for many thoughtful negotiations among the stakeholders and the decision authority.

5.3.9 Improve Alternatives

One could be tempted to end the decision analysis here, highlight the alternative that has the highest total value, and claim success. Such a premature ending, however, would not be considered best practice. Mining the data generated for the first set of alternatives will likely reveal opportunities to modify some subsystem design choices to claim untapped value and reduce risk. Recall the cyclic decision process map and the implied feedback. Taking advantage of this feedback loop and using initial findings to generate new and creative alternatives starts the process of transforming the decision process from “Alternative-Focused Thinking” to “Value-Focused Thinking” (Keeney 1992). To complete the transformation from alternative-focused thinking to value-focused thinking, consider taking additional steps to spark focused creativity to overcome anchoring biases. As Keeney warns,

Once a few alternatives are stated, they serve to anchor thinking about others. Assumptions implicit in the identified alternatives are accepted, and the generation of new alternatives, if it occurs at all, tends to be limited to a tweaking of the alternatives already identified. Truly creative or different alternatives remain hidden in another part of the mind, unreachable by mere tweaking. Deep and persistent thought is required to jar them into consciousness. (Keeney 1993)

To help generate a creative and comprehensive set of alternatives, consider using an alternative generation table (also called a morphological box) (Buede, 2009; Parnell et al., 2011), which pairs each design decision with its candidate options and enumerates the combinations, as sketched below. Chapter 8 presents this and other alternative generation techniques.
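
The following Python sketch enumerates the combinations in a small alternative generation table; the design decisions and options are invented placeholders, not the chapter's sUAV design choices.

```python
# A minimal sketch (Python) of an alternative generation table (morphological
# box): each design decision is listed with its candidate options, and the
# Cartesian product enumerates candidate system concepts.
from itertools import product

design_decisions = {          # illustrative design decisions and options
    "airframe": ["fixed wing", "quadrotor"],
    "sensor": ["EO only", "EO/IR"],
    "datalink": ["line of sight", "beyond line of sight"],
}

def generate_alternatives(decisions):
    """Yield each combination of options as a {decision: option} concept."""
    names, option_lists = zip(*decisions.items())
    for combo in product(*option_lists):
        yield dict(zip(names, combo))

for concept in generate_alternatives(design_decisions):
    print(concept)
```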

5.3.9.1 Example of Improving Alternatives

Within the sUAV example, the lead systems engineer worked with the pool of subject matter experts and sUAV design engineers to explore ways to reduce the time required to develop Crow I from about 8 years to 5 or 6 years without diminishing its functional performance value.

5.3.10 Communicating Trade-Offs

This is the point in the process where the decision team identifies key observations regarding what stakeholders seem to want and what they must be willing to give up in order to achieve it. It is here that the decision team can highlight the design decisions that most influence stakeholder value and those that are inconsequential. In addition, the important uncertainties and risks should be identified. Observations regarding the combined effects of various design decisions are also important products of this process step. Competing objectives that are driving the trade should be explicitly highlighted as well.

Beyond the top-level tradespace visualization products, the decision analyst must be able to rapidly drill down to supporting rationale. The decision support tool construct represented in Figure 5.23 allows the decision team to navigate seamlessly from the top-level stakeholder value scatterplot all the way down to any structured scoring sheet, so that the rationale for any given score is only a click away. Rapid access to the rationale behind the derivation of the value functions and priority weightings is also essential for full traceability.

Figure 5.23 Decision support model construct

5.3.10.1 Example of Communicating Trade-Offs

Concluding the sUAV case study, the lead systems engineer presented the sUAV systems engineering trade-off analysis to the study sponsor and the supporting stakeholder senior advisory group. He used Figure 5.22 to summarize the study findings and Figure 5.23 to provide an appreciation for the process used and the heritage of the underpinning data. He ended the talk with a crisp summary of the decision at hand: if an 8-year development time is acceptable, then Crow I offers superior performance at a very attractive life cycle cost point; if the capability is urgently needed, Dove I can be fully developed within 4 years but comes with about a 15% drop in performance value and about a 90% increase in life cycle cost relative to Crow I.

5.3.11 Present Recommendation and Implementation Plan

It is often helpful to describe the recommendation in the form of a clearly worded, actionable task list to increase the likelihood that the decision process leads to action, thus delivering tangible value to the sponsor. Reports are important for historical traceability and future decisions. Take the time and effort to create a comprehensive, high-quality report detailing the study findings and supporting rationale. Consider static paper reports augmented with dynamic hyperlinked e-reports.

5.4 Summary

The decision management process discussed in this chapter integrates decision analysis best practices with systems engineering activities to create a baseline from which the next chapters can explore innovations that further enhance trade-off study quality. The process enables enterprises to develop an in-depth understanding of the complex relationship between requirements, the design choices made to address each requirement, and the system-level consequences of the sum of design choices across the full set of performance requirements as well as other elements of stakeholder value, including cost and schedule. Through data visualization techniques, decision-makers can quickly understand and crisply communicate a complex tradespace and converge on recommendations that are robust in the presence of uncertainty.

The decision management approach is based on several best practices:

  1. Align the decision process with the systems engineering process (Figure 5.2).
  2. Use the sound mathematical techniques of decision analysis for trade-off studies (Chapter 2).
  3. Develop one master decision model and refine, update, and use it as required for trade-off studies throughout the system development life cycle (Parnell et al., 2013) (Figure 5.3).
  4. Use Value-Focused Thinking (Keeney, 1992) to create better alternatives (Section 5.3.9).
  5. Identify uncertainty and assess risks for each decision (Parnell et al., 2013) (Section 5.3.8).

5.5 Key Terms

  1. Integrated Systems Engineering Decision Management (ISEDM) Process: A procedure that combines the holistic perspective required for the design of complex systems with mathematical rigor needed to properly represent and communicate assessments of system level alternatives across all elements of stakeholder value.
  2. Fundamental Objectives: The essential ends that a decision-maker is trying to achieve.
  3. Measures: Scales established to assess the degree to which various alternatives satisfy the fundamental objectives.
  4. Value Function: Value functions describe the degree of satisfaction stakeholders perceive at each point along the measure scale.
  5. Swing Weight Matrix: A structured technique to elicit stakeholder perception of the level of importance of each measure and the differentiation in each measure range in order to determine meaningful swing weights.
  6. Assessment Flow Diagram: A mapping technique to trace the relationships between physical means, intermediate measures, and the measures associated with fundamental objectives.
  7. Consequence Scorecard: Assessment results for each alternative/measure combination summarized in tabular form.
  8. Value Scorecard: Assessment results for each alternative/measure combination transformed into value space and summarized in tabular form.
  9. Value Component Graph: An aggregated value visualization in which each alternative's total value is represented by the total length of a segmented bar, and each bar segment represents a given measure's contribution, that is, the alternative's single-dimensional value on that measure multiplied by the measure's weight.
  10. Stakeholder Value Scatterplot: An aggregated value visualization plotting alternatives' assessed values for two dimensions of stakeholder value on Cartesian coordinates, with a third and fourth dimension of stakeholder value conveyed by marker shape and color.
  11. Stakeholder Value Scatterplot with Uncertainty: Markers of the stakeholder value scatterplot are augmented with box-and-whisker plots to express uncertainty associated with assessed value in a particular dimension of stakeholder value.

5.6 Exercises

The questions with an * apply only if you are working in an organization performing trade-off analyses.

  5.1 Decision management processes
    a. Why does an organization need a decision management process using decision analysis?
    b. What is the intended output of the decision analysis process when applied to a new product development context?
    c. List and explain the 10 steps of the decision process as described in this chapter.
    d. What is the circular shape of the Integrated Systems Engineering Decision Management (ISEDM) process meant to convey?
    e. (*) Which steps are used in your organization?
    f. (*) Which of the missing steps would provide the most value to your organization?
  5.2 Decision support models using decision analysis
    a. Why is a decision support model considered to be a composite model?
    b. What are two key properties of a well-developed fundamental objective?
    c. What are five key properties of a well-developed set of fundamental objectives?
    d. What are five key properties of a well-developed measure?
    e. In terms of value function creation, what is meant by a “walk-away” point? What is meant by a “stretch goal”?
    f. What is the potential benefit of aggregating several single-dimensional values to create a synthesized value visualization?
  5.3 Analysis of uncertainty and risk
    a. List two sources of uncertainty common within systems engineering trade-off analyses.
    b. Which steps in the ISEDM process explicitly consider uncertainty and risk?
  5.4 Tradespace visualization
    a. List three tradespace visualization techniques discussed in this chapter.
    b. What insights does each visualization technique provide?
    c. Explain how the cost versus value plot relates to affordability analysis (described in Chapter 4).
    d. Which techniques are used in your organization?
  5.5 Perform a trade-off analysis using the ISEDM process illustrated in this chapter.

References

  1. Buede, D.M. and Choisser, R.W. (1992) Providing an analytic structure for key system design choices. Journal of Multi-Criteria Decision Analysis, 1 (1), 17–27.
  2. Buede, D.M. (2009) The Engineering Design of Systems: Models and Methods, Wiley.
  3. Cilli, M. (2015) Improving Defense Acquisition Outcomes Using an Integrated Systems Engineering Decision Management (ISEDM) Approach. PhD Dissertation. Stevens Institute of Technology, Hoboken, NJ.
  4. Edwards, W., Miles, R.F. Jr., and Von Winterfeldt, D. (2007) Advances in Decision Analysis: from Foundations to Applications, Cambridge University Press.
  5. Gundlach, J. (2012) Designing Unmanned Aircraft Systems: A Comprehensive Approach, American Institute of Aeronautics and Astronautics.
  6. Green, P.E., Krieger, A.B., and Wind, Y. (2001) Thirty years of conjoint analysis: reflections and prospects. Interfaces, 31, S56–S73.
  7. INCOSE SE Handbook Working Group (2015) Chapter 4: Business or mission analysis, Chapter 5: Technical management processes, Chapter 9: Cross-cutting systems engineering methods, in Systems Engineering Handbook: A Guide for System Life Cycle Process and Activities, 4th edn (eds D.D. Walden, G.J. Roedler, K.J. Forsberg et al.), International Council on Systems Engineering, Published by John Wiley & Sons, Inc., San Diego, CA.
  8. Keeney, R.L. and Raiffa, H. (1976) Decisions with Multiple Objectives Preferences and Value Tradeoffs, Wiley, New York, NY.
  9. Keeney, R.L. (1992) Value-Focused Thinking: A Path to Creative Decisionmaking, Harvard University Press, Cambridge, Massachusetts.
  10. Keeney, R.L. (1993) Creativity in MS/OR: value-focused thinking—Creativity directed toward decision making. Interfaces, 23 (3), 62–67.
  11. Keeney, R.L. (2004) Making better decision makers. Decision Analysis, 1 (4), 193–204.
  12. Keeney, R.L. and Gregory, R.S. (2005) Selecting attributes to measure the achievement of objectives. Operations Research, 53 (1), 1–11.
  13. Kirkwood, C.W. (1997) Strategic Decision Making: Multiobjective Decision Analysis with Spreadsheets, Duxbury Press, Belmont, CA.
  14. Miller, G.A. (1956) The magical number seven, plus or minus two: some limits on our capacity for processing information. Psychological Review, 63 (2), 81.
  15. Parnell, G.S., Driscoll, P.J., and Henderson, D.L. (eds) (2011) Decision Making for Systems Engineering and Management, 2nd edn, Wiley & Sons Inc., Wiley Series in Systems Engineering.
  16. Parnell, G., Bresnick, T., Tani, S., and Johnson, E. (2013) Handbook of Decision Analysis, Wiley & Sons.
  17. SEBoK authors (2015) Decision Management, in Guide to the Systems Engineering Body of Knowledge (SEBoK), version 1.4, R.D. Adcock (EIC), The Trustees of the Stevens Institute of Technology, Hoboken, NJ, http://sebokwiki.org/w/index.php?title=Decision_Management&oldid=50860 (accessed 16 June 2015). BKCASE is managed and maintained by the Stevens Institute of Technology Systems Engineering Research Center, the International Council on Systems Engineering, and the Institute of Electrical and Electronics Engineers Computer Society.
  18. ISO/IEC (2015) ISO/IEC 15288:2015, Systems and Software Engineering – System Life Cycle Processes, International Organization for Standardization.
  19. Keeney, R.L. (1981) Analysis of preference dependencies among objectives. Operations Research, 29 (6), 1105–1120.
  20. Stewart, T.J. (1996) Robustness of additive value function methods in MCDM. Journal of Multi-Criteria Decision Analysis, 5 (4), 301–309.
  21. Von Winterfeldt, D. and Edwards, W. (1986) Decision Analysis and Behavioral Research, Cambridge University Press, Cambridge.