by Ronald A Howard ©
The Stanford Decisions and Ethics Center
Although decision analysis has developed significantly over the last two decades, the basic principles of the field have served well. They are unlikely to change because they are based on simple logic. In the first part of this paper, we summarize the original, fundamental concepts of decision analysis; in the second part, we show how the discipline has evolved.
PART I: A BRIEF DESCRIPTION OF DECISION ANALYSIS
Making important decisions often requires treating major uncertainty, long time horizons, and complex value issues. To deal with such problems, the discipline of decision analysis was developed. The discipline comprises the philosophy, theory, methodology, and professional practice necessary to formalize the analysis of important decisions.
Overview of Decision Analysis
Decision analysis is the latest step in a sequence of quantitative advances in the operations research/management science field. Specifically, decision analysis results from combining the fields of systems analysis and statistical decision theory. Systems analysis, which grew as a branch of engineering, was good at capturing the interactions and dynamic behavior of complex situations. Statistical decision theory was concerned with logical decisions in simple, uncertain situations. The merger of these concepts creates a methodology for making logical decisions in complex, dynamic, and uncertain situations.
Decision analysis specifies the alternatives, information, and preferences of the decision-maker and then finds the logically implied decision.
Decision-making requires choosing between alternatives, mutually exclusive resource allocations that will produce outcomes of different desirabilities with different likelihoods. While the range of alternatives to be considered is set by the decision-maker, the decision analyst may be able to suggest new alternatives as the analysis progresses.
Since uncertainty is at the heart of most significant decision problems, decision-making requires specifying the amount of uncertainty that exists given available information. Many decision problems become relatively trivial if uncertainty is removed. For example, consider how easily a decision-maker could make a critical decision in launching a new commercial product if he could predict with certainty production and sales costs, price-demand relationships, and governmental decisions. Decision analysis treats uncertainty effectively by encoding informed judgment in the form of probability assignments to events and variables.
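To make this concrete, the following minimal sketch (in Python, with invented numbers for a hypothetical product launch) shows how such a probability assignment might be encoded and used to compute the expected profit of each alternative.

```python
# A minimal sketch, with invented numbers, of encoding informed judgment as a
# discrete probability assignment and using it to compare two alternatives.

# Assessed probability assignment on first-year demand (units sold).
demand_distribution = {      # outcome: probability
    50_000: 0.25,            # pessimistic scenario
    100_000: 0.50,           # nominal scenario
    200_000: 0.25,           # optimistic scenario
}
assert abs(sum(demand_distribution.values()) - 1.0) < 1e-9

UNIT_MARGIN = 12.0           # assumed dollars of contribution per unit sold
LAUNCH_COST = 1_000_000.0    # assumed fixed cost of launching

def expected_profit_of_launch() -> float:
    """Expected profit of the 'launch' alternative under the assigned probabilities."""
    return sum(p * (units * UNIT_MARGIN - LAUNCH_COST)
               for units, p in demand_distribution.items())

print(f"Launch:        {expected_profit_of_launch():>12,.0f}")
print(f"Do not launch: {0.0:>12,.0f}")
```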
Decision-making also requires assigning values to the outcomes of interest to the decision-maker. These outcomes may be as customary as profit or as troubling as pain. Decision analysis determines the decision-maker's trade-offs between monetary and non-monetary outcomes and also establishes in quantitative terms his preferences for outcomes that are risky or distributed over time.
One of the most basic concepts in decision analysis is the distinction between a good decision and a good outcome. A good decision is a logical decision -- one based on the information, values, and preferences of the decision-maker. A good outcome is one that is profitable, or otherwise highly valued. In short, a good outcome is one that we wish would happen. By making good decisions in all situations that face us, we hope to ensure as high a percentage of good outcomes as possible. We may be disappointed to find that a good decision has produced a bad outcome, or dismayed to learn that someone who has made what we consider to be a bad decision has achieved a good outcome. Short of having a clairvoyant, however, making good decisions is the best way to pursue good outcomes.
An important benefit of decision analysis is that it provides a formal, unequivocal language for communication among the people included in the decision-making process. During the analysis, the basis for a decision becomes evident, not just the decision itself. A disagreement about whether to adopt an alternative may occur because individuals possess different relevant information or because they place different values on the consequences. The formal logic of decision analysis subjects these component elements of the decision process to scrutiny. Information gaps can be uncovered and filled, and differences in values can be openly examined. Revealing the sources of disagreement usually opens the door to cooperative resolution.
The formalism of decision analysis is also valuable for vertical communication in a management hierarchy. The organizational value structure determined by policymakers must be wedded to the detailed information that the line manager, staff analyst, or research worker possesses. By providing a structure for delegating decision-making to lower levels of authority and for synthesizing information from diverse areas for decision-making at high levels, decision analysis accomplishes this union.
Methodology
The application of decision analysis often takes the form of an iterative procedure called the Decision Analysis Cycle (see Figure 1). Although this procedure is not an inviolable method of attacking the problem, it is a means of ensuring that essential steps have been considered.
The procedure is divided into three phases. In the first (deterministic) phase, the variables affecting the decision are defined and related, values are assigned, and the importance of the variables is measured without any consideration of uncertainty.
The second (probabilistic) phase starts with the encoding of probability on the important variables; then, the associated probability assignments on values are derived. This phase also introduces the assessment of risk preference, which defines the best solution in the face of uncertainty.
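To illustrate how risk preference enters this phase, here is a sketch with invented numbers; the exponential utility function and the particular risk tolerance are assumptions chosen for illustration, not prescriptions from the paper. It computes the certain equivalent of a risky venture and compares it with the venture's expected value.

```python
import math

# Risk preference expressed with an exponential utility function
# u(x) = 1 - exp(-x / rho), where rho is the risk tolerance in dollars.
RISK_TOLERANCE = 500_000.0   # assumed risk attitude of the decision-maker

def utility(x: float) -> float:
    return 1.0 - math.exp(-x / RISK_TOLERANCE)

def certain_equivalent(lottery) -> float:
    """The sure amount whose utility equals the expected utility of the lottery."""
    expected_utility = sum(p * utility(x) for x, p in lottery.items())
    return -RISK_TOLERANCE * math.log(1.0 - expected_utility)

risky_venture = {1_000_000.0: 0.5, -200_000.0: 0.5}   # profit: probability
sure_thing = {250_000.0: 1.0}

for name, deal in [("risky venture", risky_venture), ("sure thing", sure_thing)]:
    ev = sum(p * x for x, p in deal.items())
    print(f"{name}: expected value = {ev:,.0f}, "
          f"certain equivalent = {certain_equivalent(deal):,.0f}")
```

A risk-averse decision-maker will show a certain equivalent below the expected value of the risky venture, which is exactly the behavior the risk preference is meant to capture.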
In the third (informational) phase, the results of the first two phases are reviewed to determine the economic value of eliminating uncertainty in each of the important variables in the problem. In some ways, this is the most important phase because it shows just what it would be worth in dollars and cents to have perfect information. Comparing the value of information with its cost determines whether additional information should be collected.
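A sketch of that calculation for the hypothetical launch example above, again with invented numbers: the value of perfect information on demand is the difference between the value of deciding after demand is revealed and the value of the best decision made now.

```python
# Expected value of perfect information (EVPI) on an uncertain variable,
# illustrated with the hypothetical product-launch numbers used earlier.

demand_distribution = {50_000: 0.25, 100_000: 0.50, 200_000: 0.25}
UNIT_MARGIN, LAUNCH_COST = 12.0, 1_000_000.0

def profit(alternative: str, units: int) -> float:
    return units * UNIT_MARGIN - LAUNCH_COST if alternative == "launch" else 0.0

alternatives = ["launch", "do not launch"]

# Value without further information: choose the alternative with the
# best expected profit, averaging over the probability assignment.
value_now = max(
    sum(p * profit(a, u) for u, p in demand_distribution.items())
    for a in alternatives
)

# Value with perfect information: for each revealed demand level choose the
# best alternative, then average over the prior probabilities.
value_with_clairvoyance = sum(
    p * max(profit(a, u) for a in alternatives)
    for u, p in demand_distribution.items()
)

print(f"EVPI on demand = {value_with_clairvoyance - value_now:,.0f}")
```

Any proposed information-gathering program, such as a market survey, that costs more than this amount can be rejected out of hand, since no information can be worth more than perfect information.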
If there are further profitable sources of information, then the decision should be to gather the information rather than to make the primary decision at this time. The design and execution of the information-gathering program follows.
Since new information generally requires revisions in the original analysis, the original three phases must be performed once more. However, the additional work required to incorporate the modifications is usually slight, and the evaluation, rapid. At the decision point, it may again be profitable to gather new information and repeat the cycle, or it may be more advisable to act. Eventually, the decision to act will be made because the value of new analysis and information-gathering will be less than its cost.
Applying the above procedure ensures that the total effort is responsive to changes in information -- the approach is adaptive. Identifying the crucial areas of uncertainty can also aid in generating new alternatives for future analysis.
Model Sequence
Typically, a decision analysis is performed not with one, but with a sequence of progressively more realistic models. These models generally will be in the form of computer programs. The first model in the sequence is the pilot model, an extremely simplified representation of the problem useful only for determining the most important relationships. Although the pilot model looks very little like the desired final product, it is indispensable in achieving that goal.
The next model in the sequence is the prototype model, a quite detailed representation of the problem that may, however, still be lacking a few important attributes. Although it will generally have objectionable features that must be eliminated, it does demonstrate how the final version will appear and perform.
The final model in the sequence is the production model; it is the most accurate representation of reality that decision analysis can produce. It should function well even though it may retain features that are treated in a less than ideal way.
Starting with the pilot model, sensitivity analyses are used throughout each phase to guide its further evolution. If decisions are insensitive to changes in some aspect of the model, there is no need to model that particular aspect in more detail. The goal of a good modeler is to model in detail only those aspects of the problem that have an impact on the decisions, while keeping the costs of this modeling commensurate with the level of the overall analysis.
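One common form of such an analysis is a one-way sensitivity sweep: each variable is swung between its assessed low and high values while the others are held at their nominal values, and the resulting swings in the outcome are compared. The sketch below uses invented nominal values and ranges.

```python
# One-way sensitivity analysis: swing each input over its assessed range,
# holding the others at nominal, and rank the inputs by the resulting swing.

nominal = {"price": 20.0, "unit_cost": 8.0, "demand": 100_000, "fixed_cost": 1_000_000}
ranges = {                      # assessed (low, high) for each variable
    "price":      (15.0, 25.0),
    "unit_cost":  (6.0, 11.0),
    "demand":     (50_000, 200_000),
    "fixed_cost": (800_000, 1_200_000),
}

def profit(v: dict) -> float:
    return (v["price"] - v["unit_cost"]) * v["demand"] - v["fixed_cost"]

swings = {}
for name, (low, high) in ranges.items():
    results = [profit({**nominal, name: value}) for value in (low, high)]
    swings[name] = max(results) - min(results)

# Variables with the largest swings are the candidates for more detailed
# modeling and for probabilistic treatment in the next phase.
for name, swing in sorted(swings.items(), key=lambda item: -item[1]):
    print(f"{name:>10}: swing = {swing:,.0f}")
```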
Important aids in determining whether further modeling is economically justifiable are the calculations of the value of information. Some variables may be uncertain partially because detailed models have not been constructed. If the analyst can calculate the value of perfect information about these variables, he will have a standard to use in comparing the cost of any additional modeling. If the cost of modeling is greater than the value of perfect information, the modeling is clearly not economically justifiable.
Using a combination of sensitivity analysis and calculations of the value of information, the analyst continually directs the development of the model in an economically efficient way. An analysis conducted in this way provides not only answers, but also often insights for creating new alternatives. When completed, the model should be able to withstand the test of any good engineering design: additional modeling resources could be utilized with equal effectiveness in any part of the model. There is no such thing as a final or complete analysis; there is only an economic analysis given the resources available.
PART II: REFINEMENTS AND NEW DEVELOPMENTS IN DECISION ANALYSIS
Having seen the basic concepts of decision analysis and the main points of its professional practice, let us now examine some of the evolutionary changes in the field over the last two decades.
The Decision Basis
It has become useful to have a name for the formal description of a decision problem; we call it the decision basis. The decision basis consists of a quantitative specification of the three elements of the basis: the alternatives, the information, and the preferences of the decision-maker. We can then think of two essential steps in any decision analysis: the development and the evaluation of the decision basis.
Basis Development
To develop the decision basis, the decision analyst must elicit each of the three elements from the decision-maker or from his delegates. For example, in a medical problem, the ultimate decision-maker should be the patient. The patient would provide the element of preference in the basis, probably in a series of interviews with the decision analyst. In most cases, however, the patient will delegate the alternative and information elements to doctors who, in turn, would be interviewed by the decision analyst. The analyst should be able to certify that the decision basis accurately represents the alternatives, information, and preferences provided directly or indirectly by the decision-maker. We should note here that the alternatives must include alternatives of information-gathering, such as tests, experimental programs, surveys, or pilot plants.
One key issue is the extent to which the decision analyst can provide substantive portions of the decision basis by acting as an expert. In many circumstances, the analyst cannot be an expert because he has only a lay knowledge of the decision field. Even when the analyst does have substantial knowledge of the subject area, he should make clear to the decision-maker when he has changed from the role of decision analyst to that of substantive expert. Playing the role of expert can also force the analyst to defend his views against those of others; to this extent, he would be less of a "fair witness" in the subsequent analysis. Nevertheless, this possible loss of impartiality and fresh viewpoint must be balanced against the communication advantages of dealing with an analyst familiar with the decision field.
Basis Evaluation
Once the basis is developed, the next step is to evaluate it using the sensitivity analysis and value of information calculations described earlier. However, casting the problem as a decision basis shows that value-of-information calculations, important as they are, focus on only one element of the basis -- information.
Using the concept of the basis, we can also compute the value of a new alternative, which we might call the value of control. Such a calculation might well motivate the search for an alternative with certain characteristics and perhaps even the development of such an alternative.
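One way to sketch such a calculation, reusing the hypothetical launch example, is to treat control as the ability to set the uncertain variable to its most favorable value; the result is an idealized upper bound on the value of any real alternative that merely influences that variable.

```python
# Value of control: how much would it be worth to set demand to its most
# favorable level, rather than merely learn it in advance?

demand_distribution = {50_000: 0.25, 100_000: 0.50, 200_000: 0.25}
UNIT_MARGIN, LAUNCH_COST = 12.0, 1_000_000.0

def profit(alternative: str, units: int) -> float:
    return units * UNIT_MARGIN - LAUNCH_COST if alternative == "launch" else 0.0

alternatives = ["launch", "do not launch"]

# Value of the best decision under the current basis.
value_now = max(
    sum(p * profit(a, u) for u, p in demand_distribution.items())
    for a in alternatives
)

# With control, the decision-maker chooses both the alternative and the level
# of the controlled variable, so no averaging over probabilities is needed.
value_with_control = max(profit(a, u)
                         for a in alternatives
                         for u in demand_distribution)

print(f"Value of control over demand = {value_with_control - value_now:,.0f}")
```

A proposed new alternative that merely shifts the demand distribution, such as an advertising campaign, could then be valued by recomputing value_now with the shifted distribution in place of the original one.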
One can perform a similar sensitivity analysis to preference with the intention not of changing preference, but of ensuring that preferences have been accurately assessed. A large change in value resulting from a small change in preference would indicate the need for more interviews about preference.
A Revised Cycle
Using the concept of the basis, we may wish to restructure the decision analysis cycle in the four-phase form shown in Figure 2. Here, the information gathering that must precede analysis or augment subsequent analyses has been included in a basis development phase. The deterministic and probabilistic phases are essentially unchanged, but the informational phase -- renamed "basis appraisal" -- is expanded to include the examination of all three basis elements.
A Refined Analysis Sequence
As a problem is analyzed, the analysis may progress through the decision analysis cycle several times in increasing levels of detail. The basic distinction is between the pilot and full-scale analysis. The pilot analysis is a simplified, approximate, but comprehensive, analysis of a decision problem. The dictionary defines pilot as "serving as a tentative model for future experiment or development." The full-scale analysis is an increasingly realistic, accurate, and justifiable analysis of a decision problem, where full-scale is defined as "employing all resources, not limited or partial." To understand these distinctions, we must explain in more detail what constitutes a pilot or full-scale analysis.
The purpose of a pilot analysis is to provide understanding and establish effective communication about the nature of the decision and the major issues surrounding it. The content of the pilot analysis is a simplified decision model, a tentative preference structure, and a rough characterization of uncertainty. From a pilot analysis, the decision-maker should expect preliminary recommendations for the decision and the analyst should expect guidance in conducting the full-scale analysis.
The purpose of the full-scale analysis is to find the most desirable action, given the fully developed decision basis. The full-scale analysis consists of a balanced and realistic decision model, preferences that have been certified by the decision-maker, and a careful representation of important uncertainties. From the full-scale analysis, the decision-maker should expect a recommended course of action.
While most analyses progress from pilot to full-scale, some are so complex that valuable distinctions may be made between different stages of full-scale analysis.
The first stage of full-scale analysis is the prototypical stage, which is intended to reveal weaknesses and excesses in the full-scale analysis that are worthy of correction. A prototype is defined as "an original type, form, or instance that serves as a model on which later stages are based or judged."
After the indicated corrections have been made, the analyst has an integrated stage of full-scale analysis that provides the decision-maker with confidence in having a unified, balanced, and economic analysis as a basis for decision. To integrate is "to make into a whole by bringing all parts together: unify." If a decision-maker is making a personal decision that will not require the support or approval of others, then the integrated stage of full-scale analysis is all that is required. However, if the decision-maker must convince others of the wisdom of the chosen course of action or even defend that course against hostile elements, then an additional stage of full-scale analysis will be necessary -- the defensible stage.
The defensible stage of full-scale analysis is intended to demonstrate to supportive, doubtful, and possibly hostile audiences that the analysis provides an appropriate basis for decision. Defensible means "capable of being defended, protected, or justified." Typically, defensible analyses are necessary for important decisions in the public arena; however, even private enterprises may wish to conduct defensible analyses to win the support of workers, financial institutions, or venture partners. Defensible analyses are very demanding because they must show not only that the basis used is reasonable, but also that other possible bases that would lead to different decisions are not reasonable.
Contributions from Psychological Research
One of the most significant factors influencing the practice of decision analysis in recent years has been new knowledge about cognitive processes from the field of psychology. This research, centering on the contributions of Kahneman and Tversky, has had two major effects. First, the research on cognitive biases [10] has shown the need for subtlety and careful procedure in eliciting the probabilistic judgments on which decision analysis depends. Second, and perhaps even more important, the descriptive research on how people actually make decisions [6,11] shows that man is considerably less skilled in decision-making than expected. The main thrust of this research shows that people violate the rules of probabilistic logic in even quite simple settings. When we say that people violate certain rules, we mean that when they are made aware of the implications of their choices, they often wish they had made another choice: that is, they realize they have made a mistake. While these mistakes can be produced in analyzing simple decision settings, they become almost unavoidable when the problem is complex.
These findings may change our interpretation of the logical axioms that are the foundations of decision analysis. We have always considered these axioms as normative: they must be satisfied if our decisions are to have many properties that we would regard as desirable. If a particular individual did not satisfy the axioms, then he would be simply making mistakes in the view of those who followed the axioms. While this interpretation is still possible, a more appropriate way to look at the axioms is that they describe what any person would do if faced with a situation as simple as the one described by the axioms. In other words, the axioms are descriptive of human behavior for simple situations. If, however, the situation becomes more complex, more "opaque" as opposed to "transparent," the axioms are no longer descriptive because the person may unintentionally violate the axiom systems.
We may now think of the job of the decision analyst as that of making "opaque" situations "transparent," so that the person clearly sees what to do. This interpretation of the work may not make it any easier, but it is far more humane than the view that the analyst is trying to impose logic on a willfully illogical world.
Influence Diagrams
The influence diagram is one of the most useful concepts developed in decision analysis [3]. The analyst has always faced the problem of how to reduce the multifaceted knowledge in people's heads to a form that could meet the rigid tests of explicitness and consistency required by a computer. The influence diagram is a major aid in this transformation because it crosses the border between the graphic view of relationships that is very convenient for human beings and the explicit equations and numbers that are the province of present computers. To find a device that can readily be sketched by a layman and yet be so carefully defined that useful theorems concerning it can be proved by formal methods is rare. Although there is a danger that people who do not thoroughly understand influence diagrams may abuse them and be misled, there is an even greater promise that the influence diagram will be an important bridge between analyst and decision-maker.
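As a small illustration of the computer side of that bridge, here is a sketch of how an influence diagram might be held in code: nodes typed as decision, chance, or value, directed arcs recording influence, and a check that the diagram contains no directed cycles. The node names are illustrative, not taken from the paper.

```python
from collections import defaultdict

# A toy influence diagram for a hypothetical product-launch decision.
nodes = {
    "demand":          "chance",
    "market_survey":   "chance",
    "launch_decision": "decision",
    "profit":          "value",
}

arcs = [                                   # (influencer, influenced)
    ("demand", "market_survey"),           # demand affects what the survey reports
    ("market_survey", "launch_decision"),  # survey result is known at decision time
    ("launch_decision", "profit"),
    ("demand", "profit"),
]

def is_acyclic(nodes, arcs) -> bool:
    """Influence diagrams must contain no directed cycles; verify by depth-first search."""
    successors = defaultdict(list)
    for a, b in arcs:
        successors[a].append(b)
    visiting, done = set(), set()

    def visit(n):
        if n in done:
            return True
        if n in visiting:          # back edge found: a directed cycle
            return False
        visiting.add(n)
        ok = all(visit(m) for m in successors[n])
        visiting.discard(n)
        done.add(n)
        return ok

    return all(visit(n) for n in nodes)

print("Legal (acyclic) influence diagram:", is_acyclic(nodes, arcs))
```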
Valuing Extreme Outcomes
One of the problems perplexing early users of decision analysis was how to treat outcomes so extreme that they seemed to be beyond analysis. For example, the question of how a person's death as the result of medical treatment can be balanced with other medical outcomes, like paralysis or even purely economic outcomes, was especially demanding. These problems appear to raise both ethical dilemmas and technical difficulties. One ethical dilemma centered on who had the right to value lives. A technical difficulty was revealed when an economist testifying in court on the value of a life was asked whether he would be willing to allow himself to be killed if he were given that amount of money. Nevertheless, once the ethical issue is clarified by acknowledging that a person may properly place a value on his own life, then the technical question of how to do it can be addressed quite satisfactorily, especially in the case of exposure to the many small risks present in modern life [4,5]. The results have major implications for many decisions affecting health and safety.
The development of ways to think about the unthinkable has shown that no decision problem lies beyond the realm of decision analysis. That is very satisfying, for were you faced with medical decisions about a loved one, would you want to use second-rate logic any more than a second-rate doctor?
Conclusion
When decision analysis was first developed, a common comment was, "If this is such a great idea, why doesn't [insert name of large, famous company] use it?" Today, it is difficult to find a major corporation that has not employed decision analysis in some form. There are some factors that should lead to even greater use. For example, decision analysis procedures are now more efficiently executable because the increased power of modern computers has reduced the costs of even very complex analyses to an affordable level. The problems that can be successfully attacked now run the gamut of all important decision problems. Increasing uncertainties and rapid change require fresh solutions rather than tested "rules of thumb." Some day, decision analysis of important decisions will perhaps become recognized as so necessary for conducting a provident life that it will be taught in grade school rather than in graduate school.
References:
1. Ronald A. Howard, "Decision Analysis: Applied Decision Theory," Proceedings of the Fourth International Conference on Operational Research, Wiley-Interscience, New York, 1966, pp. 55-71.
2. Ronald A. Howard, "The Foundations of Decision Analysis," IEEE Transactions on Systems Science and Cybernetics, SSC-4, No. 3 (September 1968): 211-219.
3. Ronald A. Howard and James E. Matheson, "Influence Diagrams," Department of Engineering-Economic Systems, Stanford University, July 1979.
4. Ronald A. Howard, "On Making Life and Death Decisions," in Societal Risk Assessment: How Safe Is Safe Enough?, edited by Richard C. Schwing and Walter A. Albers, Jr., General Motors Research Laboratories, Plenum Press, New York, 1980.
5. Ronald A. Howard, James E. Matheson, and Daniel L. Owen, "The Value of Life and Nuclear Design," Proceedings of the American Nuclear Society Topical Meeting on Probabilistic Analysis of Nuclear Reactor Safety, American Nuclear Society, May 8-10, 1978.
6. Daniel Kahneman and Amos Tversky, "Prospect Theory: An Analysis of Decision under Risk," Econometrica, 47, No. 2 (March 1979): 263-291.
7. D. Warner North, "A Tutorial Introduction to Decision Theory," IEEE Transactions on Systems Science and Cybernetics, SSC-4, No. 3 (September 1968): 200-210.
8. Howard Raiffa, Decision Analysis: Introductory Lectures on Choices under Uncertainty, Addison-Wesley, 1968.
9. Myron Tribus, Rational Descriptions, Decisions, and Designs, Pergamon Press, 1969.
10. Amos Tversky and Daniel Kahneman, "Judgment under Uncertainty: Heuristics and Biases," Science, 185 (September 27, 1974): 1124-1131.
11. Amos Tversky and Daniel Kahneman, "The Framing of Decisions and the Psychology of Choice," Science, 211 (January 30, 1981): 453-458.
Republished with permission of The Stanford Decisions and Ethics Center