
Slide 1
Department of Engineering and Public Policy
Carnegie Mellon University

M. Granger Morgan
Head, Department of Engineering and Public Policy
Carnegie Mellon University
tel: 412-268-2672
e-mail: [email protected]

Regulation under Uncertainty
OR
Uncertainty Analysis and the Use of Expert Judgment

RFF Workshop on Expert Judgment: Promises and Pitfalls
2006 March 13

Slide 2

This morning I will talk about:
• Sources of uncertainty and the characterization of uncertainty.
• Two basic types of uncertainty.
• Uncertainty about coefficient values.
• Uncertainty about model functional form.
• Performing uncertainty analysis and making decisions in the face of uncertainty.
• Some summary guidance on reporting, characterizing and analyzing uncertainty.

Slide 3

Probability

Probability is the basic language of uncertainty.

I will adopt a personalistic view of probability (sometimes also called a subjectivist or Bayesian view).

In this view, probability is a statement of the degree of belief that a person has that a specified event will occur, given all the relevant information currently known by that person.

P(X|i), where:
  X is the uncertain event
  i is the person's state of information.

Slide 4

The clairvoyant test

Even if we take a personalist view of probability, the event or quantity of interest must be well specified for a probability, or a probability distribution, to be meaningful.

"The retail price of gasoline in 2008" does not pass this test. A clairvoyant would need to know things such as:
• Where will the gas be purchased?
• At what time of year?
• What octane?

Slide 5

Does a subjectivist view mean your probability can be arbitrary?

NO, because if they are legitimate probabilities, they must
• conform with the axioms of probability
• be consistent with available empirical data.

Lots of people ask, why deal with probability? Why not just use subjective words such as "likely" and "unlikely" to describe uncertainties? There are very good reasons not to do this.

Slide 6

The risks of using qualitative uncertainty language

Qualitative uncertainty language is inadequate because:
- the same words can mean very different things to different people.
- the same words can mean very different things to the same person in different contexts.
- important differences in experts' judgments about mechanisms (functional relationships), and about how well key coefficients are known, can be easily masked in qualitative discussions.

Slide 7

Mapping words to probabilities

[Figure: for each qualitative description of uncertainty used (almost certain, probable, likely, good chance, possible, tossup, unlikely, improbable, doubtful, almost impossible), bars show the range of individual upper-bound estimates, the range of individual lower-bound estimates, and the range from upper to lower median estimate, on a probability scale from 0.0 to 1.0. Figure adapted from Wallsten et al., 1986.]

This figure shows the range of probabilities that people assign to words, absent any specific context.

Slide 8

[Figure: members of the Executive Committee of the EPA Science Advisory Board, other SAB members, and other meeting participants each assigned a probability that the material is a human carcinogen corresponding to the words "likely," "something between likely and not likely," and "not likely," plotted on a scale from 0.00001 to 1.0.]

The minimum probability associated with the word "likely" spanned four orders of magnitude. The maximum probability associated with the word "not likely" spanned more than five orders of magnitude. There was an overlap between the probabilities associated with the word "likely" and those associated with the word "not likely"!

Figure from Morgan, HERA, 1998.

Slide 9

The bottom line

Without at least some quantification, qualitative descriptions of uncertainty convey little, if any, useful information.

The climate assessment community is gradually learning this lesson. Steve Schneider and Richard Moss have worked hard to promote a better treatment of uncertainty in the work of the IPCC.

At my insistence, the U.S. national assessment synthesis team gave quantitative definitions to five probability words:

10Department of Engineering and Public Policy

Carnegie Mellon University

BUT, in other fields…

…such as biomedical and health effects, progress has been much slower. A concrete example of this is provided by the recommendations of the Presidential/Congressional Commission on Risk Assessment and Risk Management (1997), which recommended "against routine use of formal quantitative analysis of uncertainty in risk estimation, particularly that related to evaluating toxicology." While analysts were encouraged to provide "qualitative descriptions of risk-related uncertainty," the Commission concluded that "quantitative uncertainty analyses of risk estimates are seldom necessary and are not useful on a routine basis to support decision-making."

Slowly such views are giving way, but progress is slow.

Slide 11

This afternoon I will talk about:
• Sources of uncertainty and the characterization of uncertainty.
• Two basic types of uncertainty.
• Uncertainty about coefficient values.
• Uncertainty about model functional form.
• Designing and performing expert elicitation.
• Performing uncertainty analysis.
• Some summary guidance on reporting, characterizing and analyzing uncertainty.

Slide 12

We must consider two quite different kinds of uncertainty

1. Situations in which we know the relevant variables and the functional relationships among them, but we do not know the values of key coefficients (e.g., the "climate sensitivity").

2. Situations in which we are not sure what all the relevant variables are, or the functional relationships among them (e.g., will rising energy prices induce more technical innovation?).

Both are challenging, but the first is much more easily addressed than the second. I'll talk more about the second when I talk about uncertainty analysis.

Slide 13

Uncertainty about quantities

From Morgan and Henrion, Uncertainty, Cambridge, 1990/98.

Slide 14

PDFs and CDFs

A number of examples I am about to show are in the form of probability density functions (PDFs) or cumulative distribution functions (CDFs).

Since some of you may not make regular use of PDFs and CDFs, let me take just a moment to remind you...

Slide 15

Probability density function or PDF

[Figure: probability density plotted against V, the value of the uncertain quantity; the area under the curve over a small interval of V gives the probability that the quantity falls in that interval.]

Slide 16

Cumulative distribution function or CDF

[Figure: the same distribution shown both as a PDF and as a CDF, plotting cumulative probability from 0 to 1.0 against V, the value of the uncertain quantity; the value at which the CDF reaches cumulative probability p corresponds to the shaded area p under the PDF. The locations of the mode, median, and mean are marked.]

NOTE: In asymmetric distributions with long tails, the mean may be much, much larger than the median.
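That note can be made concrete with a short sketch (standard-library Python, with invented parameters): sampling a long-tailed lognormal distribution shows the mean sitting well above the median.

```python
import random
import statistics

# Hypothetical illustration (not from the talk): in a long-tailed
# lognormal distribution, the mean is pulled far above the median.
random.seed(1)
sample = [random.lognormvariate(0.0, 1.5) for _ in range(100_000)]

mean = statistics.fmean(sample)
median = statistics.median(sample)

# For lognormal(mu=0, sigma=1.5): median = exp(0) = 1, mean = exp(sigma**2 / 2) ≈ 3.08
print(f"mean ≈ {mean:.2f}, median ≈ {median:.2f}")
```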

Slide 17

If I have good data...

...in the form of many observations of a random process, then I can construct a probability distribution that describes that process. For example, suppose I have the 145 years of rainfall data for San Diego, and I am prepared to assume that over that period San Diego's climate has been "stationary" (that is, the basic underlying processes that create the year-to-year variability have not changed)…

Source: Inman et al., Scripps, 1998.

Slide 18

Then if I want…

…a PDF for future San Diego annual rainfall, the simplest approach would be to construct a histogram from the data, as illustrated. If I want to make a prediction for some specific future year, I might go on to look for time patterns in the data. Even better, I might try to relate those time patterns to known slow patterns of variation in the regional climate, and modify my PDF accordingly.

[Figure: histogram of the annual rainfall record, binned over roughly 0-60 cm.]
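The histogram step described here can be sketched as follows; the rainfall figures are synthetic stand-ins (a gamma distribution with invented parameters), not the actual Scripps record.

```python
import random

# Synthetic stand-in for 145 years of San Diego annual rainfall, in cm
# (invented parameters, for illustration only).
random.seed(42)
rainfall = [random.gammavariate(4.0, 6.5) for _ in range(145)]

# Bin the observations and normalize so the histogram integrates to 1,
# giving a discrete estimate of the PDF.
bin_width = 10.0
counts = {}
for r in rainfall:
    b = int(r // bin_width) * bin_width
    counts[b] = counts.get(b, 0) + 1

n = len(rainfall)
pdf = {b: c / (n * bin_width) for b, c in sorted(counts.items())}
for b, density in pdf.items():
    print(f"{b:5.0f}-{b + bin_width:<5.0f} cm: density {density:.4f}")
```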

Slide 19

In that way…

…I could construct a PDF and CDF for future San Diego rainfall that would look roughly like this.

[Figure: PDF and CDF of annual rainfall, cm, over 0-60 cm, with cumulative probability from 0 to 1.0.]

However, suppose that what I really care about is the probability that very large rainfall events will occur. Since there have only been two years in the past 145 years when rainfall has been above 60 cm/yr, I'll need to augment my data with some model or physical theory.
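One way to "augment the data with a model," sketched with invented numbers: fit a parametric distribution (a lognormal here, chosen purely for illustration) and read the tail probability from the fit rather than from the two observed exceedances.

```python
import math
import random

# Synthetic stand-in data; a real analysis would use the observed record
# and justify the distributional family on physical grounds.
random.seed(7)
rainfall = [random.lognormvariate(3.0, 0.5) for _ in range(145)]

# Maximum-likelihood fit of a lognormal: mean and std of the log data.
logs = [math.log(r) for r in rainfall]
mu = sum(logs) / len(logs)
sigma = math.sqrt(sum((x - mu) ** 2 for x in logs) / len(logs))

def lognormal_tail(x, mu, sigma):
    """P(X > x) for a lognormal, via the normal tail of log(x)."""
    z = (math.log(x) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))

p60 = lognormal_tail(60.0, mu, sigma)
print(f"model-based P(rainfall > 60 cm) ≈ {p60:.4f}")
```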

Slide 20

In summary…

…one should use available data, and well-established physical and statistical theory, to describe uncertainty whenever either or both are available.

However, often the available data and theory are not exactly relevant to the problem at hand, or they are not sufficiently complete to support the full objective construction of a probability distribution.

In such cases, I may have to rely on expert judgment. This brings us to the problem of how to "elicit" expert judgment.

Slide 21

Expert elicitation takes time and care

Eliciting subjective probabilistic judgments requires careful preparation and execution.

Developing and testing an appropriate interview protocol typically takes several months. Each interview is likely to require several hours.

When addressing complex, scientifically subtle questions of the sorts involved with most problems in climate change, there are no satisfactory shortcuts. Attempts to simplify and speed up the process almost always lead to shoddy results.

Slide 22

Overconfidence is a ubiquitous problem

[Figure: estimates of the speed of light, showing recommended values with reported uncertainty plotted by year of experiment, 1860-1960; successive recommended values repeatedly fall outside earlier reported uncertainty bounds. For details see: Henrion and Fischhoff, "Assessing Uncertainty in Physical Constants," American Journal of Physics, 54, pp. 791-798, 1986.]

[Figure: 21 different studies on questions with known answers, showing the percentage of estimates in which the true value lay outside of the respondent's assessed 98% confidence interval, on a scale from 0% to 60% (well-calibrated judgments would show about 2%). For details see Morgan and Henrion, Uncertainty, Cambridge Univ. Press, 1990, p. 117.]

Slide 23

Cognitive heuristics

When ordinary people or experts make judgments about uncertain events, such as numbers of deaths from chance events, they use simple mental rules of thumb called "cognitive heuristics." In many day-to-day circumstances, these serve us very well, but in some instances they can lead to bias - such as overconfidence - in the judgments we make. This can be a problem for experts too.

The three slides that follow illustrate three key heuristics: "availability," "anchoring and adjustment," and "representativeness."

Slide 24

Cognitive bias

from Lichtenstein et al., 1978.

Availability: probability judgment is driven by the ease with which people can think of previous occurrences of the event, or can imagine such occurrences.

Slide 25

Cognitive bias…(Cont.)

Anchoring and adjustment: probability judgment is frequently driven by the starting point, which becomes an "anchor."

from Lichtenstein et al., 1978.

Slide 26

Cognitive bias…(Cont.)

I flip a fair coin 8 times. Which of the following two outcomes is more likely?

Outcome 1: T, T, T, T, H, H, H, H
Outcome 2: T, H, T, H, H, T, H, T

Of course, the two specific sequences are equally likely... but the second seems more likely because it looks more representative of the underlying random process.

Representativeness: people judge the likelihood that an object belongs to a particular class in terms of how much it resembles that class.
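The coin-flip point can be checked mechanically: any exact sequence of 8 fair flips has probability (1/2)^8, however "random" it looks.

```python
from fractions import Fraction

def sequence_probability(seq):
    """Probability of observing one exact flip sequence from a fair coin."""
    prob = Fraction(1)
    for _ in seq:
        prob *= Fraction(1, 2)  # each flip is independent with P(H) = P(T) = 1/2
    return prob

outcome_1 = "TTTTHHHH"
outcome_2 = "THTHHTHT"
print(sequence_probability(outcome_1))  # 1/256
print(sequence_probability(outcome_2))  # 1/256
```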

Slide 27

Expert elicitation…(Cont.)

Over the past three decades, my colleagues and I have developed and performed a number of substantively detailed expert elicitations. These have been designed to obtain experts' considered judgments. Examples include work on:

Health effects of air pollution from coal-fired power plants:

• M. Granger Morgan, Samuel C. Morris, Alan K. Meier and Debra L. Shenk, "A Probabilistic Methodology for Estimating Air Pollution Health Effects from Coal-Fired Power Plants," Energy Systems and Policy, 2, 287-310, 1978.
• M. Granger Morgan, Samuel C. Morris, William R. Rish and Alan K. Meier, "Sulfur Control in Coal-Fired Power Plants: A Probabilistic Approach to Policy Analysis," Journal of the Air Pollution Control Association, 28, 993-997, 1978.
• M. Granger Morgan, Samuel C. Morris, Max Henrion, Deborah A.L. Amaral and William R. Rish, "Technical Uncertainty in Quantitative Policy Analysis: A Sulfur Air Pollution Example," Risk Analysis, 4, 201-216, September 1984.
• M. Granger Morgan, Samuel C. Morris, Max Henrion and Deborah A.L. Amaral, "Uncertainty in Environmental Risk Assessment: A case study involving sulfur transport and health effects," Environmental Science and Technology, 19, 662-667, August 1985.

Slide 28

Expert elicitation…(Cont.)

Climate science, climate impacts and mitigation technology:

• M. Granger Morgan and David Keith, "Subjective Judgments by Climate Experts," Environmental Science & Technology, 29(10), 468-476, October 1995.
• Elizabeth A. Casman, M. Granger Morgan and Hadi Dowlatabadi, "Mixed Levels of Uncertainty in Complex Policy Models," Risk Analysis, 19(1), 33-42, 1999.
• M. Granger Morgan, Louis F. Pitelka and Elena Shevliakova, "Elicitation of Expert Judgments of Climate Change Impacts on Forest Ecosystems," Climatic Change, 49, 279-307, 2001.
• Anand B. Rao, Edward S. Rubin and M. Granger Morgan, "Evaluation of Potential Cost Reductions from Improved CO2 Capture Systems," Proceedings of the 2nd National Conference on Carbon Sequestration, Alexandria, VA, May 5-8, 2003.
• M. Granger Morgan, Peter J. Adams and David W. Keith, "Elicitation of Expert Judgments of Aerosol Forcing," Climatic Change, in press.
• Kirsten Zickfeld, Anders Levermann, M. Granger Morgan, Till Kuhlbrodt, Stefan Rahmstorf, and David W. Keith, "Present State and Future Fate of the Atlantic Meridional Overturning Circulation as Viewed by Experts," in review at Climatic Change.

Bounding uncertain health risks:

• M. Granger Morgan, "The Neglected Art of Bounding Analysis," Environmental Science & Technology, 35, pp. 162A-164A, April 1, 2001.
• Minh Ha-Duong, Elizabeth A. Casman, and M. Granger Morgan, "Bounding Poorly Characterized Risks: A lung cancer example," Risk Analysis, 24(5), 1071-1084, 2004.
• Elizabeth Casman and M. Granger Morgan, "Use of Expert Judgment to Bound Lung Cancer Risks," Environmental Science & Technology, 39, 5911-5920, 2005.

Slide 29

Expert elicitation…(Cont.)

In all our elicitation studies, we've focused on creating a process that allows the experts to provide their carefully considered judgment, supported by all the resources they may care to use. Thus, we have:

- Prepared a background review of the relevant literatures.
- Carefully iterated the questions with selected experts and run pilot studies with younger (post-doc) experts to distill and refine the questions.
- Conducted interviews in experts' offices with full resources at hand.
- Provided ample opportunity for subsequent review and revision of the judgments provided.

All of these efforts have involved the development of new question formats that fit the issues at hand.

Slide 30

SOX health effects

…elicited subjective judgments about oxidation rates and wet and dry deposition rates for SO2 and SO4= from nine atmospheric science experts.

Source: G. Morgan, S. Morris, M. Henrion, D. Amaral, W. Rish, "Technical Uncertainty in Quantitative Policy Analysis: A sulfur air pollution example," Risk Analysis, 4, 201-216, 1984.

Slide 31

SOX health effects…(Cont.)

Then for each expert we built a separate plume model, exposing people on the ground using known population densities.

Source: G. Morgan, S. Morris, M. Henrion, D. Amaral, W. Rish, "Technical Uncertainty in Quantitative Policy Analysis: A sulfur air pollution example," Risk Analysis, 4, 201-216, 1984.

Slide 32

SOX health effects…(Cont.)

We elicited health damage functions for sulfate aerosol from seven health experts (of whom only five were able to give us probabilistic estimates).

Finally, for each air expert, we did an analysis using the health damage function of each health expert…

Source: D. Amaral, "Estimating Uncertainty in Policy Analysis: Health effects from inhaled sulfur oxides," Ph.D. thesis, Department of Engineering and Public Policy, Carnegie Mellon, 1983.

Slide 33

SOX health effects…(Cont.)

The results showed that while there was great discussion about uncertainty in the atmospheric science, the uncertainty about the health damage functions completely dominated the estimate of health impacts.

[Figure: CDF of deaths/yr from a new (in 1984) super-critical 1 GWe FGD-equipped coal-fired plant in Pittsburgh. Availability factor = 73%, net efficiency = 35%.]

Source: G. Morgan, S. Morris, M. Henrion, D. Amaral, W. Rish, "Technical Uncertainty in Quantitative Policy Analysis: A sulfur air pollution example," Risk Analysis, 4, 201-216, 1984.

Slide 34

Multiple experts

When different experts hold different views it is often best not to combine the results, but rather to explore the implications of each expert's views, so that decision makers have a clear understanding of whether and how much the differences matter in the context of the overall decision.

However, sophisticated methods have been developed that allow experts to work together to combine judgments so as to yield a single overall composite judgment.

The community of seismologists has made the greatest progress in this direction through a series of very detailed studies of seismic risks to built structures (Hanks, T.C., 1997; Budnitz, R.J. et al., 1995).
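As a minimal illustration only (the seismic-risk procedures cited above are far more elaborate), one common combination scheme is the equal-weight linear opinion pool, which simply averages the experts' CDFs; both "expert" distributions below are hypothetical.

```python
def expert_a(x):
    """Hypothetical expert A: uniform belief over [0, 10]."""
    return min(max(x / 10.0, 0.0), 1.0)

def expert_b(x):
    """Hypothetical expert B: uniform belief over [5, 15]."""
    return min(max((x - 5.0) / 10.0, 0.0), 1.0)

def pool_cdf(expert_cdfs, x, weights=None):
    """Linear opinion pool: a weighted average of the expert CDFs at x."""
    if weights is None:
        weights = [1.0 / len(expert_cdfs)] * len(expert_cdfs)
    return sum(w * cdf(x) for w, cdf in zip(weights, expert_cdfs))

combined = pool_cdf([expert_a, expert_b], 7.5)
print(combined)  # 0.5, the average of 0.75 and 0.25
```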

Slide 35

Uncertainty versus variability

Variability involves random change over time or space (e.g., "the mid-day temperature in Beijing in May is variable").

Recently, in the U.S., some people have been drawing a sharp distinction between variability and uncertainty. While the two are different, and sometimes require different treatments, the distinction can be overdrawn. In many contexts, variability is simply one of several sources of uncertainty (Morgan and Henrion, 1990).

One motivation people have for trying to sharpen the distinction is that variability can often be measured objectively, while other forms of uncertainty require subjective judgment.

Slide 36

This morning I will talk about:
• Sources of uncertainty and the characterization of uncertainty.
• Two basic types of uncertainty.
• Uncertainty about coefficient values.
• Uncertainty about model functional form.
• Performing uncertainty analysis and making decisions in the face of uncertainty.
• Some summary guidance on reporting, characterizing and analyzing uncertainty.

Slide 37

Uncertainty about model form

Often uncertainty about model form is as or more important than uncertainty about values of coefficients. Until recently there had been little practical progress in dealing with such uncertainty, but now there are several good examples:

• John Evans and his colleagues at the Harvard School of Public Health (e.g., Evans et al., 1994).
• Alan Cornell and others in seismic risk (e.g., Budnitz et al., 1995).
• Hadi Dowlatabadi and colleagues at Carnegie Mellon in Integrated Assessment of Climate Change - ICAM (e.g., Morgan and Dowlatabadi, 1996).
• Also on climate, Lempert and colleagues at RAND (e.g., Lempert, Popper, Bankes, 2003).

Slide 38

John Evans and colleagues…

…have developed a method which lays out a "probability tree" to describe all the plausible ways in which a chemical agent might cause harm. Then experts are asked to assess probabilities on each branch.

For details see: John S. Evans et al., "A distributional approach to characterizing low-dose cancer risk," Risk Analysis, 14, 25-34, 1994; and John S. Evans et al., "Use of probabilistic expert judgment in uncertainty analysis of carcinogenic potency," Regulatory Toxicology and Pharmacology, 20, 15-36, 1994.

Slide 39

ICAM: Integrated Climate Assessment Model

A very large, hierarchically organized stochastic simulation model built in Analytica®.

[Figure: screenshot of the ICAM influence diagram, with Inputs, Structure and Outputs modules linking Demographics & Economics, Energy & Emissions, Atmospheric Composition & Climate, Impacts of Climate Change, and Intervention. A climate sub-module connects GHG models and an aerosol model to changes in long-wave and short-wave forcing, an elicited climate model, and regional ΔT.]

See for example:
Hadi Dowlatabadi and M. Granger Morgan, "A Model Framework for Integrated Studies of the Climate Problem," Energy Policy, 21(3), 209-221, March 1993.
and
M. Granger Morgan and Hadi Dowlatabadi, "Learning from Integrated Assessment of Climate Change," Climatic Change, 34, 337-368, 1996.

Slide 40

ICAM deals with...

…both of the types of uncertainty I've talked about:

1. It deals with uncertain coefficients by assigning PDFs to them and then performing stochastic simulation to propagate the uncertainty through the model.

2. It deals with uncertainty about model functional form (e.g., will rising energy prices induce more technical innovation?) by introducing multiple alternative models which can be chosen by throwing "switches."
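A toy sketch of those two mechanisms (every model form and number below is invented for illustration; none comes from ICAM): PDFs on coefficients are sampled by stochastic simulation, while model-form uncertainty is a discrete switch among alternative functions.

```python
import random

def innovation_none(price):
    """Model form 1: energy prices induce no innovation."""
    return 1.0

def innovation_induced(price):
    """Model form 2: higher prices spur efficiency gains (invented form)."""
    return 1.0 / (1.0 + 0.1 * price)

SWITCH = {"none": innovation_none, "induced": innovation_induced}

def mean_emissions(switch, n=10_000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        price = rng.uniform(20.0, 80.0)        # uncertain coefficient: energy price
        base = rng.normalvariate(100.0, 10.0)  # uncertain coefficient: baseline emissions
        total += base * SWITCH[switch](price)  # model form chosen by the switch
    return total / n

for form in SWITCH:
    print(form, round(mean_emissions(form), 1))
```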

Slide 41

ICAM

There is not enough time to present any details from our work with the ICAM integrated assessment model. Here are a few conclusions from that work:

• Different sets of plausible model assumptions give dramatically different results.
• No policy we have looked at is dominant over the wide range of plausible futures we've examined.
• The regional differences in outcomes are so vast that few if any policies would pass muster globally for similar decision rules.
• Different metrics of aggregate outcomes (e.g., dollars versus hours of labor) skew the results to reflect OECD or developing-region issues respectively.

Slide 42

These findings lead us...

...to switch from trying to project and examine the future, to using the modeling framework as a test-bed to evaluate the relative robustness, across a wide range of plausible model futures, of alternative strategies that regional actors in the model might adopt.

We populated the model's regions with simple decision agents and asked: which behavioral strategies are robust in the face of uncertain futures, and which get us into trouble?

Thus, for example, it turns out that tracking and responding to atmospheric concentration is more likely to lead regional policy makers in the model to stable strategies than tracking and responding to emissions.

Slide 43

Our conclusion

Prediction and policy optimization are pretty silly analytical objectives for much assessment and analysis related to the climate problem. It makes much more sense to:

• Acknowledge that describing and bounding a range of futures may often be the best we can do.
• Recognize that climate is not the only thing that is changing, and address the problem in that context.
• Focus on developing adaptive strategies and evaluating their likely robustness in the face of a range of possible climate, social, economic and ecological futures.

Recent work by Robert Lempert and colleagues takes a very similar approach (e.g., Lempert, Popper, Bankes, 2003).

Slide 44

The next several slides…

…which I will not show because time is short, talk about two other important topics:

• A caution about the use of scenario analysis.
• Some methods for dealing with extreme uncertainty.

I'll be happy to talk off-line or during a discussion session with anyone who is interested.

Jump to slide 50

Slide 45

Scenarios…

…can be a useful device to help think about the future, but they can also be dangerous. Remember the discussion of cognitive heuristics.

Here is a scenario from an experiment run by Slovic, Fischhoff, and Lichtenstein:

Tom is of high intelligence, although lacking in true creativity. He has a need for order and clarity, and for neat and tidy systems in which every detail finds its appropriate place. His writing is rather dull and mechanical, occasionally enlivened by somewhat corny puns and by flashes of imagination of the sci-fi type. He has a strong drive for competence. He seems to have little feel and little sympathy for other people and does not enjoy interacting with others.

Slide 46

In light of these data…

…what is the probability that:

• Tom W. will select journalism as his college major? P = 0.21
• Tom W. will select journalism as his college major but become unhappy with his choice? P = 0.39
• Tom W. will select journalism as his college major but become unhappy with his choice and switch to engineering? P = 0.41

From: P. Slovic, B. Fischhoff, and S. Lichtenstein, "Cognitive Processes and Societal Risk Taking," Cognition and Social Behavior, 1976.

Slide 47

Scenarios…

…that describe just some single point in the space of outcomes are of limited use and logically cannot be assigned a probability.

If scenarios are to be used, it's better to span the space of interest and then assign a probability to each:

S₁ + S₂ + … + Sₙ = 1, where Sᵢ is the probability assigned to scenario i.

Slide 48

Cut the long causal chains

Typically, it is also better not to use detailed scenarios but rather to use simpler parametric methods.

Thus, for example, if future oil prices appear to be critical to a specific class of decisions, rather than develop long detailed stories about how those prices might be shaped by future developments in the U.S. and Canadian Arctic, the Middle East, and the Former Soviet Union, it is better to reflect on all the possibilities and then truncate the causal chain by positing a range of possible future oil prices and work from there.

In multiplicative models, uniform PDFs are often quite adequate to get good first-order estimates.
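The truncated-chain idea can be sketched as follows (all ranges hypothetical): posit a uniform PDF over future oil prices and push it through a simple multiplicative model by Monte Carlo.

```python
import random

def simulate_cost(n=50_000, seed=3):
    """Propagate posited uniform ranges through a multiplicative model."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        oil_price = rng.uniform(40.0, 120.0)  # $/bbl, posited range
        demand = rng.uniform(0.8, 1.2)        # relative demand factor
        samples.append(oil_price * demand)    # multiplicative model
    samples.sort()
    return samples[n // 2], samples[int(0.95 * n)]

median, p95 = simulate_cost()
print(f"median ≈ {median:.0f}, 95th percentile ≈ {p95:.0f}")
```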

Slide 49

Limited domain of model validity

[Figure: three panels (Experts 1, 8, and 15), each plotting mean change in global temperature, °C (0-6, dark curve) and probability of a change in the climate state (0.00-0.30, light curve) against year, 1975-2100.]

Examples of warming estimated via the ICAM model (dark curves) and the probability that the associated climate forcing will induce a state change in the climate system (light curves), using the probabilistic judgments of three different climate experts.

Slide 50

Model switching

[Figure: model weight (0 to 1) declining over time through successive regimes: a period during which one thinks output from the detailed model is meaningful; a period during which one thinks output from a simple order-of-magnitude model is meaningful; a period during which only bounding analysis is meaningful; and, finally, complete ignorance.]

Schematic illustration of the strategy of switching to progressively simpler models as one moves into less well understood regions of the problem phase space, in this case, over time.

Slide 51

Illustration of model switching

[Figure: population projection (vertical axis 0-80) versus year, 2000-2300, with the weight for the ICAM model declining from 1.0 toward 0.0 and the weight for the Cohen model rising correspondingly.]

Results of applying the model switch-over strategy to the ICAM demographic model (until about 2050) and an upper-bound estimate of global population carrying capacity based on J. S. Cohen.
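The switch-over strategy amounts to a time-varying weighted blend of two models; the curves and switch dates below are hypothetical stand-ins, not the actual ICAM or Cohen results.

```python
def blend(year, detailed, simple, switch_start=2050, switch_end=2100):
    """Weight shifts linearly from the detailed model to the simple one."""
    if year <= switch_start:
        w = 1.0
    elif year >= switch_end:
        w = 0.0
    else:
        w = (switch_end - year) / (switch_end - switch_start)
    return w * detailed(year) + (1.0 - w) * simple(year)

def detailed(year):
    """Hypothetical detailed-model projection (invented)."""
    return 6.0 + 0.05 * (year - 2000)

def simple(year):
    """Hypothetical upper-bound (Cohen-style) estimate (invented)."""
    return 10.0

print(blend(2000, detailed, simple))  # pure detailed model: 6.0
print(blend(2075, detailed, simple))  # blended value
print(blend(2200, detailed, simple))  # pure bounding model: 10.0
```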

Slide 52

This morning I will talk about:
• Sources of uncertainty and the characterization of uncertainty.
• Two basic types of uncertainty.
• Uncertainty about coefficient values.
• Uncertainty about model functional form.
• Performing uncertainty analysis and making decisions in the face of uncertainty.
• Some summary guidance on reporting, characterizing and analyzing uncertainty.

Slide 53

Propagation and analysis of uncertainty

Consider a basic model of the form y = f(x₁, x₂).

The simplest form of uncertainty analysis is sensitivity analysis using the slopes of y with respect to the inputs x₁ and x₂.

Source: Morgan and Henrion, 1990
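Slope-based sensitivity can be sketched numerically; the model f below is hypothetical, chosen only so the slopes are easy to check.

```python
def f(x1, x2):
    """Hypothetical model y = f(x1, x2)."""
    return x1 ** 2 + 3.0 * x2

def partial(f, point, index, h=1e-6):
    """Central-difference estimate of the slope of f along one input."""
    lo, hi = list(point), list(point)
    lo[index] -= h
    hi[index] += h
    return (f(*hi) - f(*lo)) / (2 * h)

nominal = (2.0, 5.0)
s1 = partial(f, nominal, 0)  # analytically, df/dx1 = 2 * x1 = 4
s2 = partial(f, nominal, 1)  # analytically, df/dx2 = 3
print(s1, s2)
```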


Nominal range sensitivity

Source: Morgan and Henrion, 1990
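Nominal range sensitivity can be sketched by sweeping one input at a time across its plausible range while the others stay at their nominal values, and recording the resulting swing in the output. The toy model, nominal values, and ranges below are invented for illustration:

```python
# A minimal sketch of nominal range sensitivity analysis. The model,
# nominal values, and input ranges are invented for illustration.

def f(x1, x2):
    return x1 ** 2 + 3 * x2   # toy model y = f(x1, x2)

nominal = {"x1": 2.0, "x2": 1.0}
ranges = {"x1": (1.0, 3.0), "x2": (0.0, 2.0)}

def nominal_range_sensitivity(model, nominal, ranges):
    """Output swing produced when each input alone sweeps its range."""
    swings = {}
    for name, (lo, hi) in ranges.items():
        args_lo = dict(nominal, **{name: lo})
        args_hi = dict(nominal, **{name: hi})
        swings[name] = model(**args_hi) - model(**args_lo)
    return swings

swings = nominal_range_sensitivity(f, nominal, ranges)
# x1 swing: f(3,1) - f(1,1) = 12 - 4 = 8; x2 swing: f(2,2) - f(2,0) = 10 - 4 = 6
```

Ranking inputs by the size of these swings gives a quick, model-agnostic sense of which uncertainties matter most, before any full probabilistic analysis is attempted.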


Propagation of continuous distributions

Source: Morgan and Henrion, 1990

There are many software tools available, such as Analytica®, @risk®, and Crystal Ball®.
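The Monte Carlo propagation these tools automate can be sketched in a few lines: sample each uncertain input from its distribution, run the model, and summarize the output sample. The toy model and input distributions below are illustrative assumptions:

```python
# A sketch of Monte Carlo propagation of continuous distributions.
# The toy model and the input distributions are illustrative.
import random
import statistics

random.seed(1)

def model(x1, x2):
    return x1 * x2   # toy model y = f(x1, x2)

samples = []
for _ in range(20000):
    x1 = random.gauss(10.0, 1.0)    # x1 ~ Normal(mean 10, sd 1)
    x2 = random.uniform(0.5, 1.5)   # x2 ~ Uniform(0.5, 1.5)
    samples.append(model(x1, x2))

mean_y = statistics.fmean(samples)
cuts = statistics.quantiles(samples, n=20)   # 19 cut points: 5%, 10%, ..., 95%
p05, p95 = cuts[0], cuts[-1]
```

The resulting sample can then be summarized as a mean with a 5th-to-95th percentile interval, or plotted as an empirical PDF or CDF.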


Tools for analysis

Tools for continuous processes:
• Exposure models
• Dose response functions
• etc.

Use of some of these tools used to be very challenging and time consuming. Today such analysis is facilitated by many software tools (e.g., Analytica®, @risk®, Crystal Ball®, etc.).

[Screenshot: the ICAM model in Analytica, with inputs, structure, and outputs; modules include Demographics & Economics, Energy & Emissions, Atmospheric Composition & Climate, Impacts of Climate Change, and INTERVENTION, with nodes such as population growth, productivity growth, economic growth, regional GDP and populations in 1975, discount rate, losses due to climate change, expenditures to limit losses, total mitigation costs and losses, and discounted per capita GDP loss due to climate change]

Tools for discrete events:
• Failure modes and effects analysis
• Fault tree models
• etc.


All that fancy stuff…

…is only useful when one has some basis to predict functional relationships and quantify uncertainties about coefficient values.

When one does not have that luxury, order-of-magnitude arguments and bounding analysis (based on things like conservation laws, etc.) may be the best one can do.

The conventional decision-analytic model assumes that research reduces uncertainty:

[Graph: uncertainty falling as research expenditure ($) grows]


Research and uncertainty… (Cont.)

Sometimes this is what happens, but often, for variables or processes we really care about, the reality is that uncertainty either remains unchanged…

[Graph: uncertainty flat as research expenditure ($) grows]

…or even grows…

[Graph: uncertainty rising as research expenditure ($) grows]

When the latter happens it is often because we find that the underlying model we had assumed is incomplete or wrong.


There are formal methods to support decision making…

Source: Henrion, Lumina Systems

…such as benefit-cost (B-C) analysis, decision analysis, analysis based on multi-attribute utility theory, etc.

Things analysts need to remember:
• Be explicit about decision rules (e.g., public health versus science; degree of precaution; etc.).
• Get the value judgments right and keep them explicit.
• Don't leave out important issues just because they don't easily fit.
• Remember that many methods become opaque very quickly, especially to non-experts.
• When uncertainty is high, it is often best to look for adaptive as opposed to optimal strategies.


A classic strategy for solving problems:

Identify the problem → Gain full understanding of all relevant issues → Do research → Identify policy options → Implement the optimal policy → Solve the problem.

When uncertainty is high (and perhaps irreducible) an iterative adaptive strategy is better, cycling through steps such as:

Identify a problem → Do research → Learn what you can and what you can't know (at least now) → Identify adaptive policies and choose one that currently looks best → Implement the policy and observe how it works → Reassess the policy in light of new understanding → Continue research and refine the problem identification as needed → (repeat).


This morning I will talk about:
• Sources of uncertainty and the characterization of uncertainty.
• Two basic types of uncertainty.
  • Uncertainty about coefficient values.
  • Uncertainty about model functional form.
• Performing uncertainty analysis and making decisions in the face of uncertainty.
• Some summary guidance on reporting, characterizing and analyzing uncertainty.


CCSP Guidance Document

Along with several colleagues I am currently completing a guidance document for the U.S. Climate Change Science Program.

The final slides reproduce our draft summary advice.

We'd welcome feedback and suggestions.


Doing a good job…

…of characterizing and dealing with uncertainty can never be reduced to a simple cookbook. One must always think critically and continually ask questions such as:

• Does what we are doing make sense?
• Are there other important factors which are as or more important than the factors we are considering?
• Are there key correlation structures in the problem which are being ignored?
• Are there normative assumptions and judgments about which we are not being explicit?

That said, the following are a few words of guidance…


Reporting uncertainty

When qualitative uncertainty words (such as likely and unlikely) are used, it is important to clarify the range of subjective probability values that are to be associated with those words. Unless there is some compelling reason to do otherwise, we recommend the framework below (probability that a statement is true):

• virtually certain: > 0.99
• very likely: ~0.8 to < 0.99
• likely, greater than an even chance: > 0.5 to ~0.8
• about an even chance: ~0.4 to ~0.6
• unlikely, less than an even chance: ~0.2 to < 0.5
• very unlikely: > 0.01 to ~0.2
• virtually impossible: < 0.01
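As a sketch, such a word-to-probability mapping can be encoded so that qualitative words used in a report can be checked against explicit subjective-probability ranges. Treating the range endpoints as inclusive is an assumption made here for simplicity:

```python
# A sketch encoding a recommended mapping from qualitative uncertainty
# words to subjective probability ranges. Inclusive endpoints are an
# assumption made for simplicity.

SCALE = {
    "virtually certain":    (0.99, 1.00),
    "very likely":          (0.80, 0.99),
    "likely":               (0.50, 0.80),
    "about an even chance": (0.40, 0.60),
    "unlikely":             (0.20, 0.50),
    "very unlikely":        (0.01, 0.20),
    "virtually impossible": (0.00, 0.01),
}

def words_for(p):
    """All qualitative words whose range contains probability p."""
    return [word for word, (lo, hi) in SCALE.items() if lo <= p <= hi]
```

Note that neighboring ranges overlap near their edges, so a probability such as 0.45 is consistent with more than one phrase; the framework constrains usage without pretending to spurious precision.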


Reporting uncertainty… (Cont.)

Another strategy is to display the judgment explicitly as shown:

[Figure: judgments marked along a 0 to 1 scale of the probability that a statement is true]

This approach provides somewhat greater precision and allows some limited indication of secondary uncertainty for those who feel uncomfortable making precise probability judgments.


Reporting uncertainty… (Cont.)

In any document that reports uncertainties in conventional scientific format (e.g., 3.5 ± 0.7), it is important to be explicit about what uncertainty is being included and what is not, and to confirm that the range is plus or minus one standard deviation. This reporting format is generally not appropriate for large uncertainties or where distributions have a lower or upper bound and hence are not symmetric.

Care should be taken in plotting and labeling the vertical axes when reporting PDFs. The units are probability density (i.e., probability per unit interval along the horizontal axis), not probability.

Since many people find it difficult to read and correctly interpret PDFs and CDFs, when space allows it is best practice to plot the CDF together with the PDF on the same x-axis.
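The density-versus-probability point can be illustrated numerically: a PDF value is probability per unit interval and can exceed 1, while the CDF obtained by integrating the density always lies in [0, 1]. The uniform density and the simple trapezoidal integrator below are illustrative:

```python
# A numerical sketch of the density-versus-probability distinction,
# using an illustrative uniform density and trapezoidal integration.

def pdf_uniform(x, a=0.0, b=0.25):
    """Uniform density on [a, b]; its height is 1/(b - a) = 4 here."""
    return 1.0 / (b - a) if a <= x <= b else 0.0

def cdf_from_pdf(pdf, x, lo=-1.0, steps=10000):
    """Approximate CDF(x) by trapezoidal integration of the density."""
    if x <= lo:
        return 0.0
    h = (x - lo) / steps
    total = 0.0
    for i in range(steps):
        left, right = lo + i * h, lo + (i + 1) * h
        total += 0.5 * (pdf(left) + pdf(right)) * h
    return total

density_peak = pdf_uniform(0.1)                       # 4.0: a density, not a probability
prob_below_quarter = cdf_from_pdf(pdf_uniform, 0.25)  # close to 1.0
```

A vertical axis labeled simply "probability" on such a PDF would be misleading, since the curve sits at 4; the integrated CDF is what carries probabilities.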


Reporting uncertainty… (Cont.)

When many uncertain results must be reported, box plots (first popularized by Tukey, 1977) are often the best way to do this in a compact manner. There are several conventions. Our recommendation is shown below, but what is most important is to be clear about the notation.

[Figure: horizontal box plot over X, the value of the quantity of interest; the whisker ends mark the minimum and maximum possible values, tick marks give the 0.05, 0.25, 0.75, and 0.95 cumulative probability values moving from left to right, and the median and mean values are marked within the box]

Tukey, John W., Exploratory Data Analysis, Addison-Wesley, 688 pp., 1977.
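The values in such a box-plot notation can be computed from a sample as a sketch; the nearest-rank quantile rule and the sample itself are illustrative choices:

```python
# A sketch computing the summary values used in a box-plot notation
# (minimum, 0.05, 0.25, median, 0.75, 0.95, maximum, plus the mean)
# from a sample of an uncertain quantity X. The nearest-rank quantile
# rule and the sample are illustrative choices.
import statistics

x = sorted(range(1, 101))   # illustrative sample: the integers 1..100

def quantile(sorted_x, q):
    """Nearest-rank quantile of an already-sorted sample."""
    idx = round(q * (len(sorted_x) - 1))
    return sorted_x[min(len(sorted_x) - 1, max(0, idx))]

summary = {
    "min": x[0],
    "p05": quantile(x, 0.05),
    "p25": quantile(x, 0.25),
    "median": statistics.median(x),
    "mean": statistics.fmean(x),
    "p75": quantile(x, 0.75),
    "p95": quantile(x, 0.95),
    "max": x[-1],
}
```

Reporting all eight values keeps the display compact while still distinguishing the mean from the median, which matters for skewed distributions.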


Reporting uncertainty…(Cont.)

While there may be circumstances in which it is desirable or necessary to address and deal with second-order uncertainty (e.g., how sure an expert is about the shape of an elicited CDF), one should be very careful to determine that the added level of such complication will aid in, and will not unnecessarily complicate, subsequent use of the results.


Characterizing and analyzing uncertainty

Unless there are compelling reasons to do otherwise, conventional probability is the best tool for characterizing and analyzing uncertainty.


Characterizing and analyzing…(Cont.)

The elicitation of expert judgment, often in the form of subjective probability distributions, can be a useful way to combine the formal knowledge in a field as reflected in the literature with the informal knowledge and physical intuition of experts. Elicitation is not a substitute for doing the needed science, but it can be a very useful tool in support of research planning, private decision making, and the formulation of public policy.

HOWEVER, the design and execution of a good expert elicitation takes time and requires a careful integration of knowledge of the relevant substantive domain with knowledge of behavioral decision science.


Characterizing and analyzing…(Cont.)

When eliciting probability distributions from multiple experts, if they disagree significantly, it is generally better to report the distributions separately than to combine them into an artificial "consensus" distribution.

There are a variety of software tools available to support probabilistic analysis using Monte Carlo and related techniques. As with any powerful analytical tool, their proper use requires careful thought and care.


Characterizing and analyzing… (Cont.)

In performing uncertainty analysis, it is important to think carefully about possible sources of correlation. One simple procedure for getting a sense of how important this may be is to run the analysis with key variables uncorrelated and then run it again with key variables perfectly correlated. Often, in answering questions about aggregate parameter values, experts assume correlation structures between the various components of the aggregate value being elicited. Sometimes it is important to elicit the component uncertainties separately from the aggregate uncertainty in order to reason out why specific correlation structures are being assumed.
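The suggested correlation check can be sketched by propagating the same model twice, once with the two uncertain inputs independent and once perfectly correlated, and comparing the spread of the output. The additive toy model and unit-normal inputs are assumptions for illustration:

```python
# A sketch of the correlation diagnostic: run the propagation with
# inputs independent, then perfectly correlated, and compare spreads.
# The additive model and unit-normal inputs are illustrative.
import random
import statistics

random.seed(2)
N = 20000

def model(x1, x2):
    return x1 + x2   # toy additive model

indep, corr = [], []
for _ in range(N):
    u = random.gauss(0.0, 1.0)
    v = random.gauss(0.0, 1.0)
    indep.append(model(u, v))   # independent inputs
    corr.append(model(u, u))    # perfectly correlated inputs

sd_indep = statistics.stdev(indep)   # near sqrt(2)
sd_corr = statistics.stdev(corr)     # near 2: correlation widens the spread
```

If the two runs produce similar output spreads, the correlation structure can safely be neglected; if they differ substantially, as here, it needs to be elicited or modeled explicitly.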


Characterizing and analyzing…(Cont.)

Methods for describing and dealing with data pedigree (see for example Funtowicz and Ravetz, 1990) have not been developed to the point that they can be effectively incorporated in probabilistic analysis. However, the quality of the data on which judgments are based is clearly important and should be addressed, especially when uncertain information of varying quality and reliability is combined in a single analysis.

While full probabilistic analysis can be useful, in many contexts simple parametric analysis, or back-to-front analysis (that works backwards from an end point of interest), may be as or more effective in identifying key unknowns and critical levels of knowledge needed to make better decisions.

Funtowicz, S.O. and J.R. Ravetz, Uncertainty and Quality in Science for Policy, Kluwer Academic Publishers, 229 pp., 1990.


Characterizing and analyzing… (Cont.)

Scenario analysis can be useful, but also carries risks. Specific detailed scenarios can become cognitively compelling, with the result that people may overlook many other pathways to the same end-points. It is often best to "cut the long causal chains" and focus on the possible range of a few key variables which can most affect outcomes of interest.

Scenarios which describe a single point (or line) in a multi-dimensional space cannot be assigned probabilities. If, as is often the case, it will be useful to assign probabilities to scenarios, they should be defined in terms of intervals in the space of interest, not in terms of point values.


Characterizing and analyzing…(Cont.)

Analysis that yields predictions is very helpful when our knowledge is sufficient to make meaningful predictions. However, the past history of success in such efforts suggests great caution (see for example Chapters 3 and 6 in Smil, 2005). When meaningful prediction is not possible, alternative strategies, such as searching for responses or policies that will be robust across a wide range of possible futures, deserve careful consideration.

Smil, Vaclav, Energy at the Crossroads, MIT Press, 448 pp., 2005.


Characterizing and analyzing…(Cont.)

For some problems there comes a time when uncertainty is so high that conventional modes of probabilistic analysis (including decision analysis) may no longer make sense. While it is not easy to identify this point, investigators should continually ask themselves whether what they are doing makes sense and whether a much simpler approach, such as a bounding or order-of-magnitude analysis, might be superior (see for example Casman et al., 1999).

Casman, Elizabeth A., M. Granger Morgan and Hadi Dowlatabadi, "Mixed Levels of Uncertainty in Complex Policy Models," Risk Analysis, 19(1), 33-42, 1999.


The use of subjective probability

Lynn Johnson, Thur., Nov. 10, 2005.