Transcript of: Possibilistic prediction and risk analysis, Bruce Edmonds, EA Conference: Planning, Prediction, Scenarios, Bonn, May 2015.

Slide 1

Possibilistic prediction and risk analysis

Bruce Edmonds

Centre for Policy Modelling

Manchester Metropolitan University

Slide 2

UK Election: forecast and results

Although Nate Silver got ALL 50 state presidential results right last time, his (colleagues') forecast for the UK 2015 election was as erroneous as the others.

Even in large numbers, people do not behave stochastically and can surprise en masse.

[Chart: prediction and 90% confidence intervals vs. actual results]

Slide 3

Very Complex Systems can be…

(the pessimistic view!)

• Counter-intuitive – outcomes may go against all ‘commonsense’ expectations

• Intricate – the processes are complicated and interact in detailed and complex ways

• Non-linear – insignificant changes in conditions can cause unlimited difference in the outcomes

• Qualitatively changeable – outcomes may be of a different kind, not just of a different amount

• Specific – principles/patterns that hold for one case do not work in an apparently similar case

• Causally unbounded – the arrival of new factors may radically change the ‘rules of the game’

What do we do when it's an open, complex system?

Slide 4

Consequences of such Complexity

• You can't work out what will happen by thinking about it – experience of past situations helps, but you can't know when it will be unhelpful

• Long-term planning of solutions is difficult (though long-term development of capacity & tools that might be part of the solution is useful)

• Forecasting "most likely" outcomes is not possible, even using sophisticated models (even at low precision with wide confidence intervals)

• Indeed, 'most likely' forecasts can be dangerous, because they give a false sense of security

Slide 5

How can you tell whether a system is complex in this sense?

• Very hard!
– Do deviations from your models/expectations look random, or are they systematic but in complicated and unpredictable ways?
– Does the behaviour of the system seem to change qualitatively between different 'modes' of being?
– Can some changes apparently come out of 'nowhere', with no discernible cause?

• Maybe in the end it depends on the criticality of its management – how critical it is that you do not get it wrong – if it is critical, better to assume it's complex

Slide 6

New techniques for complexity

• New techniques (vaguely grouped under labels of “complexity science” or “big data”) are making progress in understanding these kinds of system

• Including: data-mining, visualisation, agent-based simulation, data integration, non-parametric stats

• However, these give different kinds of information about these systems than former analyses of simpler (so-called "linear") systems did

• The models themselves are complex and need further analysis to understand them (previous presentation)

• But habits of analysts are slow to change, in particular a “predict & evaluate” strategy to inform policy will not give reliable results

Slide 7

However this new Science and the Policy World Do Not Mix Well...

Because (among other reasons):

• What they are concerned with is hard to understand!

• Policy makers can either (a) judge whether results are consistent with what they know, or (b) just have to trust them

• Complexity scientists and policy people have very different goals (understanding vs. action)

• They have to work in very different ways, with respect to very different sub-cultures/peers

• The more decision makers use the complexity science, the more they have to abdicate responsibility – decision making is 'technocratised'

Slide 8

Some example problems

• “You have 3 months to give me your best forecast, however preliminary”

• Policy makers are unwilling to outsource any control over the process to researchers

• Important data is not available to researchers

• Policy makers already know what they will do, they are just looking for a justification/story to support this

• Complex models make policy debate difficult

• The outsourcing of blame: “The decision was made based on the best scientific advice”

• The worry of researchers that the caveats underlying their models will be lost/ignored

Slide 9

Strategy 1: "predictive" "black-box" macro-level models

• Model the system by relating macro-level properties of the whole system (using differential equations, rate equations, system dynamics etc.)

• With a view to predicting the effects of different policy options (albeit with large error bounds)

• It is possible to make such models of social phenomena, but they are often built not through understanding but by trial & error – in other words, model adaptation (Nate Silver)

• But this only works if nothing essential changes – these models only give "surprise-free" projections
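The trial-and-error "model adaptation" above can be sketched in miniature. This is a hypothetical illustration, not a model from the talk: the rate equation, the observed series and the fitting loop are all invented.

```python
# Hypothetical sketch of Strategy 1: a macro-level "black-box" model
# (a single logistic rate equation) whose parameter is tuned by trial
# and error against aggregate data, then extrapolated forward.

def simulate(r, x0=0.1, steps=10):
    """Euler-integrate the rate equation dx/dt = r*x*(1-x)."""
    xs, x = [x0], x0
    for _ in range(steps):
        x = x + r * x * (1 - x)
        xs.append(x)
    return xs

observed = [0.10, 0.14, 0.19, 0.25, 0.33]   # made-up aggregate series

# Trial-and-error model adaptation: scan r and keep the best fit.
best_r = min((r / 100 for r in range(1, 100)),
             key=lambda r: sum((s - o) ** 2
                               for s, o in zip(simulate(r, steps=4), observed)))

projection = simulate(best_r, steps=10)   # the "surprise-free" projection
```

Note how the loop never needs to understand *why* the system grows; it only adapts the model to past data, which is exactly why the projection is only surprise-free.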

Slide 10

Strategy 2: partial, complex, micro-level models used as an analogy

• Model some processes/aspects of the system at the micro-level to observe the emergence of outcomes

• Does not easily relate to data, does not predict, and remains quite abstract from the observed

• Provides a new way of understanding the issues

• But the model remains more of an analogy, because the mapping to any observed case is unclear and is remade by each interpreter

• This gives a “story” why, but is difficult to relate to particular policy options/questions

• Tends to give “negative” conclusions w.r.t. policy

Slide 11

Strategy 3: brute-force 'big data' models

• Take a stream of a lot of very detailed data

• Condition a model upon this data (e.g. hidden-state Markov models)

• Use this to predict what will happen in the future

• This can capture detail that other approaches miss

• But the model is almost as incomprehensible as the original data

• One does not understand the basis on which the model predicts

• In particular, one has no idea when it will start to fail
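As a hedged illustration of this strategy (invented, not from the talk), here is a toy hidden-state Markov model with hand-set parameters; in practice the matrices would be fitted from bulk data (e.g. by Baum-Welch), which is part of why the fitted model is hard to interpret.

```python
# Toy hidden-state Markov model: two hidden states, two observable
# symbols. The forward algorithm filters the hidden state from the
# observation stream, then the next observation is predicted.
# All numbers here are illustrative assumptions.

T = [[0.9, 0.1],   # state-transition probabilities
     [0.2, 0.8]]
E = [[0.8, 0.2],   # emission probabilities per state
     [0.3, 0.7]]
start = [0.5, 0.5]

def forward(obs):
    """Filtered distribution over hidden states after seeing obs."""
    belief = [start[s] * E[s][obs[0]] for s in range(2)]
    for o in obs[1:]:
        belief = [sum(belief[p] * T[p][s] for p in range(2)) * E[s][o]
                  for s in range(2)]
    z = sum(belief)
    return [b / z for b in belief]

def predict_next(obs):
    """Probability distribution over the next observed symbol."""
    belief = forward(obs)
    nxt = [sum(belief[p] * T[p][s] for p in range(2)) for s in range(2)]
    return [sum(nxt[s] * E[s][o] for s in range(2)) for o in range(2)]

p = predict_next([0, 0, 1, 0, 0])   # condition on a stream, then predict
```

The prediction is just matrix arithmetic over fitted numbers: nothing in it explains *why* symbol 0 is likely next, which is the "incomprehensible model" problem in miniature.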

Slide 12

Robots in Uncertain Environments

• 60s/70s AI/Robotics approach: the robot kept a model of its world and tried to evaluate alternative actions by predicting their effects

• These were not good at coping with complex and uncertain environments. What worked better was:
– Using the world as a model of itself – frequent sampling
– Fast adaptation in response to immediate conditions
– Different levels of abstraction and/or control
– Using and integrating all available data
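The "use the world as a model of itself" idea can be sketched as a reactive control loop. This is an invented toy: the simulated world, the target and the controller are assumptions for illustration, not from the talk.

```python
# Reactive control sketch: instead of planning over an internal world
# model, the controller re-samples the (here simulated) world every tick
# and adapts immediately to what it finds.
import random

def sense(world):
    """Frequent sampling: read the world itself, not a stored model."""
    return world["temperature"]

def act(world, reading, target=20.0):
    """Fast adaptation: a simple immediate response to current conditions."""
    world["temperature"] += 0.5 if reading < target else -0.5

def run(steps=200, seed=0):
    rng = random.Random(seed)
    world = {"temperature": 5.0}
    for _ in range(steps):
        act(world, sense(world))
        world["temperature"] += rng.uniform(-0.2, 0.2)  # unpredictable drift
    return world["temperature"]
```

Because it never predicts ahead, the controller cannot be wrong-footed by a stale world model; it simply tracks the target despite drift it could not have forecast.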

Slide 13

Strategy 4: find some of the possible trajectories

• Not what is probable, but what might be possible

• Techniques (e.g. agent-based modelling) can reveal possible underlying processes/outcomes that would not have been envisioned otherwise

• But this will never capture all of the possibilities

• A complex "risk analysis" of what might happen

• Understanding these possible emergent processes can be used to design visualisations and indicators of current data, to give an up-to-date understanding of the current situation

• Which, in turn, can be understood and used in political decision making to "drive" policy
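A minimal sketch of the possibilistic stance (invented for illustration): run a toy stochastic model under many seeds and record which qualitatively distinct outcomes are *reachable*, rather than estimating which is most likely.

```python
# Possibilistic exploration: many runs of a toy agent-based imitation
# model, collecting the set of outcome kinds that can occur at all.
import random

def one_run(seed, agents=50, steps=4000):
    """Toy imitation dynamics: each step one agent copies another's state."""
    rng = random.Random(seed)
    state = [rng.random() < 0.5 for _ in range(agents)]
    for _ in range(steps):
        state[rng.randrange(agents)] = state[rng.randrange(agents)]
    frac = sum(state) / agents
    if frac > 0.9:
        return "high-turnout regime"
    if frac < 0.1:
        return "low-turnout regime"
    return "mixed"

# The possibility set: every kind of outcome observed across the seeds.
possible = {one_run(seed) for seed in range(200)}
```

The output is a *set* of qualitatively different trajectories, not a point forecast: exactly the kind of input a risk analysis needs, while making no claim to have found every possibility.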

Slide 14

An illustration of Strategy 4 in action

[Diagram: in the Research World, "Modelling micro-aspects", "Data analysis" and "Expert opinion" feed into "ABM and other analysis", producing "Understanding processual possibilities" and "Views of the data (visualisation, measures…)"; these cross into the Policy World to inform "Policy Decisions", whose "Consequences" affect the "Wider Public"; data flows between the two worlds]

Slide 15

Integration via ABMs to "Views"

[Diagram: micro-level inputs (narrative data, psychology, data-mining, survey data, network data, participatory input) and macro-level inputs (time-series, aggregate statistics, survey summaries) are integrated at the meso-level via ABM etc.; the outputs delivered to the Policy World are archetypal stories of individuals, complex visualisations, key global indicators, and scenarios]

Slide 16

A Summary of this Strategy

Steps:

• integrate the streams of evidence/data

• discover what the possibilities are in terms of social processes (using data mining, simulation modelling etc.) [ABM and other techniques]

• use this to focus on what would indicate the early onset of these processes and their progress

• present tools for these using a variety of means (visualisations, statistics, graphs, interactives etc.) [what is delivered to the policy world]

• but in a targeted and relevant manner

• be willing to let go, so others can develop their use
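The "early onset" step above can be sketched with a deliberately simple streaming indicator. The windowed-variance measure, threshold and data here are assumptions for illustration, not the project's actual indicators.

```python
# Early-warning sketch: watch a live data stream with a rolling
# indicator (windowed variance) cheap enough to monitor continuously,
# and flag when it crosses a threshold suggested by prior modelling.
from collections import deque

def rolling_variance(stream, window=20):
    """Yield (value, variance over the last `window` observations)."""
    buf = deque(maxlen=window)
    for x in stream:
        buf.append(x)
        mean = sum(buf) / len(buf)
        yield x, sum((v - mean) ** 2 for v in buf) / len(buf)

# Usage: a calm period followed by a ramp; flag threshold crossings.
data = [0.0] * 30 + [i * 0.1 for i in range(30)]
alerts = [i for i, (_, var) in enumerate(rolling_variance(data)) if var > 0.2]
```

The indicator fires only once the ramp is under way, which is the point: it signals the onset of a process the earlier modelling said was possible, without trying to forecast it.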

Slide 17

Example: Why people vote

In the SCID project:

• A complex model of voter turnout behaviour

• Data and evidence driven

• But FAR too complicated to ever use in prediction, or even in directly relevant "what if" exploration

• Explored the interaction between different, interacting processes that may cause individuals to vote or not

Slide 18

Suggested possible processes and hypotheses…

• High and low turnout 'regimes' that are self-reinforcing

• That the network structure mattered – in some cases a bistable area where "clumpiness" changed the response (e.g. within well-connected clumps)

• The clumpiness of society can be monitored to identify 'cut off' sub-communities

• The measure to monitor this is what is policy-relevant, not the model it derives from
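A hypothetical stand-in for such a "clumpiness" measure (the real SCID measures are not given here): the share of people outside the largest connected component of a social network, a crude proxy for 'cut off' sub-communities.

```python
# Monitor network data directly: find connected components by BFS and
# report the fraction of nodes cut off from the largest "clump".
from collections import deque

def components(adjacency):
    """Connected components of an undirected graph {node: set(neighbours)}."""
    seen, comps = set(), []
    for start in adjacency:
        if start in seen:
            continue
        comp, queue = set(), deque([start])
        while queue:
            n = queue.popleft()
            if n in comp:
                continue
            comp.add(n)
            queue.extend(adjacency[n] - comp)
        seen |= comp
        comps.append(comp)
    return comps

def cut_off_share(adjacency):
    """Fraction of nodes outside the largest component (0 = one clump)."""
    biggest = max(len(c) for c in components(adjacency))
    return 1 - biggest / len(adjacency)

net = {1: {2}, 2: {1, 3}, 3: {2}, 4: {5}, 5: {4}}  # a clump of 3 and a clump of 2
share = cut_off_share(net)
```

Such a measure can be computed from observed network data alone, which fits the point above: the policy-relevant artefact is the monitorable indicator, not the simulation model that motivated it.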

[Chart: simulation output over time (y-axis 0–1, x-axis ticks 11 to 2411); truncated legend entries: "ti...", "Sum of tur...", "influence-r..."]

Slide 19

Conclusions

When dealing with a very complex system…

• Reliance on forecasting is hazardous – whether the forecast comes from intuition or a fancy model

• Better is a strategy of 'risk analysis':
– Building a 'tool-kit' of responses that can be deployed
– Doing contingency planning for many critical situations
– Not delegating decision making to a "technocracy"
– Driving policy using up-to-date "views" of data to reveal emergent trends – reacting quickly

• The data views and planning can be based on an understanding of some of the complex possibilities (which comes from complex models)

Slide 20

The End

Centre for Policy Modelling:

http://cfpm.org

The slides will be uploaded at:

http://slideshare.net/BruceEdmonds