
1

ABSVal

Goal: Adapt accepted Best Practices of VV&A* to simulation models that...

a. Display emergent behavior

b. Are used to model military effects on population dynamics and social phenomena as well as military decision making

c. Support analyses

* maybe refine the general Best Practices

2

B.L.U.F.: WHAT I THINK WE LEARNED or CONFIRMED

• VV&A is universally reviled.
– EXCEPTION: DMSO employees and alumni

• VV&A general principles do not map well to the analysis domain.
– describing anticipated use is problematic
– analysis = exploration

• ABS translates to models exhibiting emergent behavior.
– pre-production experimentation focuses on achieving top-down control (building predictive capability) over dynamics that initially display emergence
– scientifically examining emergence is interesting

• We can help analysts and decision makers distinguish good analysis from bad analysis.

• Minimally acceptable sim-to-app validation is recognizable, achievable, useful, and rare.

3

OUTLINE

1. SIMULATION MODELS

2. A LITTLE EMERGENT BEHAVIOR

3. THINKING ANALYSIS

4. EXPOSING THE MATCH

4

“In contrast to this interest in model-related technology, there has been far too little interest in the substance of the models and the validity of the lessons learned from using them. In our view, the DoD does not appreciate that in many cases the models are built on a base of sand.”

The Base of Sand Problem: A White Paper on the State of Military Combat Modeling
Paul K. Davis and Donald Blumenthal

5

WORKABLE DEFINITIONS

• Conceptual Model – Description of the system in some abstracted/symbolic formalism, usually mathematics.

• Verification – The simulation executable faithfully reflects the Conceptual Model.

• Validation – The degree to which the system described in the conceptual model is appropriate in supporting the intended use.

• Accreditation – The judgement, made by someone responsible for the outcome, that the simulation is adequately verified and valid for the intended use.

6

COMMENTS

• Conceptual models are always incomplete.

• Verification of a simulation is a scientific endeavor if the conceptual model is complete.

• A simulation is never “Valid.”

• Analytical intended uses are difficult to deal with…
– Repetition is very rare.
– Analysts have no way to scientifically express the intended use.
– Analysts often accept very poor data and models, and often express grave caveats for their results.

7

[Figure: IDEALIZED DEVELOPMENT PROCESS. natural system → conceptual model → executable code → ideal sim, via formal transitions T(M) and implementation and design.]

8

[Figure: IDEALIZED DEVELOPMENT PROCESS, with the transitions labeled: abstraction and modeling (natural system → conceptual model), software design and mapping to a sim design pattern (conceptual model → executable code), coding and testing.]

9

[Figure: IDEALIZED DEVELOPMENT PROCESS, as above, with the abstraction/modeling step annotated: "Driven by analytic task. More later..."]

10

REALITY FOR BIG-IRON SIMS

[Figure: natural system → executable code → ideal sim, via abstraction and data development only; the explicit conceptual model step is absent.]

11

[Figure: the idealized development process again, with the FOCUS on the ideal sim FOR A GIVEN ANALYSIS; the formal transitions T(M), implementation and design, conceptual model, and natural system are marked FOR ANOTHER DAY.]

12

"The more complex the model, the harder it is to distinguish unusual

emergent behavior from programming bugs."

Douglas Samuelson

Renowned Operations

Research Analyst

13

ABSVal PROJECT

• ABSVal Framework

• Test Examples
– Pythagoras COIN
– SZ/BZ Obstacle Reduction Analysis

• Conclusions

14

VALIDATION

• First-Principles Validation
– Assess the theory behind the conceptual model
• And predict its impact on the ensuing analysis
– Examine the implementation of the theory
• And predict its impact on the ensuing analysis
– Examine the combinations of theories used together

• Results Validation
– Compare output data to data from another source
• Historical case
• Another model
• Intuition

15

VALIDATION

[Same breakdown as the previous slide, with the summary overlay:]

VALIDATION = EXPOSITION + ASSESSMENT
The EVIDENCE on which acceptance is based.

16

[Figure: the idealized development process, natural system → conceptual model → executable code → ideal sim.]

• Simulations displaying emergent behavior are difficult to validate because it is difficult to predict their behavior from the Conceptual Model.

• Therefore there is greater pressure to use results validation.

17

“All models are wrong, but some are useful.”

George Box

Wartime Statistician

18

ANALYSIS

• Predict the response (absolute)
• Predict the response (as compared to a baseline)
• Predict the functional form of the response for a set of independent variables
• Predict the sign of the gradient (set of 1st derivatives)
• Is there any response?
• Predict the min/max of the response over a high-dimensional domain
• Predict x_i in [L_i, U_i] such that response > c
• Characterize the probabilistic nature of the response

Compared to the physical sciences, these are very humble goals (one is sketched below).
Might a medical/biological mindset be more appropriate?
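Several of these humble goals are simple enough to mechanize. As a minimal sketch of one of them, gradient-sign prediction, the toy below estimates the sign of each first derivative of a noisy response by paired finite differences over replications. `run_sim` is a hypothetical stand-in for a real simulation, and the thresholds are illustrative assumptions, not part of the talk.

```python
import random
import statistics

def run_sim(x, seed):
    """Hypothetical stand-in for a stochastic simulation run:
    returns one scalar response for the input vector x."""
    rng = random.Random(seed)
    # toy response: rises in x[0], falls in x[1], plus noise
    return 2.0 * x[0] - 0.5 * x[1] + rng.gauss(0.0, 0.3)

def gradient_sign(x, i, h=0.1, reps=30):
    """Estimate the sign of d(response)/dx_i at x from paired finite
    differences; common random numbers cancel shared noise."""
    diffs = []
    for r in range(reps):
        lo, hi = list(x), list(x)
        lo[i] -= h
        hi[i] += h
        diffs.append(run_sim(hi, seed=r) - run_sim(lo, seed=r))
    mean = statistics.mean(diffs)
    se = statistics.stdev(diffs) / len(diffs) ** 0.5
    if abs(mean) < 2.0 * se:
        return 0  # "is there any response?" -- none detectable
    return 1 if mean > 0 else -1

print([gradient_sign([1.0, 1.0], i) for i in range(2)])  # [1, -1]
```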

19

…more ANALYSIS

Provide the best decision support possible, and include a useful assessment of the value of the analysis vis-à-vis the questions and issues at hand.

20

IDEAL STUDY PROCESS

DETERMINE THE QUESTION
↓
DETERMINE THE MOEs and the EEAs
↓
FIND or BUILD the BEST (SIMULATION) MODEL
↓
PRODUCTION RUNS
↓
PRESENT (and DEFEND) RESULTS

21

SIMULATION-SUPPORTED ANALYSIS

• Baseline/Excursion or Factorial Experiment

• Driven to answer Analysis Questions

• Key Elements of Analysis

• Constraints, Limitations, and Assumptions

22

Schism

• Agent-based simulations use modular rules and local reasoning to produce realistic and/or interesting emergent aggregate behavior.
– Surprise is good**

• Successful simulation testing (core to face/results validation) is based on demonstrating credibility across the range of potential input.
– Surprise is not good**

** Refined later in this talk

23

GOAL: STOP BEING SURPRISED

[Figure: a cycle of Surprise → Explore → Explain → Accept/reject, repeated until the analyst is in control and there are no more surprises, then on to Production Runs; a crude screen for this gate is sketched below. Annotation: "How do we tell about this experience?"]

1. "Unnatural acts" reflect negatively on a sim

2. Once we achieve top-down control, is there still emergent behavior?
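One crude way to operationalize the surprise gate, sketched here under the assumption that the analyst can state an anticipated mean outcome before running: flag pre-production runs that land far from it, and route those to the explore/explain/accept-reject loop. The function name, threshold, and data are illustrative only.

```python
import statistics

def flag_surprises(outcomes, anticipated_mean, z=2.0):
    """Return the pre-production outcomes lying more than z sample
    standard deviations from what the analyst anticipated; these are
    the 'surprises' to explore and explain before production runs.
    Note the sample sd is itself inflated by the outliers; a robust
    spread estimate (e.g. MAD) would resist that."""
    sd = statistics.stdev(outcomes)
    return [y for y in outcomes if abs(y - anticipated_mean) > z * sd]

# e.g. one wild run among otherwise well-behaved replications
print(flag_surprises([9.8, 10.1, 10.3, 9.9, 42.0], anticipated_mean=10.0))
# -> [42.0]
```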

24

ELEMENTS
Adaptation of the Yost Scale

SIMULATION DYNAMICS
• based on accepted physical laws
• based on accepted social dynamics
• based on common sense
• distillation
– simple model relic required to facilitate actions
– simple model relic required to maintain consistency
• top-down human intervention

DATA
• authoritative value
• measured
• witnessed
• argued by logic
• sensible range
• guess/arbitrary
• dimensionless

RELEVANT DYNAMICS + REQUIRED DATA = ELEMENT
e.g. underwater detection: using observed detection range data in a cookie-cutter model
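To make the adapted Yost scale something you can compute with, here is a small sketch that encodes the two ordinal lists above and flags dynamics/data mismatches. The class, threshold, and relative-position comparison are assumptions layered onto the slide, not part of the Yost scale itself.

```python
from dataclasses import dataclass

# Ordinal rankings from the slide, most defensible first.
DYNAMICS = ["accepted physical laws", "accepted social dynamics",
            "common sense", "distillation", "top-down human intervention"]
DATA = ["authoritative value", "measured", "witnessed", "argued by logic",
        "sensible range", "guess/arbitrary", "dimensionless"]

@dataclass
class Element:
    """RELEVANT DYNAMICS + REQUIRED DATA = ELEMENT."""
    name: str
    dynamics: str  # an entry from DYNAMICS
    data: str      # an entry from DATA

    def mismatched(self, tol=0.4):
        """Flag wide gaps in quality between dynamics and data,
        e.g. good measured data feeding a crude distilled model."""
        d = DYNAMICS.index(self.dynamics) / (len(DYNAMICS) - 1)
        q = DATA.index(self.data) / (len(DATA) - 1)
        return abs(d - q) > tol

# The slide's example: observed detection ranges in a cookie-cutter model.
uw = Element("underwater detection", "distillation", "measured")
print(uw.mismatched())  # True: crude dynamics, good data
```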

25

ELEMENTS
Adaptation of the Yost Scale

[Same lists as the previous slide, overlaid with an arrow labeled CONTROLLABLE ABSTRACTION / ANALYTICALLY DESIRABLE.]

26

“It’s the Data, Stupid.”

George Akst
Phalanx, DEC 07

27

Constraints, Limitations, and Assumptions Guide

TRADOC Analysis Center
255 Sedgwick Avenue
Fort Leavenworth, KS 66027-2345

TRAC-TD-05-011 (rev. 1)
January 2008

Mike Bauman

28

PARSING SOURCES OF VARIABILITY

CORE: drives the results of your experiment; aligns with the key elements of analysis.

DYNAMIC CONTEXT: impacts the circumstances relevant to exercising the core model dynamics; creates situations, not elements of analysis.

CASES: details necessary to support the model; the cases to be considered to achieve analytical goals.

C.L.A.: Constraints, Limitations, and Assumptions necessary to scope the analysis and interpret the results.
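A minimal sketch of how an analyst might declare this taxonomy up front, so every sim element carries its relationship to the analysis. The element names are hypothetical and the enum descriptions paraphrase the slide.

```python
from enum import Enum

class Layer(Enum):
    CORE = "drives results; aligns with key elements of analysis"
    DYNAMIC_CONTEXT = "creates situations that exercise the core"
    CASE = "supporting detail; enumerate cases, report separately"
    CLA = "fixed constraint/limitation/assumption; scopes the results"

# Hypothetical declaration for a study: each sim element is assigned
# exactly one layer before any production runs.
ELEMENTS = {
    "weapon lethality": Layer.CORE,
    "threat maneuver paths": Layer.DYNAMIC_CONTEXT,
    "terrain type": Layer.CASE,
    "unambiguous threat": Layer.CLA,
}

for name, layer in ELEMENTS.items():
    print(f"{name}: {layer.name} ({layer.value})")
```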

29

IMPACT ON ANALYSIS

• Agent-based design is reputed to enable fast and easy construction of dynamic context.

• Dynamic Context elements can display emergent behavior to add variability.
– Emergent behavior is often not predictable/controllable.

• Big-iron simulations often have parametric (knob) control over Case elements.
– impossible to promote these to Dynamic Context or Core elements
– should NOT be elements of analysis

• Ideally, analysts should have the most faith in their Core elements.
– should have high-quality data (high on the Yost scale)
– should have well-studied dynamics (high on the Yost scale)
– must not display uncontrolled emergent behavior

• Limitations on the Core = limitations of the simulation for analytical purposes.

• Core and Dynamic Context elements should be results-proven to be consistent with SME judgment (explainable 1st derivative).

• Core elements should be results-proven to be highly influential (see Scientific Method of Choosing Model Fidelity; a minimal sketch follows).
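The last two bullets ask for influence to be shown statistically. As one standard, minimal way to do that, the sketch below computes two-level main effects (mean response at the high setting minus the low setting). The factor names and responses are invented for illustration.

```python
import statistics

# Hypothetical 2-level factorial results: each run maps factor
# settings (-1/+1) to an observed response.
runs = [
    ({"sight": -1, "comm": -1}, 0.42), ({"sight": +1, "comm": -1}, 0.61),
    ({"sight": -1, "comm": +1}, 0.40), ({"sight": +1, "comm": +1}, 0.58),
]

def main_effect(factor):
    """Average response at the high setting minus the low setting:
    a crude statistical level-of-influence for one element."""
    hi = [y for x, y in runs if x[factor] == +1]
    lo = [y for x, y in runs if x[factor] == -1]
    return statistics.mean(hi) - statistics.mean(lo)

for f in ("sight", "comm"):
    print(f, round(main_effect(f), 3))
# sight 0.185, comm -0.025: 'sight' is the influential Core element
```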

30

IMPACT ON ANALYSIS

[Same bullets as the previous slide, with the summary overlay:]

Taxonomy for sources of variability, reflecting the relationship between model dynamics and analytical goals.

** Jargon for communicating how a sim element relates to the analysis.

** Identifies the appropriate role for elements with emergent behavior in an analysis.

31

GOLDEN GATE BRIDGE

a solid connection…

[Figure: the Golden Gate Bridge, spanning from simulation/data capabilities to analytical requirements.]

32

TACOMA NARROWS BRIDGE

…or, not so much.

[Figure: the Tacoma Narrows Bridge collapse, spanning from simulation/data capabilities to analytical requirements.]

33

RECOMMENDED HANDLING

Core: Experimental; n parametric settings
Dynamic Context: Stochastic; 1 parametric setting
Cases: Discrete cases; m cases
CLA: Static; 1 fixed set of assumptions
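A sketch of this handling as an experiment driver: a full factorial over Core settings, crossed with discrete Cases that are reported separately (never averaged together), replications sampling the stochastic Dynamic Context, and CLA held fixed throughout. `run_sim` and every name in it are hypothetical.

```python
import itertools
import random

CLA = {"unambiguous threat": True}  # static: one fixed set of assumptions

def run_sim(core, case, rep):
    """Hypothetical stochastic simulation stub."""
    rng = random.Random(f"{sorted(core.items())}|{case}|{rep}")
    base = {"urban": 0.4, "jungle": 0.6}[case]
    return base + 0.10 * core["sight"] - 0.02 * core["comm"] + rng.gauss(0, 0.02)

core_levels = {"sight": [0, 1], "comm": [0, 1]}  # Core: experimental
cases = ["urban", "jungle"]                      # Cases: discrete, kept separate
REPS = 30                                        # Dynamic Context: stochastic

for values in itertools.product(*core_levels.values()):
    core = dict(zip(core_levels, values))
    for case in cases:
        # replications sample the stochastic dynamic context;
        # each core x case cell is summarized on its own
        outcomes = [run_sim(core, case, rep) for rep in range(REPS)]
        print(core, case, round(sum(outcomes) / REPS, 3))
```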

34

EXAMPLE

• Question: What is the tactical value of LW components to a rifle squad?

• Core: weapon, computer/comm/SA, sight/NVG

• Dynamic Context: paths of maneuver, acquisitions and detections, paths & actions of threat, attrition, …

• Cases: terrain type (urban, jungle, alpine), scale (company, platoon), mission (HVT, defend a FOB)

• CLA: kinetic outcome, unambiguous threat, terrain representation

35

EXAMPLE

[Same content as the previous slide, annotated: the emergent-behavior dynamics fit in the Dynamic Context; don't average over the Cases.]

36

“Those claims to knowledge that are potentially falsifiable can then be admitted to the body of empirical science, and then further differentiated according to whether they are (so far) retained or indeed are actually falsified.”

Karl Popper
Philosopher of Science

37

NEGATIVE INFORMATION for IN-VALIDATION

• Elements not data-driven
• Elements not controllable
• Element displays undesired emergent behavior
• Element displays unexplainable 1st-order influence (results schism unexplainable)
• Element not in the anticipated layer
– level of influence is more/less than anticipated by the analyst
– dynamics or data are...
• too low on the Yost scale
• mismatched vis-à-vis the Yost scale

38

NEGATIVE INFORMATION = IN-VALIDATION?

[Figure: a spectrum running from "no concern" to "show stopper".]

Negative information scopes the analytical value of results.
The analyst's art: responsibly expand this scope.

“This approach uses a very unrealistic model of certain dynamics, but it creates adequate dynamic context to stimulate the core elements in a way useful to our analytic goals.”

39

“Computer programs should be verified, models should be validated, and analysts should be accredited.”

Alfred G. Brandstein
Renowned Military Operations Research Analyst
Founder of Project Albert

40

THE ANALYST

• Prior to any experience with the simulation, can the Analyst...
– Pose analytic questions mathematically?
– Describe the experiment?
– Identify Core vs. Dynamic Context elements?
– Specify CLA elements?
– Evaluate Core elements on the Yost scale?
– Disclose all outcomes the analyst anticipates matching with the simulation (Test Cases)?

• Once experience has been gained, can the Analyst...
– Explain changes to the anticipated Core/Dynamic Context/Case/CLA classification?
– Describe all testing and tuning required?
– Quantify the level of influence of each Core & Dynamic Context element statistically?
– Avoid integrating (averaging) Cases?
– Explain the impact of each CLA on the results?
– Statistically determine the level of agreement of the simulation outcomes with the Test Cases (one such test is sketched below)?

• The resulting analysis should be peer-reviewed.
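For that last checklist item, one conventional agreement test is a two-sample Kolmogorov-Smirnov comparison of simulation outcomes against the pre-disclosed Test Case distribution, sketched here under the assumption that both reduce to samples of a scalar outcome; the data are invented.

```python
import random
from scipy import stats

rng = random.Random(1)
# Invented samples: simulation replications vs. the analyst's
# pre-disclosed Test Case (e.g. outcomes of a historical engagement).
sim_outcomes = [rng.gauss(10.0, 2.0) for _ in range(200)]
test_case = [rng.gauss(10.5, 2.2) for _ in range(50)]

# Two-sample Kolmogorov-Smirnov test: a small p-value means the
# simulation's outcome distribution disagrees with the Test Case.
stat, p = stats.ks_2samp(sim_outcomes, test_case)
print(f"KS statistic = {stat:.3f}, p = {p:.3f}")
```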

41

Bottom line

42

• Have lots of computational experience with your model.

• Understand, and be able to control, its emergent behavior.

• Plan and execute experiments; document them.

• Disclose the relationship between each important sim element and the analytical goal.
– Core
– Dynamic Context
– Cases
– CLA

43

QUESTIONS?