
Chapter 10 Verification and Validation of Simulation Models

Banks, Carson, Nelson & Nicol

Discrete-Event System Simulation


Purpose & Overview

The goal of the validation process is:
- to produce a model that represents the true behavior of the system closely enough for decision-making purposes, and
- to increase the model's credibility to an acceptable level, so that the model will be used by managers and other decision makers.

Validation is an integral part of model development:
- Verification: building the model correctly (correctly implemented, with good input and structure).
- Validation: building the correct model (an accurate representation of the real system).

Most methods are informal, subjective comparisons; a few are formal statistical procedures.


Overview

Validation is an integral part of model development:
- Verification: building the model correctly (correctly implemented, with good input and structure).
- Validation: building the correct model (an accurate representation of the real system); usually achieved through calibration.

Most methods are informal, subjective comparisons; a few are formal statistical procedures. Statistical procedures on output are the subject of Chapters 11 and 12.


10.1 Model Building, Verification, and Validation

The first step in model building is observing the real system:
- the interactions among its components, and collecting data on its behavior;
- take advantage of people with special knowledge of the system.

Next, construct a conceptual model:
- assumptions about the components (hypotheses);
- the structure of the system.

Then implement an operational model of the conceptual model using simulation software.

This is not a linear process: the modeler will return to each step many times while building, verifying, and validating the model.


Model Building, Verification & Validation


10.2 Verification

Purpose: ensure that the conceptual model is reflected accurately in the computerized representation.

The conceptual model usually involves some abstraction or some amount of simplification of the actual operations.

Many suggestions are common sense, for example:
- Have someone else check the model.
- Make a flow diagram that includes each logically possible action a system can take when an event occurs.
- Closely examine the model output for reasonableness under a variety of input parameter settings. (Often overlooked!)
- Print the input parameters at the end of the simulation and make sure they have not been changed inadvertently (a minimal sketch of this check follows below).
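The last check is easy to automate. A minimal sketch, with illustrative parameter names that are not from the text: take a snapshot of the inputs before the run and compare it at the end.

```python
# Minimal sketch of the "print the input parameters at the end" check.
# Parameter names are illustrative, not from the text.
params = {"arrival_rate": 45.0, "mean_service_time": 1.1, "num_tellers": 1}
snapshot = dict(params)  # copy taken before the run begins

# ... the simulation runs here and should treat `params` as read-only ...

print("Input parameters at end of run:", params)
if params != snapshot:
    print("WARNING: parameters changed during the run. Before:", snapshot, "After:", params)
```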


Verification (continued)

- Verify that the animation imitates the real system.
- Monitor the simulation with a debugger: step through it and check values.
- Use graphical interfaces as a form of self-documentation.

Are these the steps a software engineer would use?


Examination of Model Output for Reasonableness (see page 392) [Verification]

Example: a model of a complex network of queues consisting of many service centers.
- Response time is the primary interest; however, it is important to collect and print out many statistics in addition to response time.

Two statistics that give a quick indication of model reasonableness are current contents and total count, for example:
- If the current contents grow in a more or less linear fashion as the simulation run time increases, it is likely that a queue is unstable.
- If the total count for some subsystem is zero, no items entered that subsystem, a highly suspect occurrence.
- If the total and current counts are both equal to one, it can indicate that an entity has captured a resource but never freed that resource.

Compute certain long-run measures of performance, e.g., compute the long-run server utilization and compare it to the simulation results (a sketch of this check follows below).
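To make the last suggestion concrete, here is a minimal sketch, assuming the arrival rate and mean service time of the bank example later in the chapter and a placeholder value for the simulation's reported utilization:

```python
# Reasonableness check: analytic long-run utilization vs. the simulation's estimate.
arrival_rate = 45.0            # lambda, customers per hour
mean_service_hours = 1.1 / 60  # 1.1 minutes expressed in hours
num_servers = 1                # c

rho = arrival_rate * mean_service_hours / num_servers   # long-run utilization lambda/(c*mu)

simulated_utilization = 0.80   # placeholder: value read from the simulation report

print(f"long-run utilization = {rho:.3f}, simulated = {simulated_utilization:.3f}")
if abs(rho - simulated_utilization) > 0.05:
    print("Large discrepancy -> inspect the model (or run longer) before trusting other output.")
```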


Other Important Tools [Verification]

Documentation
- A means of clarifying the logic of a model and verifying its completeness.

Use of a trace
- A detailed printout of the state of the simulation model over time: it gives the value of every variable in the program every time one of them changes in value.
- Traces can be produced with print/write statements.
- A selective trace, e.g. via conditional output or a conditional breakpoint in Eclipse, restricts the printout to states of particular interest (a minimal sketch follows below).
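A minimal sketch of a trace written with print statements; variable names are illustrative. The selective part only flags states of special interest, similar in spirit to a conditional breakpoint:

```python
TRACE = True        # master switch for the detailed trace
QUEUE_ALARM = 10    # selective trace: only flag unusually long queues

def trace(clock, event, queue_len, server_busy):
    """Print one line of simulation state; call it wherever the state changes."""
    if TRACE:
        print(f"t={clock:8.3f}  event={event:<9}  queue={queue_len:3d}  busy={server_busy}")
    elif queue_len > QUEUE_ALARM:
        print(f"t={clock:8.3f}  *** queue length {queue_len} exceeds {QUEUE_ALARM} ***")

# Example call from inside an arrival-event routine:
trace(12.47, "arrival", 3, True)
```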


10.3 Calibration and Validation

Verification and validation are usually conducted simultaneously.

Validation: the overall process of comparing the model and its behavior to the real system.

Calibration: the iterative process of comparing the model to the real system and making adjustments.
- Some tests are subjective, others objective.
- Objective tests require data on the real system's behavior and corresponding data produced by the model.


Calibration and Validation


Calibration and Validation

No model is ever a perfect representation of the system:
- The modeler must weigh the possible, but not guaranteed, increase in model accuracy against the cost of the increased validation effort.

Three-step approach:
1. Build a model that has high face validity.
2. Validate the model assumptions.
3. Compare the model's input-output transformations with the real system's data.


10.3.1 High Face Validity [Calibration & Validation]

Ensure a high degree of realism: potential users should be involved in model construction, from its conceptualization to its implementation.

Sensitivity analysis can also be used to check a model's face validity (a minimal sketch follows below).
- Example: in most queueing systems, if the arrival rate of customers were to increase, we would expect server utilization, queue length, and delays to increase as well.
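A minimal sketch of such a check, assuming a hypothetical run_model(arrival_rate) wrapper around the operational model that returns (utilization, average delay), ideally averaged over several replications so sampling noise does not mask the trend:

```python
def face_validity_check(arrival_rates, run_model):
    """Run the model at increasing arrival rates and report whether utilization
    and average delay move in the direction intuition says they should."""
    results = [run_model(lam) for lam in sorted(arrival_rates)]
    utils = [u for u, _ in results]
    delays = [d for _, d in results]
    util_ok = all(b >= a for a, b in zip(utils, utils[1:]))
    delay_ok = all(b >= a for a, b in zip(delays, delays[1:]))
    if not (util_ok and delay_ok):
        print("Face-validity concern: responses did not increase with the arrival rate.")
    return utils, delays

# Example usage (run_model is the hypothetical wrapper):
# utils, delays = face_validity_check([30, 40, 45, 50], run_model)
```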


10.3.2 Validate Model Assumptions [Cal & Val]

General classes of model assumptions:
- Structural assumptions: how the system operates.
- Data assumptions: reliability of the data and of its statistical analysis.

Bank example: customer queueing and service facility in a bank.
- Structural assumptions, e.g., customers waiting in one line versus many lines, served FCFS versus by priority.
- Data assumptions, e.g., interarrival times of customers, service times for commercial accounts.
  - Verify data reliability with the bank managers.
  - Test correlation and goodness of fit for the data (see Chapter 9 for more details).


Analysis of Input Data from Chapter 9

1. Identify an appropriate probability distribution.

2. Estimate the parameters of the hypothesized distribution.

3. Validate the assumed statistical model by a goodness-of-fit test, such as the chi-square or Kolmogorov-Smirnov test, and by graphical methods (a sketch of steps 2-3 follows below).
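A minimal sketch of steps 2 and 3 for a hypothesized exponential interarrival-time model, using scipy; the sample shown is a placeholder, and because the rate is estimated from the same data, the K-S p-value should be treated as approximate:

```python
import numpy as np
from scipy import stats

# Placeholder sample of observed interarrival times (minutes).
interarrivals = np.array([1.2, 0.4, 2.1, 0.9, 1.7, 0.3, 1.1, 2.6, 0.8, 1.5])

# Step 2: estimate the parameter of the hypothesized exponential distribution.
lam_hat = 1.0 / interarrivals.mean()                  # MLE of the rate

# Step 3: Kolmogorov-Smirnov test against the fitted exponential.
d_stat, p_value = stats.kstest(interarrivals, "expon", args=(0.0, 1.0 / lam_hat))
print(f"lambda-hat = {lam_hat:.3f}/min,  K-S D = {d_stat:.3f},  approx p = {p_value:.3f}")
```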


10.3.3 Validate I-O Transformations [Cal & Val]

Goal: validate the model's ability to predict future behavior.
- This is the only objective test of the model as a whole.
- If the input variables were to increase or decrease, the model should accurately predict what would happen in the real system.
- The structure of the model should be accurate enough to make good predictions for the range of input data sets of interest.

In this process the model is viewed as an input-output transformation.

One possible approach: use historical data that have been reserved for validation purposes only.

Criteria: use the main responses of interest. If the model is used for a different purpose later, it should be revalidated.


Validate I-O Transformations [Cal & Val]

Note that for validation of the I-O transformation, some version of the system under study must exist.
- Without the system, other types of validation should be used.
- Perhaps only some subsystems exist.

Transferring the model to a new set of input parameters may cause changes in the operational model and may require revalidation.
- This is an interesting validation problem that really concerns what-if questions about the model.


Changing the Input-Output Transformations

- The model may be used to compare different system designs, or
- The model may be used to investigate system behavior under new input conditions.

What can be said about the validity of the model under these new conditions?
- The responses of the two models are used to compare the two systems.
- The hope is that confidence in the old model can be transferred to the new one.


Typical Changes in the Operational Model

- Minor changes of a single numerical parameter, such as the arrival rate of customers.
- Minor changes in the form of a statistical distribution, such as the service-time distribution.
- Major changes involving the logical structure of a subsystem.
- Major changes involving a different design for the new system (say, computerizing the inventory).

Minor changes can be carefully verified and the model accepted with confidence. A partial validation is possible for the major changes. There is no way to validate the input-output transformations of a nonexistent system completely.


Example 10.2: The Fifth National Bank of Jasper [Validate I-O Transformation]

One drive-in window serviced by one teller; only one or two transactions are allowed at the window, so it was assumed that each service time was drawn from the same underlying population.

Data collection: 90 customers during 11 am to 1 pm.
- Observed service times {Si, i = 1, 2, ..., 90}.
- Observed interarrival times {Ai, i = 1, 2, ..., 90}.

Data analysis (Chapter 9) led to the conclusion that:
- Interarrival times are exponentially distributed with rate 45/hr, that is, a Poisson arrival process.
- Service times are assumed to be N(1.1, (0.2)²) minutes.

A sketch of generating these model inputs follows below.
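A minimal sketch with NumPy; clipping the normal service times at a small positive value is a practical safeguard, not something stated in the text:

```python
import numpy as np

rng = np.random.default_rng(2024)
n = 90

# Interarrival times: exponential with rate 45/hr, i.e. mean 60/45 minutes.
interarrival = rng.exponential(scale=60.0 / 45.0, size=n)        # minutes

# Service times: N(1.1, 0.2^2) minutes, clipped to stay positive.
service = np.clip(rng.normal(loc=1.1, scale=0.2, size=n), 0.05, None)

print(interarrival[:5].round(2), service[:5].round(2))
```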


The Black Box [Bank Example: Validate I-O Transformation]

A model was developed in close consultation with bank management and employees, and the model assumptions were validated. The resulting model is now viewed as a "black box" that maps inputs to outputs, f(X, D) = Y:

Input variables
- Uncontrolled variables, X:
  - Poisson arrivals at 45/hr: X11, X12, ...
  - Service times, N(D2, (0.2)²): X21, X22, ...
- Controlled decision variables, D:
  - D1 = 1 (one teller)
  - D2 = 1.1 min (mean service time)
  - D3 = 1 (one line)

Model output variables, Y
- Primary interest: Y1 = teller's utilization, Y2 = average delay, Y3 = maximum line length.
- Secondary interest: Y4 = observed arrival rate, Y5 = average service time, Y6 = sample standard deviation of service times, Y7 = average length of time


Comparison with Real System Data [Bank Example: Validate I-O Transformation]

Real-system data are necessary for validation.
- System responses should have been collected during the same time period (from 11 am to 1 pm on the same Friday).

Compare the average delay from the model, Y2, with the actual delay, Z2:
- Average delay observed: Z2 = 4.3 minutes; consider this to be the true mean value, μ0 = 4.3.
- When the model is run with generated random variates X1n and X2n, Y2 should be close to Z2.
- Six statistically independent replications of the model, each of 2-hour duration, are run.


Hypothesis Testing [Bank Example: Validate I-O Transformation]

Compare the average delay from the model, Y2, with the actual delay, Z2 (continued):
- Null-hypothesis test: evaluate whether the simulation and the real system are the same (with respect to the output measure):

  H0: E(Y2) = 4.3 minutes
  H1: E(Y2) ≠ 4.3 minutes

- If H0 is not rejected, there is no reason to consider the model invalid.
- If H0 is rejected, the current version of the model is rejected, and the modeler needs to improve the model.

Hypothesis Testing [Bank Example: Validate I-O Transformation]

Conduct the t test:
- Choose a level of significance (α = 0.05) and a sample size (n = 6); see the results in Table 10.2.
- Compute the sample mean and sample standard deviation over the n replications:

  Ȳ2 = (1/n) Σ Y2i = 2.51 minutes
  S = sqrt[ Σ (Y2i − Ȳ2)² / (n − 1) ] = 0.82 minutes

- Compute the test statistic:

  |t0| = |Ȳ2 − μ0| / (S/√n) = |2.51 − 4.3| / (0.82/√6) ≈ 5.35 > t_{0.025, 5} = 2.571 (for a 2-sided test)

- Hence, reject H0 and conclude that the model is inadequate.
- Check the assumptions justifying the use of the t test: that the observations (Y2i) are normally and independently distributed.

A numerical sketch of this test follows below.
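A minimal sketch of the same calculation using only the summary statistics quoted above and scipy for the critical value:

```python
from scipy import stats

n, ybar, s, mu0 = 6, 2.51, 0.82, 4.3     # summary statistics from the example
alpha = 0.05

t0 = (ybar - mu0) / (s / n ** 0.5)                # test statistic
t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)     # t_{0.025,5} = 2.571

print(f"t0 = {t0:.2f}, two-sided critical value = {t_crit:.3f}")
if abs(t0) > t_crit:
    print("Reject H0: the model's mean delay is not consistent with the observed 4.3 minutes.")
else:
    print("Fail to reject H0: no evidence, from this measure, that the model is invalid.")
```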


Hypothesis Testing [Bank Example: Validate I-O Transformation]

Similarly, compare the model output with the observed system output for the other measures: Y4 with Z4, Y5 with Z5, and Y6 with Z6.


Type II Error [Validate I-O Transformation]

For validation, the power of the test is

  Probability[ detecting an invalid model ] = 1 − β,

where β = P(Type II error) = P(failing to reject H0 | H1 is true).

Since failure to reject H0 is considered a strong conclusion, the modeler wants β to be small.

The value of β depends on:
- the sample size, n, and
- the true difference, δ, between E(Y) and μ0:

  δ = |E(Y) − μ0| / σ

In general, the best approach to controlling the β error is:
- Specify the critical difference, δ.
- Choose a sample size, n, by making use of the operating-characteristic (OC) curves (a numerical alternative is sketched below).
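Instead of reading n off an OC curve, the same sample-size question can be answered numerically. A minimal sketch using statsmodels' one-sample t-test power solver; the critical difference of 1 minute is illustrative, and the pilot-run estimate S = 0.82 stands in for σ:

```python
from statsmodels.stats.power import TTestPower

delta = 1.0      # critical difference (minutes) we insist on detecting
sigma = 0.82     # assumed process standard deviation (pilot-run estimate)

n_required = TTestPower().solve_power(effect_size=delta / sigma,
                                      alpha=0.05, power=0.90,
                                      alternative="two-sided")
print(f"replications needed for beta <= 0.10: {n_required:.1f}")
```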


Type I and Type II Errors [Validate I-O Transformation]

- Type I error (α): the error of rejecting a valid model. It is controlled by specifying a small level of significance α.
- Type II error (β): the error of accepting a model as valid when it is invalid. It is controlled by specifying the critical difference δ and then finding the required sample size n.

For a fixed sample size n, increasing α will decrease β.


Confidence Interval Testing [Validate I-O Transformation]

Confidence-interval testing evaluates whether the simulation and the real system are close enough.

If Y is the simulation output and μ = E(Y), the confidence interval (C.I.) for μ is

  Ȳ ± t_{α/2, n−1} S/√n

Validating the model (see Figure 10.6), where μ0 is the system's true mean and ε is the largest acceptable difference:
- Suppose the C.I. does not contain μ0:
  - If the best-case error is > ε, the model needs to be refined.
  - If the worst-case error is ≤ ε, accept the model.
  - If the best-case error is ≤ ε but the worst-case error is > ε, additional replications are necessary.
- Suppose the C.I. contains μ0:
  - If either the best-case or the worst-case error is > ε, additional replications are necessary.
  - If the worst-case error is ≤ ε, accept the model.


Confidence Interval Testing [Validate I-O Transformation]

Bank example: μ0 = 4.3 minutes, and "close enough" is ε = 1 minute of expected customer delay.
- A 95% confidence interval, based on the 6 replications, is [1.65, 3.37], because

  Ȳ ± t_{0.025, 5} S/√n = 2.51 ± 2.571 (0.82/√6) = 2.51 ± 0.86.

- μ0 = 4.3 falls outside the confidence interval: the best-case error is |3.37 − 4.3| = 0.93 < 1, but the worst-case error is |1.65 − 4.3| = 2.65 > 1, so additional replications are needed to reach a decision (a sketch of this check follows below).
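A minimal sketch that reproduces this interval and applies the decision logic of Figure 10.6:

```python
from scipy import stats

n, ybar, s = 6, 2.51, 0.82       # replication summary statistics
mu0, eps = 4.3, 1.0              # observed system mean and "close enough" tolerance

half_width = stats.t.ppf(0.975, df=n - 1) * s / n ** 0.5     # about 0.86
lo, hi = ybar - half_width, ybar + half_width                # about [1.65, 3.37]

best = 0.0 if lo <= mu0 <= hi else min(abs(lo - mu0), abs(hi - mu0))
worst = max(abs(lo - mu0), abs(hi - mu0))
print(f"95% C.I. = [{lo:.2f}, {hi:.2f}]  best-case error = {best:.2f}  worst-case = {worst:.2f}")

if worst <= eps:
    print("Accept the model as close enough for this response.")
elif best > eps:
    print("Refine the model: even the best case misses by more than eps.")
else:
    print("Additional replications are needed to reach a decision.")
```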


10.3.4 Using Historical Input Data [Validate I-O Transformation]

An alternative to generating input data:
- Use the actual historical record.
- Drive the simulation model with the historical record and then compare the model output to the system data.
- In the bank example, use the recorded interarrival and service times for the customers, {An, Sn, n = 1, 2, ...}.

The procedure and validation process are similar to the approach used for generated input data.

Work through the Candy Factory example (Example 10.3) to become familiar with using historical trace data to drive the model; a minimal sketch of a trace-driven run follows below.
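A minimal sketch of a trace-driven run for a single-server FIFO queue via Lindley's recursion; this is a simplified stand-in for the full operational model, and the arrays are placeholders for the recorded {An, Sn}:

```python
import numpy as np

def delays_from_trace(interarrivals, services):
    """Drive a single-server FIFO queue with recorded interarrival and service
    times; return each customer's delay in queue via Lindley's recursion."""
    d = np.zeros(len(services))
    for i in range(1, len(services)):
        d[i] = max(0.0, d[i - 1] + services[i - 1] - interarrivals[i])
    return d

# Placeholders for the recorded bank data {An, Sn} (minutes).
A = np.array([0.0, 1.3, 0.4, 2.2, 0.7, 1.9])
S = np.array([1.1, 0.9, 1.4, 1.0, 1.2, 0.8])

delays = delays_from_trace(A, S)
print(f"average delay under the historical inputs: {delays.mean():.2f} min")
# Compare this with the delay actually observed in the system over the same
# period (Z2 = 4.3 minutes in the bank example).
```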


Using a Turing Test [Validate I-O Transformation]

Use it in addition to statistical tests, or when no statistical test is readily applicable.

It makes use of people's knowledge about the system. For example:

Present 10 system performance reports to a manager of the system. Five of them are from the real system and the rest are “fake” reports based on simulation output data.

If the person identifies a substantial number of the fake reports, interview the person to get information for model improvement.

If the person cannot distinguish between fake and real reports with consistency, conclude that the test gives no evidence of model inadequacy.


Summary

Model validation is essential:
- Model verification
- Calibration and validation
- Conceptual validation

It is best to compare system data to model data, and to make the comparison using a wide variety of techniques.

Some techniques that we covered (in increasing order of cost-to-value ratio):
- Ensure high face validity by consulting knowledgeable persons.
- Conduct simple statistical tests of assumed distributional forms.
- Conduct a Turing test.
- Compare model output to system output by means of statistical tests.