How To Conduct A Simulation Study
Averill M. Law
We now discuss some important concepts that need to be addressed in any simulation study. Verification is concerned with determining whether the conceptual simulation model was correctly translated into a computer program.

In Figure 1 we present a seven-step approach for conducting a successful simulation study. The activities that take place in each step are discussed in the following sections.
- Detailed descriptions of each subsystem (in bullet format for easy review in Step 3) and how they interact.
- What simplifying assumptions were made and why. A simulation model should be a simplification or abstraction of the real system, with just enough detail to answer the questions of interest.
- Summaries of model input data – technical details and complicated mathematical/statistical calculations should be in appendices. The conceptual model should be readable by the decision-maker as well as by the analysts and the SMEs.
- Sources of important or controversial information, so that this information can be confirmed by an interested party.
• Collect performance data from the existing system (if any) to use for model validation in Step 5.
• The level of model detail should depend on the following:
- Project objectives.
- Performance measures of interest.
- Data availability.
- Credibility concerns – in some cases it might be necessary to put more detail into the model than would be dictated strictly from a validity point of view.
- Computer constraints.
- Opinions of SMEs. This is one of the most-important methods for determining what aspects of the real system impact most on performance measures of interest and, thus, have to be carefully modeled.
- Time and money constraints.
• There should not be a one-to-one correspondence between each element of the model and each element of the system. Start with a “simple” model and embellish it as needed. Unnecessary model detail might result in excessive model execution time, in a missed deadline, or in obscuring those system factors that are really important.
• Interact with the decision-maker (and other key project personnel) on a regular basis, which has the following benefits:
- Helps ensure that the correct problem is solved – the greatest model for the wrong problem will be of little value to the decision-maker.
- The decision-maker’s interest in and involvement with the study are maintained, which are very important for project success.
- The model is more credible because the decision-maker understands and agrees with the model’s assumptions.

3.3 Step 3. Is the Conceptual Model Valid?

• Perform a structured walk-through of the conceptual model before an audience that includes the project manager, analysts, and SMEs. This critical activity, which is called conceptual-model validation, is very often skipped. It:
- Helps ensure that the model’s assumptions are correct and complete.
- Fosters interaction among members of the project team – having members of the project team read the conceptual model on their own is recommended but is definitely not sufficient.
- Promotes ownership of the model, which can help lessen political problems.
- Takes place before “programming” begins to avoid significant reprogramming later.
• If errors or omissions are discovered in the conceptual model, which is virtually always the case, then the conceptual model must be updated before proceeding to programming in Step 4.

3.4 Step 4. Program the Model

• Program the conceptual model in either a general-purpose programming language (e.g., C or C++) or in a commercial simulation-software product. Several advantages of a programming language are familiarity, greater program control, and lower software purchase cost. On the other hand, the use of a commercial simulation product will reduce “programming” time and overall project cost. There are two main types of commercial simulation-software products: general purpose (e.g., Arena, Extend, SIMUL8, and SLX) and application oriented (e.g., AutoMod, Flexsim, ProModel, SIMPROCESS, and WITNESS).
• Verify (debug) the computer program.

3.5 Step 5. Is the Programmed Model Valid?

• If there is an existing system, then compare performance measures from a simulation model of the existing system with the comparable performance measures collected from the actual existing system (see Step 2). This is called results validation, and is the most-important model validation technique that is available. Several real-world examples of this technique are given in Law and Kelton (2000, pp. 279-281). If results validation is successful, then it also lends credibility to the simulation model.
• Regardless of whether there is an existing system, the simulation analysts and SMEs should review the simulation results for reasonableness. If the results are consistent with how they perceive the system should operate, then the simulation model is said to have face validity.
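As a rough illustration of results validation, suppose daily throughput has been collected from the real system (Step 2) and from several independent replications of the model. The sketch below uses hypothetical numbers – the data values, the five-replication count, and the simple interval-overlap screen at the end are all assumptions for illustration; Law and Kelton (2000) describe more careful comparison procedures.

```python
import statistics

def mean_ci(samples, t_crit):
    """Return (mean, half-width) of a confidence interval built from
    independent observations, using a supplied Student-t critical value."""
    n = len(samples)
    mean = statistics.fmean(samples)
    half_width = t_crit * statistics.stdev(samples) / n ** 0.5
    return mean, half_width

# Hypothetical daily throughputs: one value per independent model
# replication, and one value per observed day on the real system.
model_reps = [812.4, 798.1, 825.0, 805.7, 818.9]   # 5 replications
system_days = [801.2, 793.5, 810.8, 799.0, 806.3]  # 5 observed days

# Two-sided 95% t critical value for 4 degrees of freedom (from a t-table).
T_95_DF4 = 2.776

m_mean, m_hw = mean_ci(model_reps, T_95_DF4)
s_mean, s_hw = mean_ci(system_days, T_95_DF4)

print(f"model : {m_mean:.1f} +/- {m_hw:.1f}")
print(f"system: {s_mean:.1f} +/- {s_hw:.1f}")

# If the two intervals overlap substantially, the model's throughput is at
# least statistically consistent with the system's; a clear separation is a
# warning that the model (or the data) needs another look.
```

The same `mean_ci` helper applies in Step 6, where a confidence interval should be constructed across independent replications rather than trusting the output of a single run.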
• Sensitivity analyses should be performed on the programmed model to see which model factors have the greatest impact on the performance measures and, thus, have to be modeled carefully [see Law and Kelton (2000, pp. 278-279)].

3.6 Step 6. Design, Conduct, and Analyze Simulation Experiments

• For each system configuration of interest, decide on tactical issues such as simulation run length, length of the warmup period (generally necessary if the steady-state behavior of a system is of interest), and the number of independent model replications. A major pitfall here is to make one replication of the simulation model of some arbitrary length and then to assume that the resulting output statistics are, in fact, the true performance measures for the model. We recommend that a confidence interval be constructed for a performance measure of interest.
• Analyze the results and decide if additional experiments are required.

3.7 Step 7. Document and Present the Simulation Results

• The documentation for the model (and the associated simulation study) should include the conceptual model (critical for future reuse of the model, which is particularly important in the defense community where most analyses are done using legacy models), a detailed description of the computer program, and the results/conclusions for the current study.
• The final presentation for the simulation study should include animations and a discussion of the model building/validation process to promote model credibility.

4 PITFALLS IN SIMULATION MODELING

We discuss seventeen critical pitfalls in simulation modeling, which are grouped into four categories.

4.1 Modeling and Validation

• Failure to have a well-defined set of objectives at the beginning of the study
• Misunderstanding of simulation by management
• Failure to communicate with the decision-maker on a regular basis
• Failure to collect good system data
• Inappropriate level of model detail – this is one of the most common errors, particularly among new analysts
• Treating a simulation study as if it were primarily an exercise in computer programming
• Lack of knowledge of simulation methodology and also of probability and statistics

4.2 Simulation Software

• Inappropriate simulation software – either too inflexible or too difficult to use
• Belief that so-called “easy-to-use software” requires a lower level of technical competence – regardless of the software used, one still has to deal with such issues as problem formulation, what data to collect, model validation, etc.
• “Blindly” using software without understanding its underlying assumptions, which might be poorly documented
• Misuse of animation – making an important decision about the system of interest based primarily on viewing an animation for a short period of time, rather than on the basis of a careful statistical analysis of the simulation output data

4.3 Modeling System Randomness

• Replacing an input probability distribution by its mean
• Incorrect choice of input probability distributions – normal or uniform distributions will rarely be correct
• Cavalier use of the triangular distribution when system data could be collected – triangular distributions cannot accurately represent a source of randomness whose density function has a long right tail, a common situation in practice

4.4 Design and Analysis of Simulation Experiments

• Misinterpretation of simulation results – treating simulation output statistics as if they were the true model performance measures
• Failure to have a warmup period when the steady-state behavior of the system is of interest
• Analyzing (correlated) output data from one replication of a simulation model using formulas that assume independence – variances might be grossly underestimated

5 REFERENCES

Banks, J., J. S. Carson, B. L. Nelson, and D. M. Nicol. 2001. Discrete-Event System Simulation, Third Edition, Prentice-Hall, Upper Saddle River, N.J.
Law, A. M. and W. D. Kelton. 2000. Simulation Modeling and Analysis, Third Edition, McGraw-Hill, New York.
AUTHOR BIOGRAPHY