1
INTRODUCTION TO SIMULATION
IE 324 SIMULATION
2
WHAT IS SIMULATION?
To feign, to obtain the essence of, without reality. [Webster's Collegiate Dictionary]
The imitation of the operation of a real-world process or system over time. [Banks et al. (2005)]
The process of designing a logical or mathematical model of a real system and then conducting computer-based experiments with the model to describe, explain, and predict the behaviour of the real system. [Taylor (1984)]
3
WAYS TO STUDY A SYSTEM
System:
- Experiment with the actual system
- Experiment with a model of the system
  - Physical model
  - Abstract model
    - Analytical model
    - Simulation
4
INPUT/OUTPUT PROCESS
Real-life system: input (X) → process Y = f(X) → output (Y)
Simulation model: decision variables & parameters (X) → Y = f(X) → system response (Y)
5
EXAMPLE: HEALTH CENTRE
Inputs: number of doctors, capacity of equipment, arrival rate
→ SIMULATION MODEL OF HEALTH CENTRE →
Outputs: time in the system, utilization of doctors, waiting time
6
EXAMPLE: SERIAL PRODUCTION LINE
Stations 1, 2, 3, …, N in series
Inputs: length of the line, size of buffers, processing times
→ SIMULATION MODEL OF PRODUCTION LINE →
Outputs: throughput, interdeparture time variability, utilizations
7
ORIGIN OF SIMULATION
Its origins lie in statistical sampling theory, e.g., random numbers and random sampling (before the Second World War)
Monte Carlo simulation (during the Second World War)
Modern applications (after the Second World War)
8
POPULARITY OF SIMULATION
Consistently ranked as the most useful and popular tool in the broader area of operations research / management science
9
SYSTEM
A system is a group of objects (or elements) that are joined together in some regular interaction toward the accomplishment of some stated objective or purpose.
10
COMPONENTS OF A SYSTEM
Entity: an object of interest in the system (one that requires an explicit representation in the system model)
Example (Health Centre): patients, doctors & nurses, rooms & beds, lab equipment, X-ray machine, etc.
11
COMPONENTS OF A SYSTEM
Attribute: a characteristic of an entity
Example (Patient): type of illness, age, sex, etc.
12
SYSTEM STATE
A collection of variables that contains all the information necessary to describe the system at any time
Example (Health Centre): number of patients in the system, status of doctors (busy or idle), number of idle doctors, status of lab equipment, etc.
13
EVENT
An instantaneous occurrence that may change the state of the system
Example (Health Centre): arrival of a new patient, completion of a service (i.e., an examination), failure of medical equipment, etc.
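As a minimal illustrative sketch (hypothetical Java names, not part of the original slides), the system-state and event notions above could be represented for the health centre like this:

class HealthCentreState {
    // System state: all information needed to describe the health centre at any time
    int numberOfPatientsInSystem;
    int numberOfIdleDoctors;
    boolean labEquipmentOperational;
}

// An event is an instantaneous occurrence that may change the state
enum HealthCentreEvent { PATIENT_ARRIVAL, SERVICE_COMPLETION, EQUIPMENT_FAILURE }

In a simulation, the handler for each event type would update these state variables when the event occurs.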
14
VARIABLES
Relevant variables: affect the system performance; further classified as exogenous or endogenous
Irrelevant variables: do not affect the system performance
15
EXOGENOUS VARIABLES (Input variables)
External to the model (i.e., they exist independently of the model)
Controllable variables (decision variables): can be manipulated to an extent by the DM
Uncontrollable variables (parameters): cannot be manipulated
16
ENDOGENOUS VARIABLES (Output variables)
Internal to the model; functions of the exogenous variables and the model dynamics
Examples: performance measures, state variables
17
BE CAREFUL!!!
Classification of relevant vs. irrelevant variables depends on:
- Purpose of the study
- Scope (level of detail)
Classification of controllable vs. uncontrollable variables depends on:
- Purpose of the study (existing vs. new system)
- Resources that are under the control of the DM
18
MODEL
Why do we study a system?
- To design new systems
- To improve system performance
- To solve problems affecting the system performance
Why do we need a model?
- The system does not exist (i.e., it is still at the conceptual stage)
- It is impractical or too costly to experiment with the actual system
19
MODEL
A representation of a system for the purpose of studying the system. [Banks et al. (2005)]
20
CLASSIFICATION OF MODELS
Physical models: resemble the real system physically (a small-scale representation)
Abstract models: use symbolic notation & mathematical equations to represent a system, e.g.,
BEGIN; EI = BI + PROD - DEMAND; END;
21
ABSTRACT MODELS
Prescriptive (normative) models: used to formulate & solve a problem. Examples: linear programming, dynamic programming
Descriptive models: used to describe the system behaviour. Examples: simulation, queueing models
22
ABSTRACT MODELS
Analytical models: employ the deductive reasoning of mathematics to solve the model. Examples: queueing models, differential calculus
Numerical models: employ computational procedures to solve the mathematical model. Examples: simulation, linear programming
23
ABSTRACT MODELS
Stochastic models: contain one or more random variables. Examples: simulation, stochastic programming
Deterministic models: do not contain random variables. Examples: LP, MIP and DP, simulation
24
ABSTRACT MODELS
Static models: represent the system at a particular point in time. Examples: many optimization models covered in our curriculum, Monte Carlo simulation (see the sketch below)
Dynamic models: represent the system as it changes over time. Examples: simulation, dynamic programming, control models, queueing models
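To illustrate the static case: a Monte Carlo simulation samples repeatedly without tracking the system over time. A minimal sketch (my own example, not from the slides) estimates pi by sampling points in the unit square:

import java.util.Random;

class MonteCarloPi {
    public static void main(String[] args) {
        Random rng = new Random(42);            // fixed seed so the run is reproducible
        int trials = 1_000_000, hits = 0;
        for (int i = 0; i < trials; i++) {
            double x = rng.nextDouble(), y = rng.nextDouble();
            if (x * x + y * y <= 1.0) hits++;   // point lies inside the quarter circle
        }
        System.out.println("pi is approximately " + 4.0 * hits / trials);
    }
}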
25
ABSTRACT MODELS
Discrete models: variables may take only a limited or specified set of values. Examples: integer programming, simulation
Continuous models: variables may take on the value of any real number. Examples: simulation, queueing models
26
CHARACTERISTICS OF SIMULATION
Abstract, numerical, descriptive, deterministic/stochastic, static/dynamic, discrete/continuous
27
ANALYTICAL VS. SIMULATION
Use an analytical model whenever possible (see the M/M/1 sketch after this list).
Use simulation when:
1) A complete mathematical formulation does not exist or an analytical solution cannot be developed
2) Analytical methods are available, but the mathematical procedures are so complex that simulation provides a simpler solution
3) It is desired to observe a simulated history of the process over a period of time in addition to estimating certain system performance measures
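For instance (my own example, not from the slide), a single-server M/M/1 queue has closed-form analytical results, so simulation is unnecessary for it. The sketch below uses the EXPO(40) interarrival and EXPO(30) service times that appear in the SIMAN example later in this deck:

class MM1Analytical {
    public static void main(String[] args) {
        double lambda = 1.0 / 40.0;          // arrival rate: one patient every 40 minutes on average
        double mu = 1.0 / 30.0;              // service rate: 30-minute examinations on average
        double rho = lambda / mu;            // server (doctor) utilization
        double w  = 1.0 / (mu - lambda);     // expected time in system
        double wq = rho / (mu - lambda);     // expected waiting time in queue
        double l  = lambda * w;              // expected number in system (Little's law)
        System.out.printf("rho=%.2f  W=%.1f  Wq=%.1f  L=%.2f%n", rho, w, wq, l);
    }
}

A stochastic simulation of the same clinic should produce estimates close to these values, but only as random output that needs statistical analysis.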
28
CAPABILITIES OF SIMULATION
- Time compression and expansion
- Explains "why?"
- Allows exploration of possibilities ("What if?")
- Helps diagnose problems & identify constraints
- Requires fewer assumptions
- Handles randomness and uncertainty
- Handles dynamic behaviour
- Flexible and easy to change
- Credible, and results are easier to explain
29
LIMITATIONS OF SIMULATION
- "Run" rather than "solve"
- Random output is obtained from stochastic simulations (statistical analysis of the output is required)
- Cannot generate an optimal solution on its own
- Requires specialized training (probability, statistics, computer programming, modelling, systems analysis, simulation methodology)
- Costly (software and hardware)
30
SIMULATION APPLICATIONS
31
STEPS IN A SIMULATION STUDY
1. Problem formulation
2. Setting of objectives and overall project plan
3. Model conceptualization (in parallel with data collection)
4. Data collection
5. Model translation
6. Verified? If no, return to model translation
7. Validated? If no, return to model conceptualization and data collection
8. Experimental design
9. Production runs and analysis
10. More runs? If yes, return to experimental design
11. Documentation and reporting
12. Implementation
32
PROBLEM FORMULATION
A statement of the problem such that:
- The problem is clearly understood by the simulation analyst
- The formulation is clearly understood by the client
- It addresses the problem, not the symptoms
Criteria for selecting a problem:
- Technical, economic and political feasibility
- Perceived urgency for a solution
33
SETTING OBJECTIVES & PROJECT PLAN (PROJECT PROPOSAL)
- Determine the questions that are to be answered
- Identify scenarios to be investigated
- Scope (level of detail)
- Determine the end user
- Determine data requirements
- Determine hardware, software, & personnel requirements
- Prepare a time plan
- Cost plan and billing procedure
34
MODEL DEVELOPMENT
Real-world system → Conceptual model → Logical model → Simulation model
35
CONCEPTUAL MODEL
Real-world system → subsystem of interest → conceptual model
36
CONCEPTUAL MODEL
- Questions to be answered
- Scope (level of detail)
- Performance measures
- Events, entities, attributes, exogenous variables, endogenous variables, & their relationships
- Data requirements
37
SCOPE
The more detailed the model, the more representative it is of the actual system (if the modelling is done correctly).
A more detailed model:
- requires more time and effort
- requires longer simulation runs
- is more likely to contain errors
38
[Diagrams: accuracy of the model and cost of the model, each plotted against scope & level of detail]
39
SCOPE
Novice modeller: tends toward too much detail
Experienced modeller: tends toward greater detail
Remember: KISS (Keep It Simple)
40
LEVELS OF DETAIL (increasing scope)
1. Evaluate whether the candidate systems work
2. Compare two or more systems to determine the better one(s)
3. Accurately predict the performance of the selected system
41
LOGICAL (FLOWCHART) MODEL
Shows the logical relationships among the elements of the model
[Flowchart: Start → Read data → Check → Set new event → Generate data → Calculate statistics → Check → Print → Stop]
42
SIMULATION (OPERATIONAL) MODEL
The model that executes the logic contained in the flowchart model (coding)
General-purpose languages. Examples: FORTRAN, C, PASCAL
Special-purpose languages & environments. Examples: SIMAN, ARENA, AUTOSIM
43
SIMAN MODEL
--- MODEL FILE ---
BEGIN;
  CREATE,1:EXPO(40):MARK(1);
  QUEUE,1;
  SEIZE:DOCTOR;
  DELAY:EXPO(30);
  TALLY:1,INT(1);
  RELEASE:DOCTOR;
  COUNT:1:DISPOSE;
END;
--- EXPERIMENT FILE ---
PROJECT,HEALTH_CENTRE,IHSA SABUNCUOGLU,24/1/2000;
DISCRETE,100,1,1;
RESOURCES:1,DOCTOR;
DSTATS:1,NQ(1),NUMBER_IN_QUEUE:
       2,NR(1),DOCTOR_UTILIZATION;
TALLIES:1,TIME_IN_HEALTH_CENTRE;
COUNTERS:1,NO_OF_PATIENTS_SERVED;
44
ARENA MODEL
45
Java Model
// Loop until the first "TotalCustomers" customers have departed
while (NumberOfDepartures < TotalCustomers) {
    Event evt = (Event) FutureEventList.getMin();  // get the imminent event
    FutureEventList.dequeue();                     // remove it from the future event list
    Clock = evt.get_time();                        // advance the simulation clock
    if (evt.get_type() == arrival)
        ProcessArrival(evt);
    else
        ProcessDeparture(evt);
}
ReportGeneration();
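The loop assumes an Event type and a future event list ordered by event time. A minimal sketch of one way to back them (hypothetical, not necessarily the course's actual classes) uses java.util.PriorityQueue keyed on the event time:

import java.util.PriorityQueue;

class Event implements Comparable<Event> {
    static final int arrival = 0, departure = 1;   // event type codes used by the loop above
    private final double time;                     // simulation time at which the event occurs
    private final int type;                        // arrival or departure
    Event(double time, int type) { this.time = time; this.type = type; }
    double get_time() { return time; }
    int get_type() { return type; }
    public int compareTo(Event other) { return Double.compare(time, other.time); }
}

class FutureEventList {
    private final PriorityQueue<Event> events = new PriorityQueue<>();
    void enqueue(Event e) { events.add(e); }   // schedule a future event
    Event getMin() { return events.peek(); }   // imminent event (smallest time stamp)
    void dequeue() { events.poll(); }          // remove the imminent event
}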
46
DATA COLLECTION & ANALYSIS
The client often collects the data & submits it in electronic format.
The simulation analyst:
- Determines the random variables
- Determines the data requirements
- Analyses the data
- Fits distribution functions (see the sketch below)
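As one small illustration of the last step (an assumption on my part, not a procedure given in the slides), fitting an exponential distribution to interarrival-time data amounts to estimating its rate as the reciprocal of the sample mean, which is the maximum-likelihood estimate:

class ExponentialFit {
    public static void main(String[] args) {
        // Hypothetical interarrival times collected at the health centre, in minutes
        double[] data = {12.4, 55.1, 33.0, 8.7, 41.9, 27.3};
        double sum = 0.0;
        for (double t : data) sum += t;
        double mean = sum / data.length;
        double lambdaHat = 1.0 / mean;        // MLE of the exponential rate parameter
        System.out.printf("sample mean = %.2f min, estimated rate = %.4f per min%n", mean, lambdaHat);
    }
}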
47
VERIFICATION AND VALIDATION
Verification: the process of determining whether the operational logic of the model is correct.
Validation: the process of determining whether the model is an accurate representation of the system.
48
VERIFICATION AND VALIDATION
Real-world system ↔ Conceptual model: VALIDATION
Logical model ↔ Simulation model: VERIFICATION
49
EXPERIMENTAL DESIGN
- Alternative scenarios to be simulated
- Type of output data analysis (steady-state vs. transient-state analysis)
- Number of simulation runs
- Length of each run
- Initialization
- Variance reduction
50
ANALYSIS OF RESULTS
- Determine the number of simulation runs necessary to estimate the performance measures (see the sketch below)
- Statistical tests for significance and ranking
- Interpretation of results
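For example (a generic sketch with made-up numbers, not the course's prescribed analysis), the mean of a performance measure across independent replications is usually reported with a confidence interval of half-width t*s/sqrt(n):

class ReplicationAnalysis {
    public static void main(String[] args) {
        // Hypothetical average waiting times (minutes) from 5 independent replications
        double[] y = {91.2, 87.5, 95.0, 89.8, 92.4};
        int n = y.length;
        double sum = 0.0;
        for (double v : y) sum += v;
        double mean = sum / n;
        double ss = 0.0;
        for (double v : y) ss += (v - mean) * (v - mean);
        double s = Math.sqrt(ss / (n - 1));   // sample standard deviation
        double t = 2.776;                     // t quantile for 95% confidence, n-1 = 4 degrees of freedom
        double halfWidth = t * s / Math.sqrt(n);
        System.out.printf("mean = %.2f, 95%% CI = %.2f +/- %.2f%n", mean, mean, halfWidth);
    }
}

If the half-width is too wide for the study's purpose, more replications are needed.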
51
DOCUMENTATION
- General model logic
- Key elements of the model
- Data structures
- Alternative scenarios
- Performance measures or criteria used
- Results of experiments
- Recommendations
52
IMPLEMENTATION
Success or failure?