Presentation on theme: "Graduate Program in Engineering and Technology Management"— Presentation transcript:

1 Graduate Program in Engineering and Technology Management
Input Modeling (Simulation 4), Aslı Sencer

2 Steps of input modeling
1. Collect data from the real system of interest. This requires substantial time and effort; use expert opinion when sufficient data are not available.
2. Identify a probability distribution to represent the input process: draw frequency distributions and histograms, then choose a family of theoretical distributions.
3. Estimate the parameters of the selected distribution.
4. Apply goodness-of-fit tests (chi-square test, Kolmogorov-Smirnov test) to evaluate the chosen distribution and its parameters. If the fit is not justified, choose a new theoretical distribution and return to step 3. If all theoretical distributions fail, either use an empirical distribution or recollect data.

3 Step 1: Data Collection includes lots of difficulties
- Interarrival-time distributions may be nonhomogeneous: the distribution changes with the time of day, the day of the week, etc. You can't merge all these data for distribution fitting!
- Two arrival processes might be dependent, such as the demands for washing machines and dryers. You shouldn't treat them separately!
- The start and end of a service duration might not be clear. You should split the service into well-defined processes!
- Machines may break down randomly. You should collect data for both up and down times!

4 Step 2.1: Identify the Probability Distribution
Raw data on the number of arrivals per period (e.g., 10, 8, 5, 1, 6, 4, 2, 3, 9, 7, 11, ...) are tallied into a frequency table (arrivals per period vs. observed frequency) and displayed as a histogram of discrete data.

5 Step 2.1: Identify the Probability Distribution
Raw continuous data on component life in days (e.g., 79.919, 3.081, 0.062, 1.961, 5.845, 3.027, ...) are grouped into class intervals of width 3 days ([0,3): 23, [3,6): 10, [6,9): 5, [9,12): 1, ...) and displayed as a histogram of continuous data.

6 Step 2.2: Selecting the family of distributions
The purpose of preparing a histogram is to infer a known pdf or pmf. This theoretical distribution is then used to generate random variates, such as interarrival times and service times, during simulation runs. The exponential, normal, and Poisson distributions are frequently encountered and are not difficult to analyze, while the beta, gamma, and Weibull families provide a wide variety of shapes.
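A minimal Python sketch (not from the lecture) of steps 2 and 3: draw a histogram of a sample and overlay a candidate density fitted with SciPy. The synthetic `data` array stands in for the real component-life sample; all names and values are illustrative.

```python
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
data = rng.exponential(scale=8.0, size=50)      # placeholder for the real sample

# Step 2.1: frequency distribution / histogram
counts, edges = np.histogram(data, bins=10)

# Steps 2.2 and 3: choose a family and estimate its parameters from the sample
loc, scale = stats.expon.fit(data, floc=0)      # exponential fit, location fixed at 0

x = np.linspace(0, data.max(), 200)
plt.hist(data, bins=edges, density=True, alpha=0.5, label="sample")
plt.plot(x, stats.expon.pdf(x, loc=loc, scale=scale), label="fitted exponential")
plt.xlabel("component life (days)")
plt.legend()
plt.show()
```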

7 Applications of Exponential Distribution
Used to model the time between independent events, such as arrivals or breakdowns. Inappropriate for modeling process delay times.

9 Applications of Poisson Distribution
A discrete distribution, used to model the number of independent events occurring per unit time, e.g., batch sizes of customers and items. If the time between successive events is exponential, then the number of events in a fixed time interval is Poisson.
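A small Python sketch (illustrative, not from the slides) of the stated link: generate exponential interarrival times with rate λ and check that the counts per unit period behave like Poisson(λ). The rate 3.64 is borrowed from the later example; everything else is an assumption.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
lam = 3.64                                   # arrivals per period
interarrivals = rng.exponential(1 / lam, size=100_000)
arrival_times = np.cumsum(interarrivals)

horizon = int(arrival_times[-1])             # number of whole periods covered
counts = np.histogram(arrival_times, bins=np.arange(horizon + 1))[0]

print("empirical P(N=3):", np.mean(counts == 3))
print("Poisson   P(N=3):", stats.poisson.pmf(3, lam))
```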

12 Applications of Beta Distribution:
Often used as a rough model in the absence of data. Represents random proportions. Can be transformed into a scaled beta sample on (a, b) via Y = a + (b - a)X.
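A minimal sketch, assuming illustrative shape parameters and an illustrative interval (a, b), of the scaling Y = a + (b - a)X mentioned above:

```python
import numpy as np

rng = np.random.default_rng(2)
alpha1, alpha2 = 2.0, 5.0                 # illustrative beta shape parameters
a, b = 10.0, 40.0                         # target interval, e.g. a task-time range

x = rng.beta(alpha1, alpha2, size=1000)   # raw proportions in (0, 1)
y = a + (b - a) * x                       # scaled beta sample on (a, b)
print(y.min(), y.max(), y.mean())
```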

14 Applications of Erlang Distribution
Used to represent the time required to complete a task that can be represented as the sum of k exponentially distributed durations. For large k, the Erlang approaches the normal distribution; for k = 1, it is the exponential distribution with rate 1/β. It is the special case of the gamma distribution in which α, the shape parameter of the gamma distribution, equals the integer k.
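A short sketch (illustrative values of k and β) confirming that an Erlang-k duration is the sum of k exponential phases, and that it coincides with a gamma variate of integer shape:

```python
import numpy as np

rng = np.random.default_rng(3)
k, beta = 4, 2.5                           # shape k, mean beta per phase

phases = rng.exponential(beta, size=(10_000, k))
erlang_sum = phases.sum(axis=1)            # sum of k exponential durations
erlang_direct = rng.gamma(shape=k, scale=beta, size=10_000)  # same family

print(erlang_sum.mean(), erlang_direct.mean())   # both close to k * beta
```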

15 Applications of Gamma Distribution
Used to represent the time required to complete a task. It is the same as the Erlang distribution when the shape parameter α is an integer.

16 Applications of Johnson Dist.
Its flexible domain, which can be bounded or unbounded, allows it to fit many data sets. If δ > 0, the domain is bounded; if δ < 0, the domain is unbounded.

17 Applications of Lognormal Distribution
Used to represent quantities that are the product of a large number of random quantities. Also used to represent task times that are skewed to the right. If X ~ LOGN(·, ·), then ln X ~ NORM(μ, σ).
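A minimal sketch, with illustrative μ and σ, of the relation above: taking logarithms of a lognormal sample recovers a normal sample with those parameters.

```python
import numpy as np

rng = np.random.default_rng(4)
mu, sigma = 1.0, 0.5                       # parameters of ln X (illustrative)
x = rng.lognormal(mean=mu, sigma=sigma, size=100_000)

log_x = np.log(x)
print(log_x.mean(), log_x.std())           # approximately mu and sigma
print(x.mean(), np.exp(mu + sigma**2 / 2)) # lognormal mean on the original scale
```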

19 Applications of Weibull Distribution
Widely used in reliability models to represent lifetimes. If a system consists of a large number of parts that fail independently, the time between successive failures can be Weibull. Also used to model nonnegative task times that are skewed to the left. It reduces to the exponential distribution when the shape parameter equals 1.
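A short check (illustrative scale value) that a Weibull density with shape parameter 1 equals the exponential density:

```python
import numpy as np
from scipy import stats

scale = 5.0
x = np.linspace(0.01, 20, 5)
print(stats.weibull_min.pdf(x, c=1.0, scale=scale))   # Weibull with shape c = 1
print(stats.expon.pdf(x, scale=scale))                # identical density values
```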

20 Applications of Continuous Empirical Distribution
Used to incorporate empirical data as an alternative to a theoretical distribution when there are multiple modes, significant outliers, etc.

21 Applications of Discrete Empirical Distribution
Used for discrete assignments such as job type, visitation sequence or batch size
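A minimal sketch of sampling from a discrete empirical distribution; the job-type categories and their probabilities below are hypothetical, chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
job_types = np.array(["A", "B", "C"])
probs = np.array([0.5, 0.3, 0.2])          # observed relative frequencies

sample = rng.choice(job_types, size=1000, p=probs)
values, counts = np.unique(sample, return_counts=True)
print(dict(zip(values, counts / len(sample))))   # close to the input probabilities
```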

22 Step 3: Estimate the parameters of the selected distribution
A theoretical distribution is specified by its parameters, which are obtained from the whole population data. Example: let V, W, X, Y, Z be random variables; then
V ~ N(µ, σ²), where µ is the mean and σ² is the variance;
W ~ Poisson(λ), where λ is the mean;
X ~ Exponential(β), where β is the mean;
Y ~ Triangular(a, m, b), where a, m, b are the minimum, mode, and maximum of the data;
Z ~ Uniform(a, b), where a and b are the minimum and maximum of the data.
These parameters are estimated using point estimators defined on the sample data.

23 Step 3: Estimate the parameters of the selected distribution
The sample mean and the sample variance are the point estimators for the population mean and population variance. Let Xi, i = 1, 2, ..., n, be iid random variables (the raw data are known); then the sample mean X̄ and sample variance s² are calculated as
X̄ = (1/n) Σ Xi  and  s² = (Σ Xi² - n X̄²) / (n - 1).
These are applied directly to the discrete raw data (arrivals per period) and the continuous raw data (component lives in days) shown on the earlier slides.
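A small sketch applying these point estimators to the handful of discrete raw values readable on the slide (treating them as a complete sample is only for illustration):

```python
import numpy as np

data = np.array([10, 8, 5, 1, 6, 4, 2, 3, 9, 7, 11])   # discrete raw values from the slide
x_bar = data.mean()                        # sample mean, estimator of the population mean
s2 = data.var(ddof=1)                      # sample variance with the n-1 divisor
print(x_bar, s2)
```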

24 Step 3: Estimate the parameters of the selected distribution
If the data are discrete and have been grouped in a frequency distribution (i.e., the raw data are not known), then
X̄ = (1/n) Σ fj Xj  and  s² = (Σ fj Xj² - n X̄²) / (n - 1),
where k is the number of distinct values of X and fj, j = 1, 2, ..., k, is the observed frequency of the value Xj of X. (This applies to the arrivals-per-period frequency table shown earlier.)

25 Step 3: Estimate the parameters of the selected distribution
If the data are discrete or continuous and have been grouped in class intervals (i.e., the raw data are not known), then approximately
X̄ ≈ (1/n) Σ fj mj  and  s² ≈ (Σ fj mj² - n X̄²) / (n - 1),
where fj, j = 1, 2, ..., c, is the observed frequency of the jth class interval and mj is the midpoint of the jth interval. (This applies to the component-life class-interval table shown earlier.)
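A sketch of the grouped-data estimators, using the first few component-life class intervals whose frequencies are readable on the slide; treating them as the whole sample is an illustrative assumption.

```python
import numpy as np

edges = np.array([0, 3, 6, 9, 12])         # class interval boundaries (days)
freq = np.array([23, 10, 5, 1])            # observed frequencies f_j
mid = (edges[:-1] + edges[1:]) / 2         # interval midpoints m_j

n = freq.sum()
x_bar = (freq * mid).sum() / n
s2 = ((freq * mid**2).sum() - n * x_bar**2) / (n - 1)
print(x_bar, s2)
```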

26 Step 3: Estimate the parameters of the selected distribution
The minimum, mode (i.e., the data value with the highest frequency), and maximum of the population data are estimated from the sample data as â = min Xi, m̂ = Xt, and b̂ = max Xi, where Xt is the data value that has the highest frequency.
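A minimal sketch, on made-up data, of estimating the Triangular(a, m, b) parameters as the sample minimum, the most frequent value, and the sample maximum:

```python
import numpy as np

data = np.array([2, 5, 5, 3, 7, 5, 4, 6, 5, 3])   # illustrative sample
values, counts = np.unique(data, return_counts=True)

a_hat = data.min()
b_hat = data.max()
m_hat = values[np.argmax(counts)]          # X_t, the most frequent value
print(a_hat, m_hat, b_hat)
```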

27 Step 4: Goodness of fit test
Goodness-of-fit tests (GFTs) provide helpful guidance for evaluating the suitability of the selected input model as a simulation input. GFTs check the discrepancy between the empirical and the selected theoretical distribution to decide whether the sample could have been drawn from that theoretical distribution. The role of the sample size n: if n is small, GFTs are unlikely to reject any theoretical distribution, since the discrepancy is attributed to sampling error; if n is large, GFTs are likely to reject almost all distributions.

28 Step 4: Goodness of fit tests Chi square test
The chi-square test is valid for large sample sizes, for both discrete and continuous distributional assumptions, when parameters are estimated by maximum likelihood. Hypothesis test:
H0: the random variable X conforms to the theoretical distribution with the estimated parameters.
Ha: the random variable does NOT conform to the theoretical distribution with the estimated parameters.
We need a test statistic to either reject or fail to reject H0. This test statistic should measure the discrepancy between the theoretical and the empirical distribution. If the test statistic is large, H0 is rejected; otherwise we fail to reject H0 (hence we accept H0).

29 Step 4: Goodness of fit tests Chi square test
Test statistic: arrange the n observations into a set of k class intervals or cells. The test statistic is
χ0² = Σ (Oi - Ei)² / Ei,  summed over i = 1, ..., k,
where Oi is the observed frequency in the ith class interval and Ei = n·pi is the expected frequency in the ith class interval, with pi the theoretical probability associated with the ith class, i.e., pi = P(the random variable X belongs to the ith class).
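A short sketch of the statistic χ0² = Σ (Oi - Ei)² / Ei with Ei = n·pi; the observed counts and class probabilities below are illustrative, not the lecture's data.

```python
import numpy as np

observed = np.array([12, 10, 19, 17, 22, 20])      # O_i, illustrative counts
p = np.array([0.10, 0.15, 0.20, 0.20, 0.20, 0.15]) # theoretical p_i, sums to 1
n = observed.sum()

expected = n * p                                   # E_i = n * p_i
chi2_0 = ((observed - expected) ** 2 / expected).sum()
print(chi2_0)
```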

30 Step 4: Goodness of fit tests Chi square test
Recommendations for the number of class intervals for continuous data are given below. It is suggested that each expected frequency Ei be at least 5; in case it is smaller, that class should be combined with adjacent classes, the corresponding Oi values should also be combined, and k should be reduced by one for every combined cell.

Sample size n    Number of class intervals k
20               Do not use the chi-square test
50               5 to 10
100              10 to 20
>100             √n to n/5

31 Step 4: Goodness of fit tests Chi square test
Evaluation: let α = P(rejecting H0 when it is true); for example, the significance level is 5%. The test statistic χ0² approximately follows the chi-square distribution with k - s - 1 degrees of freedom, where s is the number of estimated parameters. If χ0² > χ²α,k-s-1 (equivalently, if the p-value of the test statistic is less than α), reject H0 and the distribution; otherwise, fail to reject H0. The region below the critical value is "fail to reject H0", and the region above it is "reject H0".

32 Chi-square distribution table
The table gives the critical values χ²α,k-s-1, indexed by the degrees of freedom (k - s - 1) in the rows and the significance level α in the columns.

33 Step 4: GFT - chi-square test, example: Poisson distribution
Consider the discrete data analyzed in Step 2.
H0: number of arrivals X ~ Poisson(λ = 3.64); Ha: otherwise.
Here λ is the mean arrival rate, estimated as λ̂ = 3.64. The following probabilities are found using the pmf:
P(0) = 0.026   P(6) = 0.085
P(1) = 0.096   P(7) = 0.044
P(2) = 0.174   P(8) = 0.020
P(3) = 0.211   P(9) = 0.008
P(4) = 0.192   P(10) = 0.003
P(5) = 0.140   P(X ≥ 11) = 0.001

34 Step 4: GFT - chi-square test, example: Poisson distribution
The chi-square test statistic is calculated with k - s - 1 = 7 - 1 - 1 = 5 degrees of freedom and α = 0.05 (k = 7 class intervals remain after adjacent classes with small expected frequencies are combined). Since the computed χ0² exceeds the critical value χ²0.05,5 = 11.07, H0 is rejected!
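A sketch (Python/SciPy, not part of the lecture) that reproduces the Poisson(3.64) class probabilities from the previous slide and the critical value used in the decision; the observed frequencies needed to recompute the statistic itself are not reproduced here.

```python
import numpy as np
from scipy import stats

lam = 3.64
k_values = np.arange(0, 11)
p = stats.poisson.pmf(k_values, lam)       # P(0), P(1), ..., P(10)
p = np.append(p, 1 - p.sum())              # last class: P(X >= 11)
print(np.round(p, 3))                      # matches 0.026, 0.096, 0.174, ...

critical = stats.chi2.ppf(0.95, df=5)      # chi-square critical value, 5 d.o.f.
print(critical)                            # about 11.07; reject H0 if chi2_0 > 11.07
```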

35 Step 4: GFT - chi-square test, example: Arena Input Analyzer
Distribution Summary
Distribution: Normal
Expression: NORM(225, 89)
Square Error:
Chi Square Test
Number of intervals = 12
Degrees of freedom = 9
Test Statistic = 1.22e+004
Corresponding p-value <
Data Summary
Number of Data Points =
Min Data Value = 1
Max Data Value = 1.88e+003
Sample Mean = 225
Sample Std Dev = 89
Histogram Summary
Histogram Range = to 1.88e+003
Number of Intervals = 40
Reject the Normal distribution at the 5% significance level!
The Fit All summary lists the square error for each candidate function: Normal, Gamma, Beta, Erlang, Weibull, Lognormal, Exponential, Triangular, Uniform.

36 Step 4: GFT - chi-square test, example: Arena Input Analyzer
Distribution Summary
Distribution: Lognormal
Expression: 2 + LOGN(145, 67.9)
Square Error:
Chi Square Test
Number of intervals = 4
Degrees of freedom = 1
Test Statistic = 207
Corresponding p-value <
Data Summary
Number of Data Points =
Min Data Value = 2
Max Data Value = 6.01e+003
Sample Mean = 146
Sample Std Dev = 79.5
Histogram Summary
Histogram Range = 2 to 6.01e+003
Number of Intervals = 40
Reject the Lognormal distribution at the 5% significance level!

37 Step 4: GFT - chi-square test, example: Arena Input Analyzer
Distribution Summary
Distribution: Weibull
Expression: WEIB(94.7, 0.928)
Square Error:
Chi Square Test
Number of intervals = 20
Degrees of freedom = 17
Test Statistic = 838
Corresponding p-value <
Data Summary
Number of Data Points =
Min Data Value = 1
Max Data Value = 1.47e+003
Sample Mean = 108
Sample Std Dev = 135
Histogram Summary
Histogram Range = to 1.47e+003
Number of Intervals = 40
Reject the Weibull distribution at the 5% significance level!

38 Step 4: Goodness of fit tests Drawbacks of Chi-square GFT
- The chi-square test uses estimates of the parameters obtained from the sample, which decreases the degrees of freedom.
- For continuous distributions, the chi-square test requires the data to be placed in class intervals; these classes are arbitrary and affect the value of the test statistic.
- The distribution of the chi-square test statistic is known only approximately, and the power of the test (the probability of rejecting an incorrect theoretical distribution) is sometimes low.
Hence other GFTs are also needed!

39 Step 4: Goodness of fit tests Kolmogorov-Smirnov test
Useful when the sample sizes are small and when no parameters have been estimated from the sample data. It compares the cdf of the theoretical distribution, F(x), with the empirical cdf, SN(x), of the sample of N observations. Hypothesis test:
H0: the data follow the selected distribution.
Ha: the data do NOT follow the selected distribution.
Test statistic: the largest deviation D between F(x) and SN(x).

40 Step 4: Goodness of fit tests Kolmogorov-Smirnov test
Steps of the K-S test:
1. Rank the data so that x(1) ≤ x(2) ≤ ... ≤ x(N).
2. Calculate the maximum discrepancy D between F and SN.

41 Step 4: Goodness of fit tests Kolmogorov-Smirnov test
If F is discrete, D = max over x of |F(x) - SN(x)|, where SN(x) is the proportion of the N observations that are less than or equal to x. If F is continuous, D = max(D+, D-), where
D+ = max over 1 ≤ i ≤ N of { i/N - F(x(i)) } and D- = max over 1 ≤ i ≤ N of { F(x(i)) - (i - 1)/N }.
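A minimal implementation sketch of the continuous-case formulas, applied to an illustrative exponential sample and its hypothesized cdf:

```python
import numpy as np
from scipy import stats

def ks_statistic(sample, cdf):
    """Return (D+, D-, D) for the Kolmogorov-Smirnov statistic."""
    x = np.sort(sample)
    n = len(x)
    i = np.arange(1, n + 1)
    f = cdf(x)
    d_plus = np.max(i / n - f)
    d_minus = np.max(f - (i - 1) / n)
    return d_plus, d_minus, max(d_plus, d_minus)

rng = np.random.default_rng(6)
sample = rng.exponential(scale=2.0, size=30)          # illustrative data
print(ks_statistic(sample, stats.expon(scale=2.0).cdf))
```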

42 Step 4: Goodness of fit tests Kolmogorov-Smirnov test
Evaluation: if the computed D exceeds the critical value Dα,N from the Kolmogorov-Smirnov table for significance level α and sample size N, reject H0; otherwise, fail to reject H0.

43 Step 4: Goodness of fit tests Example: Kolmogorov-Smirnov test
Consider the data: 0.44, 0.81, 0.14, 0.05, 0.93.
H0: the data are Uniform(0, 1); Ha: otherwise.

i                 1      2      3      4      5
x(i)              0.05   0.14   0.44   0.81   0.93
i/N               0.20   0.40   0.60   0.80   1.00
i/N - x(i)        0.15   0.26   0.16   -      0.07
x(i) - (i-1)/N    0.05   -      0.04   0.21   0.13

Since D = 0.26 < D0.05,5 = 0.565, H0 is not rejected: the data are consistent with Uniform(0, 1).
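The same example can be checked with SciPy's built-in test, which returns the statistic D = 0.26 computed above:

```python
from scipy import stats

data = [0.44, 0.81, 0.14, 0.05, 0.93]
result = stats.kstest(data, "uniform")     # H0: Uniform(0, 1)
print(result.statistic)                    # 0.26
print(result.pvalue)                       # well above 0.05, so do not reject H0
```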

