1 Spare part modelling – An introduction Jørn Vatn.


2 Motivation
For single-component optimization, the models (wrt  ) indicate that it might be beneficial to keep a spare in order to reduce the MDT, and hence the cost of a failure
If we only have one component, we can compare the situations with and without a spare, and find the best solution
In case of many components, there will be a “competition” for the spare in case of simultaneous failures
Thus, we may consider keeping more than one spare in stock
What is the optimal number of spares to keep?

3 Content
Situation 1
– Only one maintenance base, where failed components obtain a spare from the stock if available
– The failed component is repaired in a workshop; there is an infinite number of repair men
Situation 2
– Components can no longer be repaired, and at a critical stock size, n, we order m new spares
– There is a lead time before spares arrive
– Lead time is gamma distributed
What we do not cover
– More than one maintenance base
– Several local stocks and one central stock
– Many other aspects

4 Model assumptions, situation 1
Constant rate of failure (total for many components) = λ
Number of spares = s
An inventory (stock) holds available spares
Failed components are repaired in the workshop
Number of components in the workshop = X
Repair rate for each failed component = μ
Infinite number of repair men

5 Modelling

6 Modelling, cont
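The modelling slides above are equation images that did not survive the transcript. The quantities computed by the Visual Basic code on the next slide are consistent with X, the number of components in repair, being Poisson distributed with mean λ/μ (a sketch, assuming the standard infinite-server steady-state result):

```latex
p(s)   = \Pr(X = s) = e^{-\lambda/\mu}\,\frac{(\lambda/\mu)^s}{s!}
\qquad
R(s)   = \Pr(X > s) = 1 - \sum_{x=0}^{s} p(x)
\qquad
EBO(s) = \mathrm{E}\!\left[(X - s)^+\right] = \sum_{x=s+1}^{\infty} (x - s)\,p(x)
```

Here EBO(s) is the expected number of backorders when s spares are kept in stock.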

7 Visual basic

Function PrepModel1(lambda As Single, mu As Single, _
                    sMax As Integer, p() As Single, _
                    R() As Single, EBO() As Single)
    ' p(s)   = probability of exactly s components in repair (Poisson)
    ' R(s)   = probability of more than s components in repair
    ' EBO(s) = expected number of backorders with s spares in stock
    Dim lm As Single
    Dim s As Integer
    lm = lambda / mu
    p(0) = Exp(-lm)
    R(0) = 1 - p(0)
    EBO(0) = lm
    For s = 0 To sMax - 1
        p(s + 1) = lm * p(s) / (s + 1)
        R(s + 1) = R(s) - p(s + 1)
        EBO(s + 1) = EBO(s) - R(s)
    Next s
End Function

8 Simple cost model

9 Visual basic

Function OptimizeModel1()
    Dim p(0 To 10) As Single
    Dim R(0 To 10) As Single
    Dim EBO(0 To 10) As Single
    Dim Cu As Single
    Dim Cs As Single
    Dim s As Integer
    Cu =        ' cost per expected backorder (value lost in the transcript)
    Cs = 1      ' cost per spare
    PrepModel1 0.01, 0.1, 10, p, R, EBO
    For s = 0 To 10
        Debug.Print s, Cs * s + Cu * EBO(s)
    Next s
End Function
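A Python sketch of the same search, using the cost model C(s) = Cs·s + Cu·EBO(s) from the slide above (Cu = 100 in the example call is an assumed value, since the original constant is garbled in the transcript; function names are mine):

```python
import math

def expected_backorders(lm, s_max):
    """EBO(s) for s = 0..s_max, with X ~ Poisson(lm) units in repair."""
    p = math.exp(-lm)          # P(X = 0)
    r = 1 - p                  # P(X > 0)
    ebo = [lm]                 # EBO(0) = E[X]
    for s in range(s_max):
        p = lm * p / (s + 1)   # P(X = s+1)
        ebo.append(ebo[s] - r) # EBO(s+1) = EBO(s) - P(X > s)
        r -= p                 # P(X > s+1)
    return ebo

def optimal_spares(lambda_, mu, cs, cu, s_max=10):
    """Return (s*, costs) where s* minimizes cs*s + cu*EBO(s) over 0..s_max."""
    ebo = expected_backorders(lambda_ / mu, s_max)
    costs = [cs * s + cu * ebo[s] for s in range(s_max + 1)]
    return min(range(s_max + 1), key=costs.__getitem__), costs
```

With λ = 0.01, μ = 0.1, Cs = 1 and the assumed Cu = 100, the cost is minimized by keeping a single spare.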

10 Example result (figure: cost as a function of the number of spares s)

11 Markov modelling
Since failures and repairs are exponentially distributed, an alternative modelling approach is to use Markov methods
We may implement different strategies, e.g., a finite number of repair men
We may also introduce semi-Markov models to treat non-exponential repair times, or use virtual states in a phase-type modelling approach
Drawbacks
– It is hard to treat an infinite number of backorders
– We need to specify the transition matrix manually
– For huge systems, time and storage capacity is a limitation

12 Markov diagram

13 Solutions
– Transition matrix
– State vector
– Steady state solution
– Visiting frequencies

14 Transition matrix
The indexing generally starts at 0 and runs to r, i.e., there are r + 1 system states (our model needs special indexing)
Each cell in the matrix has two indexes: the first (row index) represents the “from” state, whereas the second (column index) represents the “to” state
The cells represent transition rates from one state to another; a_ij is thus the transition rate from state i to state j
The diagonal elements shall fulfil the condition that all cells in a row add up to zero

15 State probabilities
Let P_i(t) represent the probability that the system is in state i at time t
Now introduce vector notation, i.e., P(t) = [P_0(t), P_1(t), …, P_r(t)]
From the definition of the Markov diagram it may be shown that the Markov state equations are given by:
    P(t) · A = dP(t)/dt
These equations may be used to establish both the steady state probabilities and the time dependent solution

16 The steady state solution
In the long run, when the system has stabilized, we must have dP(t)/dt = 0, hence
    P · A = 0
This system of equations is over-determined, hence we may delete one column and replace it with the fact that P_0 + P_1 + … + P_r = 1
Hence, we have

17 The steady state solution
    P · A_1 = b
where A_1 is the matrix A with one column replaced by a column of ones (the normalization equation), and b = [0, 0, …, 0, 1]
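A compact numerical sketch of this solve (NumPy assumed available; the replaced column is placed last, matching b = [0, …, 0, 1]):

```python
import numpy as np

def steady_state(A):
    """Solve P·A = 0 together with sum(P) = 1.

    A is the (r+1)x(r+1) transition-rate matrix whose rows sum to zero.
    Replace the last column of A by ones (the normalization equation)
    and solve P·A1 = b with b = [0, ..., 0, 1].
    """
    A1 = A.astype(float).copy()
    A1[:, -1] = 1.0                       # normalization column
    b = np.zeros(A.shape[0])
    b[-1] = 1.0
    return np.linalg.solve(A1.T, b)       # P·A1 = b  <=>  A1^T P = b
```

For a two-state check with failure rate 1 and repair rate 2, the routine returns the familiar [2/3, 1/3] split of time between the states.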

18 Transition matrix

            To -3      To -2      To -1      To 0       To 1       To s
  From -3     ·      (s+3)·Mu      0          0          0          0
  From -2   Lambda      ·       (s+2)·Mu      0          0          0
  From -1     0       Lambda       ·       (s+1)·Mu      0          0
  From 0      0          0       Lambda       ·       (s+0)·Mu      0
  From 1      0          0          0       Lambda       ·       (s-1)·Mu
  From s      0          0          0          0       Lambda       ·

(diagonal cells “·” are chosen so that each row sums to zero)

19 Solution for our example
Steady state solution is obtained from P · A_1 = b
Fraction of time with spare part shortage = 1 is found by: U_1 = P_(-1)
Fraction of time with spare part shortage = 2 is found by: U_2 = P_(-2)
etc.
Total unavailability: U = U_1 + 2U_2 + 3U_3

20 Results

  State   Steady state pr.   BO   Contribution
  P-3           …             3       …E-07
  P-2           …             2       …E-06
  P-1           …             1        …
  P0         4.524E-03        0        0
  P1         9.048E-02        0        0
  P2         9.048E-01        0        0
  EBO = …
  Cost = …

(most numerical entries in the table did not survive the transcript)

21 Assume only 1 repair man

22 Structure of transition matrix
Define “infinity”, e.g., “−∞ = −3” (as high as feasible)
The transition matrix starts with row index “−∞”
There is one row for each state from “−∞” up to s
The elements below the diagonal are always λ
With an infinite number of repair men, the elements above the diagonal start with repair rate (∞ + s)·μ in the first row and decrease by μ for each cell down the diagonal
Each row sums up to 0 in order to find the diagonal elements (cf. the matrix on slide 18)
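The construction rules above can be sketched in code (a sketch assuming the truncation −∞ = −3 and an infinite number of repair men; names are mine):

```python
def transition_matrix(lam, mu, s, inf=3):
    """Rate matrix over states -inf, ..., -1, 0, 1, ..., s.

    The cell below the diagonal holds the failure rate lam; the cell
    above the diagonal in the row for stock level k holds the repair
    rate (s - k)*mu; the diagonal makes every row sum to zero.
    """
    n = inf + s + 1                       # number of states
    A = [[0.0] * n for _ in range(n)]
    for i in range(n):                    # index 0 maps to state -inf
        k = i - inf                       # actual stock level
        if i > 0:
            A[i][i - 1] = lam             # failure: one spare fewer
        if i < n - 1:
            A[i][i + 1] = (s - k) * mu    # repair completion: one more
        A[i][i] = -sum(A[i])              # row sums to zero
    return A
```

For λ = 0.01, μ = 0.1, s = 2 this reproduces the 6×6 matrix of slide 18, with (s+3)·μ = 0.5 in the top row.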

23 Model assumptions, situation 2
Constant rate of failure = λ
Mean lead time when ordering new spares = MLT
Lead times are Gamma (Erlang) distributed with parameters α = 4 and β = α / MLT
Note that α = 4 may be changed to account for a general value of SD(LT) = α^½ / β
We order a total of m new spares when the stock level equals n
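With this parametrization the lead-time mean and spread follow directly from the Gamma moments (a quick consistency check):

```latex
\mathrm{E}[LT] = \frac{\alpha}{\beta} = \frac{\alpha}{\alpha / MLT} = MLT,
\qquad
\mathrm{SD}[LT] = \frac{\sqrt{\alpha}}{\beta} = \frac{MLT}{\sqrt{\alpha}} = \frac{MLT}{2} \quad (\alpha = 4)
```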

24 Phase type distribution and semi-Markov
A continuous-time stochastic process is called a semi-Markov process if the embedded jump chain is a Markov chain, and the holding times (times between jumps) are random variables with any distribution, whose distribution function may depend on the two states between which the move is made
Semi-Markov processes are hard to work with
A phase-type distribution is a probability distribution that results from a system of one or more inter-related Poisson processes occurring in sequence, or phases. The distribution can be represented by a random variable describing the time until absorption of a Markov process with one absorbing state
Example: the Erlang distribution
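The Erlang case can be illustrated by simulation: an Erlang(α, β) lead time is the sum of α exponential phases with rate β, which is exactly what the phase-type construction in the Markov diagrams below exploits (a sketch; MLT = 10 is an arbitrary illustrative choice):

```python
import random

def erlang_sample(alpha, beta, rng):
    """One Erlang(alpha, beta) draw as a sum of alpha exponential phases."""
    return sum(rng.expovariate(beta) for _ in range(alpha))

# Lead time with MLT = 10 and alpha = 4  =>  beta = alpha / MLT = 0.4
rng = random.Random(42)
alpha, mlt = 4, 10.0
beta = alpha / mlt
samples = [erlang_sample(alpha, beta, rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
sd = (sum((x - mean) ** 2 for x in samples) / len(samples)) ** 0.5
# mean should be close to MLT = 10, sd close to MLT/2 = 5
```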

25 Diagram for stock

26 Markov diagram, step 1 (failures)

27 Markov diagram, step 2 (“repair”)

28 Markov diagram, step 2 (“repair”)

29 Markov diagram, step 3 (still failing…)

30 Markov diagram, step 4 (complete model)

31 Solution for our example
Steady state solution is obtained from P · A_1 = b
Fraction of time with spare part shortage = 1 is found by summing over all phase copies of state −1: U_1 = P_(-1,0) + P_(-1,1) + P_(-1,2) + P_(-1,3)
Fraction of time with spare part shortage = 2: U_2 = P_(-2,0) + P_(-2,1) + P_(-2,2) + P_(-2,3)
etc.
Total unavailability (formula in Excel): U = U_1 + 2U_2 + 3U_3
Frequency of running out of spares: F = λ (P_(0,0) + P_(0,1) + P_(0,2) + P_(0,3))

32 What is λ?
Assume that we have a huge number (N) of components with Weibull distributed life times
Consider the situation where t < …; in this period there is no PM, i.e., we may assume a corrective strategy
The total rate of failures as a function of t is N·w(t), where w(t) = ∂W(t)/∂t is the renewal rate
Initially N·w(t) should form the basis for the total failure rate
After some time we set λ = N·[1/… + E(…)]