1
Reliability Engineering
Markov Model
2
State Space Method Example: parallel structure of two components
Possible System States: 0 (both components in failed state); 1 (component 1 functioning, component 2 in failed state); 2 (component 2 functioning, component 1 in failed state); 3 (both components functioning).
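As a small illustration of how this state space can be encoded, here is a Python sketch that builds the transition rate matrix of the parallel structure; the failure rates lam1, lam2 and repair rates mu1, mu2 are made-up values, and independent repair of each component is assumed.

```python
import numpy as np

# Made-up constant failure and repair rates (per hour) for the two components
lam1, lam2 = 1e-3, 2e-3
mu1, mu2 = 1e-1, 1e-1

# Transition rates a[j, k] from state j to state k, states as on the slide:
# 0 = both failed, 1 = only component 1 up, 2 = only component 2 up, 3 = both up
rates = {
    (3, 2): lam1, (3, 1): lam2,   # a failure brings the system out of state 3
    (1, 0): lam1, (1, 3): mu2,    # component 1 fails / component 2 is repaired
    (2, 0): lam2, (2, 3): mu1,    # component 2 fails / component 1 is repaired
    (0, 1): mu1,  (0, 2): mu2,    # a repair brings the system out of state 0
}

# Transition rate matrix for dP/dt = A P (columns sum to zero)
A = np.zeros((4, 4))
for (j, k), a_jk in rates.items():
    A[k, j] += a_jk      # off-diagonal: rate from j to k goes in row k, column j
    A[j, j] -= a_jk      # diagonal: minus the total departure rate from state j

print(A)
print(A.sum(axis=0))     # each column sums to zero
```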
3
State Space Diagram [figure: the system states 0-3 and the transitions between them]
4
Markov Processes
The event {X(t) = j} means that the system at time t is in state j, where j = 0, 1, 2, …, r. The probability of this event is denoted by P_j(t) = Pr{X(t) = j}. The transitions between the states may be described by a stochastic process {X(t), t ≥ 0}. A stochastic process satisfying the Markov property is called a Markov process.
5
Markov Property
Given that a system is in state i at time t, i.e. X(t) = i, the future states X(t+v) do not depend on the previous states X(u), u < t:
Pr{X(t+v) = j | X(t) = i, X(u) = x(u), 0 ≤ u < t} = Pr{X(t+v) = j | X(t) = i}
for all possible x(u) and 0 ≤ u < t.
6
Stationary Transition Probability
The transition probabilities P_ij(v) = Pr{X(t+v) = j | X(t) = i} are said to be stationary (time-homogeneous) when they do not depend on t. A Markov process with stationary transition probabilities is often called a process with no memory.
7
Properties of Transition Probabilities
For all i and t ≥ 0: P_ij(t) ≥ 0 and Σ_{j=0}^{r} P_ij(t) = 1.
Chapman-Kolmogorov equation: P_ij(t+s) = Σ_{k=0}^{r} P_ik(t) P_kj(s)
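The Chapman-Kolmogorov equation can be checked numerically. The sketch below assumes a simple two-state model with made-up rates and uses the "row" convention, where entry (i, j) of the generator holds a_ij; the matrix of transition probabilities P_ij(t) is then the matrix exponential of the generator times t.

```python
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-3, 1e-1          # made-up failure and repair rates
# Generator in the "row" convention: entry (i, j) holds the rate a_ij from i to j
Q = np.array([[-mu,  mu ],    # state 0 (failed): repaired with rate mu
              [lam, -lam]])   # state 1 (working): fails with rate lam

def P(t):
    """Matrix of transition probabilities P_ij(t) = Pr{X(t) = j | X(0) = i}."""
    return expm(Q * t)

t, s = 3.0, 7.0
# Chapman-Kolmogorov: P_ij(t + s) = sum_k P_ik(t) P_kj(s)
print(np.allclose(P(t + s), P(t) @ P(s)))   # True
```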
8
Transition Rate
In the same way as the failure rate is defined, the transition rate from state i to state j can be defined as
a_ij = lim_{Δt→0} P_ij(Δt)/Δt,  i ≠ j
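The same matrix exponential gives a quick numerical check of this limit: for a small Δt, P_ij(Δt)/Δt is close to a_ij (again with made-up rates).

```python
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-3, 1e-1                    # made-up failure and repair rates
Q = np.array([[-mu, mu], [lam, -lam]])  # entry (i, j) holds a_ij, as above

dt = 1e-6
P_dt = expm(Q * dt)                     # transition probabilities over a small step
print(P_dt[1, 0] / dt, lam)             # P_10(dt)/dt is close to a_10 = lam
print(P_dt[0, 1] / dt, mu)              # P_01(dt)/dt is close to a_01 = mu
```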
9
Derivation of State Equation (1)
From the Chapman-Kolmogorov equation: P_ij(t+Δt) = Σ_k P_ik(t) P_kj(Δt).
Substitute P_kj(Δt) ≈ a_kj Δt for k ≠ j and P_jj(Δt) ≈ 1 - Σ_{k≠j} a_jk Δt.
10
Derivation of State Equation (2)
This gives P_ij(t+Δt) - P_ij(t) ≈ Σ_{k≠j} P_ik(t) a_kj Δt - P_ij(t) Σ_{k≠j} a_jk Δt.
After dividing by Δt and letting Δt → 0, we get the state equations.
11
State Equations
dP_ij(t)/dt = Σ_{k≠j} P_ik(t) a_kj - P_ij(t) Σ_{k≠j} a_jk,  j = 0, 1, …, r
12
Simplified State Equations
Since the initial state i is known, the state equations can be simplified by omitting the first index i:
dP_j(t)/dt = Σ_{k≠j} P_k(t) a_kj - P_j(t) Σ_{k≠j} a_jk,  j = 0, 1, …, r
13
State Equations in Matrix Notation
Let P(t) = [P_0(t), P_1(t), …, P_r(t)]^T. Then
dP(t)/dt = A P(t)
where A is the transition rate matrix: the element in row j, column k (k ≠ j) is the transition rate a_kj from state k to state j, and the diagonal element in row k, column k is a_kk = -Σ_{j≠k} a_kj.
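As a sketch of how the matrix form can be solved numerically, the state equations of the two-component parallel structure from the beginning can be integrated with scipy; the rates are the made-up values used earlier, and the system is assumed to start in state 3 (both components functioning).

```python
import numpy as np
from scipy.integrate import solve_ivp

lam1, lam2, mu1, mu2 = 1e-3, 2e-3, 1e-1, 1e-1   # made-up rates
# Transition rate matrix A for dP/dt = A P, states 0..3 as on the slide
A = np.array([
    [-(mu1 + mu2), lam1,          lam2,          0.0          ],
    [ mu1,        -(lam1 + mu2),  0.0,           lam2         ],
    [ mu2,         0.0,          -(lam2 + mu1),  lam1         ],
    [ 0.0,         mu2,           mu1,          -(lam1 + lam2)],
])

P0 = np.array([0.0, 0.0, 0.0, 1.0])             # start in state 3 (both up)
sol = solve_ivp(lambda t, P: A @ P, (0.0, 200.0), P0, dense_output=True)

for t in (10.0, 50.0, 200.0):
    print(t, sol.sol(t))                        # P_0(t), ..., P_3(t)
```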
14
Additional Properties
Notice that the sums of the columns of the transition rate matrix add up to zero. Since this implies that the matrix is singular, the following additional constraint must be imposed:
Σ_{j=0}^{r} P_j(t) = 1
The mean staying time in state j during the interval (0, t) is ∫_0^t P_j(u) du.
16
Alternative Solution
One alternative is to advance the solution in small time steps:
P(t + Δt) ≈ (I + A Δt) P(t)
This is often a computationally convenient way of approximating P(t).
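Assuming the alternative meant here is the simple step-by-step scheme written above (an assumption on my part), a minimal sketch looks like this; it is compared with the exact matrix-exponential solution of the same made-up two-state model.

```python
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-3, 1e-1                        # made-up failure and repair rates
A = np.array([[-mu, lam], [mu, -lam]])      # dP/dt = A P (state 0 failed, 1 working)
P0 = np.array([0.0, 1.0])                   # component assumed working at t = 0

dt, t_end = 0.1, 100.0
P = P0.copy()
for _ in range(int(t_end / dt)):
    P = P + (A @ P) * dt                    # P(t + dt) ~ (I + A dt) P(t)

print("step-by-step approximation :", P)
print("exact solution expm(A t) P0:", expm(A * t_end) @ P0)
```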
17
Example
Consider a single component with two states: 1 (the component is functioning) and 0 (the component is in a failed state). Thus, the only transition rates are the failure rate a_10 = λ and the repair rate a_01 = μ.
The state equations:
dP_0(t)/dt = -μ P_0(t) + λ P_1(t)
dP_1(t)/dt = μ P_0(t) - λ P_1(t)
18
Example
Since P_0(t) + P_1(t) = 1 and the component is assumed to be functioning at t = 0 (so P_1(0) = 1), it can be derived that
P_1(t) = μ/(λ+μ) + λ/(λ+μ) e^(-(λ+μ)t)
P_0(t) = λ/(λ+μ) - λ/(λ+μ) e^(-(λ+μ)t)
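A short check of these closed-form expressions against a numerical solution, again with made-up λ and μ and the component assumed to start in the working state:

```python
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-3, 1e-1
A = np.array([[-mu, lam], [mu, -lam]])      # dP/dt = A P (state 0 failed, 1 working)
P0 = np.array([0.0, 1.0])                   # P_1(0) = 1

for t in (5.0, 50.0, 500.0):
    p1_closed = mu / (lam + mu) + lam / (lam + mu) * np.exp(-(lam + mu) * t)
    p1_numeric = (expm(A * t) @ P0)[1]
    print(t, p1_closed, p1_numeric)         # the two agree
```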
19
Irreducible Markov Process
A state j is said to be reachable from state i if for some t > 0 the transition probability P_ij(t) > 0. The process is said to be irreducible if every state is reachable from every other state. For an irreducible Markov process, the limits
P_j = lim_{t→∞} P_j(t),  j = 0, 1, …, r
always exist and are independent of the initial state of the process.
20
Asymptotic Probabilities
As t → ∞, the derivatives dP_j(t)/dt → 0, so the asymptotic (steady-state) probabilities satisfy
A P = 0  together with  Σ_{j=0}^{r} P_j = 1
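Because the rate matrix is singular, a common way to compute the asymptotic probabilities is to replace one balance equation with the normalisation constraint. A sketch for the two-state model with made-up rates (the result matches the t → ∞ limits of the closed-form solution above):

```python
import numpy as np

lam, mu = 1e-3, 1e-1                     # made-up failure and repair rates
A = np.array([[-mu, lam], [mu, -lam]])   # dP/dt = A P (state 0 failed, 1 working)

M = A.copy()
M[-1, :] = 1.0                           # replace one equation by P_0 + P_1 = 1
P_inf = np.linalg.solve(M, [0.0, 1.0])   # asymptotic probabilities
print(P_inf)                             # [lam/(lam+mu), mu/(lam+mu)]
```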
21
Frequency of Departure from State j to State k
The unconditional probability of a departure from state j to state k in the time interval (t, t+Δt] is P_j(t) a_jk Δt. The frequency of departures from state j to state k is defined as
ω_jk(t) = P_j(t) a_jk
22
Frequency of Departure from State j at Steady State
The total frequency of departures from state j at steady state is
Σ_{k≠j} ω_jk = P_j Σ_{k≠j} a_jk
23
Frequency of Arrival to State j at Steady State
The frequency of arrivals from state k to state j at steady state is ω_kj = P_k a_kj. The total frequency of arrivals to state j (from the state equations at steady state) is
Σ_{k≠j} P_k a_kj = P_j Σ_{k≠j} a_jk
which equals the total frequency of departures from state j.
24
Visit Frequency
The visit frequency ν_j to state j is defined as the expected number of visits to state j per unit time. At steady state it equals the total frequency of departures from (and arrivals to) state j:
ν_j = P_j Σ_{k≠j} a_jk
25
Mean Duration of a Visit
The total departure rate from state j is α_j = Σ_{k≠j} a_jk. Since the departure rate is constant, the duration of a stay in state j is exponentially distributed with parameter α_j. Thus, the mean duration of a stay is
θ_j = 1/α_j
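A quick simulation sketch of this fact, with made-up departure rates: the stay in state j ends when the first of the competing exponential transition times occurs, so its mean is close to 1/α_j.

```python
import numpy as np

rng = np.random.default_rng(0)
a_jk = np.array([1e-3, 2e-3, 5e-2])       # made-up departure rates from state j
alpha_j = a_jk.sum()                      # total departure rate

# The stay ends at the minimum of independent exponential transition times
stays = rng.exponential(1.0 / a_jk, size=(100_000, 3)).min(axis=1)
print(stays.mean(), 1.0 / alpha_j)        # empirical mean ~ theta_j = 1/alpha_j
```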
26
A Useful Relation
The mean proportion of time the system is spending in state j is
P_j = ν_j θ_j
(visit frequency times mean duration of a visit). A special case is the formula for unavailability under a corrective maintenance policy:
Unavailability = MDT / (MTTF + MDT)
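A small numerical illustration of P_j = ν_j θ_j for the failed state of the single-component model, with made-up λ and μ; under the corrective maintenance interpretation MTTF = 1/λ and MDT = 1/μ.

```python
lam, mu = 1e-3, 1e-1            # made-up failure and repair rates

P0 = lam / (lam + mu)           # long-term probability of the failed state
nu0 = (mu / (lam + mu)) * lam   # visit frequency of state 0: P_1 * a_10
theta0 = 1.0 / mu               # mean duration of a visit to state 0
print(P0, nu0 * theta0)         # identical: P_j = nu_j * theta_j

mttf, mdt = 1.0 / lam, 1.0 / mu
print(mdt / (mttf + mdt))       # unavailability = MDT / (MTTF + MDT) = P_0
```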
27
System Availability
Let S = {0, 1, 2, …, r} be the set of all possible states of the system. Let B denote the subset of states in which the system is functioning, and let F = S - B denote the states in which the system is failed. Then the average (or long-term) system availability and unavailability are
A = Σ_{j∈B} P_j  and  1 - A = Σ_{j∈F} P_j
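A sketch of the availability computation for the two-component parallel structure from the beginning: the system functions when at least one component functions, so B = {1, 2, 3} and F = {0}. The rates are made-up, and each component is assumed to be repaired independently, in which case the long-term state probabilities factor into per-component availabilities.

```python
lam1, lam2, mu1, mu2 = 1e-3, 2e-3, 1e-1, 1e-1   # made-up failure/repair rates
a1 = mu1 / (lam1 + mu1)       # long-term availability of component 1
a2 = mu2 / (lam2 + mu2)       # long-term availability of component 2

# Long-term state probabilities of the parallel structure (independent components)
P = {0: (1 - a1) * (1 - a2),  # both components failed
     1: a1 * (1 - a2),        # only component 1 functioning
     2: (1 - a1) * a2,        # only component 2 functioning
     3: a1 * a2}              # both components functioning

B, F = [1, 2, 3], [0]         # functioning states / failed state
print("availability  :", sum(P[j] for j in B))   # = 1 - (1 - a1)(1 - a2)
print("unavailability:", sum(P[j] for j in F))
```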