Reliability Engineering


Reliability Engineering Markov Model

State Space Method Example: parallel structure of two components. Possible system states: 0 (both components in a failed state); 1 (component 1 functioning, component 2 in a failed state); 2 (component 2 functioning, component 1 in a failed state); 3 (both components functioning).
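The state numbering above can be sketched in code. A minimal example, assuming a binary encoding of the two component statuses (the encoding itself is not stated in the slides, only the resulting labels 0–3):

```python
from itertools import product

# Each component is 1 (functioning) or 0 (failed).
# Assumed encoding: state number = 2 * (status of component 2) + (status of component 1),
# which reproduces the labels 0..3 used in the slides.
def state_number(c1, c2):
    return 2 * c2 + c1

states = {state_number(c1, c2): (c1, c2) for c1, c2 in product((0, 1), repeat=2)}

# A parallel structure functions when at least one component functions.
functioning = sorted(s for s, (c1, c2) in states.items() if c1 or c2)
print(functioning)  # -> [1, 2, 3]
```

This recovers the functioning states {1, 2, 3} and the single failed state 0 of the parallel structure.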

State Space Diagram [figure: states 0, 1, 2, 3 with transition arrows between them]

Markov Processes The event $X(t) = j$ means that the system at time $t$ is in state $j$, $j = 0, 1, 2, \ldots, r$. The probability of this event is denoted by $P_j(t) = \Pr(X(t) = j)$. The transitions between the states may be described by a stochastic process $\{X(t),\ t \ge 0\}$. A stochastic process satisfying the Markov property is called a Markov process.

Markov Property Given that a system is in state $i$ at time $t$, i.e., $X(t) = i$, the future states $X(t+v)$, $v > 0$, do not depend on the previous states $X(u)$, $u < t$: $\Pr(X(t+v) = j \mid X(t) = i,\ X(u) = x(u),\ 0 \le u < t) = \Pr(X(t+v) = j \mid X(t) = i)$ for all possible $x(u)$ and $0 \le u < t$.

Stationary Transition Probability The transition probabilities are stationary (time-homogeneous) if $P_{ij}(v) = \Pr(X(t+v) = j \mid X(t) = i)$ depends only on the interval length $v$ and not on $t$. A Markov process with stationary transition probabilities is often called a process with no memory.

Properties of Transition Probabilities $P_{ij}(t) \ge 0$ and $\sum_{j=0}^{r} P_{ij}(t) = 1$. Chapman-Kolmogorov equation: $P_{ij}(t+s) = \sum_{k=0}^{r} P_{ik}(t)\, P_{kj}(s)$.
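For the single two-state component treated later in these slides, the Chapman-Kolmogorov equation can be checked numerically. A sketch, assuming illustrative failure and repair rates `lam` and `mu` (not from the slides) and the standard closed-form transition probabilities of a two-state Markov process:

```python
import math

lam, mu = 0.01, 0.1          # assumed failure and repair rates
r = lam + mu

# Closed-form transition probabilities (state 1 = functioning, state 0 = failed):
def P11(t): return mu / r + (lam / r) * math.exp(-r * t)
def P10(t): return 1.0 - P11(t)
def P01(t): return (mu / r) * (1.0 - math.exp(-r * t))

# Chapman-Kolmogorov: P11(t+s) = P11(t)*P11(s) + P10(t)*P01(s)
t, s = 3.0, 7.0
lhs = P11(t + s)
rhs = P11(t) * P11(s) + P10(t) * P01(s)
assert abs(lhs - rhs) < 1e-12
```

The identity holds for any $t, s \ge 0$, since the one-step probabilities already encode the Markov property.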

Transition Rate In the same way as the failure rate is defined, the transition rate from state $i$ to state $j$ ($j \ne i$) can be defined as $a_{ij} = \lim_{\Delta t \to 0} \frac{P_{ij}(\Delta t)}{\Delta t}$.

Derivation of State Equation (1) From the Chapman-Kolmogorov equation, $P_{ij}(t+\Delta t) = \sum_{k=0}^{r} P_{ik}(t)\, P_{kj}(\Delta t)$. Substitute $P_{kj}(\Delta t) \approx a_{kj}\,\Delta t$ for $k \ne j$ and $P_{jj}(\Delta t) \approx 1 - \sum_{k \ne j} a_{jk}\,\Delta t$.

Derivation of State Equation (2) This gives $P_{ij}(t+\Delta t) - P_{ij}(t) \approx \sum_{k \ne j} P_{ik}(t)\, a_{kj}\,\Delta t - P_{ij}(t) \sum_{k \ne j} a_{jk}\,\Delta t$. After dividing by $\Delta t$ and letting $\Delta t \to 0$, we get the state equations.

State Equations $\dot{P}_{ij}(t) = \sum_{k \ne j} a_{kj}\, P_{ik}(t) - \alpha_j\, P_{ij}(t)$, where $\alpha_j = \sum_{k \ne j} a_{jk}$ is the total departure rate from state $j$.

Simplified State Equations Since the initial state $i$ is known, the state equations can be simplified by omitting the first index $i$: $\dot{P}_j(t) = \sum_{k \ne j} a_{kj}\, P_k(t) - \alpha_j\, P_j(t)$.

State Equations in Matrix Notation Let $\mathbf{P}(t) = [P_0(t), P_1(t), \ldots, P_r(t)]^{T}$. Then $\dot{\mathbf{P}}(t) = \mathbb{A}\,\mathbf{P}(t)$, where the transition rate matrix $\mathbb{A}$ has off-diagonal entries $\mathbb{A}_{jk} = a_{kj}$ ($j \ne k$) and diagonal entries $\mathbb{A}_{jj} = -\alpha_j$.

Additional Properties Notice that the columns of the transition rate matrix sum to zero. Since this implies that the matrix is singular, the following additional constraint must be imposed: $\sum_{j=0}^{r} P_j(t) = 1$. The mean staying time in state $j$ is $1/\alpha_j$.

Alternative Solution $\mathbf{P}(t+\Delta t) \approx (\mathbb{I} + \mathbb{A}\,\Delta t)\,\mathbf{P}(t)$. Applied repeatedly with a small step $\Delta t$, this equation is often a computationally convenient way of approximating $\mathbf{P}(t)$.
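A minimal numerical sketch of this stepping scheme, using the single two-state component from the following example with illustrative rates `lam` and `mu` (assumed values, not given in the slides):

```python
import math

lam, mu = 0.01, 0.1   # assumed failure and repair rates

# Transition rate matrix with the slides' column convention
# (each column sums to zero); P = [P0, P1], state 1 = functioning.
A = [[-mu,  lam],
     [ mu, -lam]]

def step(P, dt):
    """One step of P(t + dt) ≈ (I + A*dt) P(t)."""
    return [P[0] + dt * (A[0][0] * P[0] + A[0][1] * P[1]),
            P[1] + dt * (A[1][0] * P[0] + A[1][1] * P[1])]

P = [0.0, 1.0]                     # component starts in the functioning state
dt, t_end = 0.01, 200.0
for _ in range(int(t_end / dt)):
    P = step(P, dt)

# Exact availability from the closed-form solution, for comparison:
exact = mu / (lam + mu) + lam / (lam + mu) * math.exp(-(lam + mu) * t_end)
assert abs(P[1] - exact) < 1e-3
```

Note that each Euler step conserves $P_0 + P_1$ because the columns of $\mathbb{A}$ sum to zero.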

Example Consider a single component with two states: 1 (the component is functioning) and 0 (the component is in a failed state). With failure rate $\lambda$ and repair rate $\mu$, the transition rates are $a_{10} = \lambda$ and $a_{01} = \mu$. The state equations: $\dot{P}_1(t) = \mu P_0(t) - \lambda P_1(t)$ and $\dot{P}_0(t) = \lambda P_1(t) - \mu P_0(t)$.

Example Since $P_0(t) + P_1(t) = 1$, it can be derived (with initial condition $P_1(0) = 1$) that $P_1(t) = \frac{\mu}{\lambda+\mu} + \frac{\lambda}{\lambda+\mu}\, e^{-(\lambda+\mu)t}$.

Irreducible Markov Process A state $j$ is said to be reachable from state $i$ if $P_{ij}(t) > 0$ for some $t > 0$. The process is said to be irreducible if every state is reachable from every other state. For an irreducible Markov process, the limits $P_j = \lim_{t \to \infty} P_j(t)$ always exist and are independent of the initial state of the process.

Asymptotic Probabilities At steady state $\dot{\mathbf{P}}(t) = \mathbf{0}$, so the asymptotic probabilities $\mathbf{P} = [P_0, P_1, \ldots, P_r]^{T}$ satisfy $\mathbb{A}\,\mathbf{P} = \mathbf{0}$ together with the normalization $\sum_{j=0}^{r} P_j = 1$.
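For the two-state component this system can be solved by hand. A sketch, again assuming illustrative rates (the balance equation replaces one redundant row of $\mathbb{A}\mathbf{P}=\mathbf{0}$):

```python
lam, mu = 0.01, 0.1          # assumed failure and repair rates

# Balance equation mu*P0 = lam*P1 gives P0 = (lam/mu) * P1;
# substituting into the normalization P0 + P1 = 1:
P1 = 1.0 / (1.0 + lam / mu)  # long-run probability of the functioning state
P0 = 1.0 - P1                # long-run probability of the failed state
```

This reproduces the limits of the transient solution: $P_1 = \mu/(\lambda+\mu)$ and $P_0 = \lambda/(\lambda+\mu)$.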

Frequency of Departure from State j to State k The unconditional probability of a departure from state $j$ to state $k$ in the time interval $(t, t+\Delta t]$ is $P_j(t)\, a_{jk}\,\Delta t$. The frequency of departure is defined as $f_{jk}(t) = P_j(t)\, a_{jk}$.

Frequency of Departure from State j at Steady State The total frequency of departure from state $j$ is $f_j = P_j \sum_{k \ne j} a_{jk} = P_j\, \alpha_j$.

Frequency of Arrival to State j at Steady State The frequency of arrival from state $k$ to state $j$ at steady state is $P_k\, a_{kj}$. The total frequency of arrivals to state $j$ is $\sum_{k \ne j} P_k\, a_{kj} = P_j\, \alpha_j$ (from the state equations at steady state).

Visit Frequency The visit frequency to state $j$ is defined as the expected number of visits to state $j$ per unit time: $\nu_j = P_j\, \alpha_j$. At steady state, arrivals and departures balance, so the visit frequency equals both the total arrival and the total departure frequency.
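The balance between arrival and departure frequencies can be illustrated with the two-state component, again under assumed rates:

```python
lam, mu = 0.01, 0.1           # assumed failure and repair rates
P1 = mu / (lam + mu)          # steady-state probability of the functioning state
P0 = lam / (lam + mu)         # steady-state probability of the failed state

f_10 = P1 * lam    # frequency of departures 1 -> 0 (failures per unit time)
f_01 = P0 * mu     # frequency of departures 0 -> 1 (repairs per unit time)

# In steady state, departures from a state balance arrivals to it,
# so the failure frequency equals the repair frequency:
assert abs(f_10 - f_01) < 1e-12
```

Both frequencies equal $\lambda\mu/(\lambda+\mu)$, which is the visit frequency of each state in this two-state chain.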

Mean Duration of a Visit The total departure rate from state $j$ is $\alpha_j = \sum_{k \ne j} a_{jk}$. Since the departure rate is constant, the duration of a stay in state $j$ is exponentially distributed with parameter $\alpha_j$. Thus, the mean duration of a stay is $1/\alpha_j$.

A Useful Relation The mean proportion of time the system spends in state $j$ is $P_j = \nu_j \cdot \frac{1}{\alpha_j}$, i.e., the visit frequency times the mean duration of a visit. A special case is the formula for unavailability under a corrective maintenance policy: $\bar{A} = \frac{\text{MDT}}{\text{MTTF} + \text{MDT}}$, the failure frequency times the mean downtime.

System Availability Let $S = \{0, 1, 2, \ldots, r\}$ be the set of all possible states of a system. Let $B$ denote the subset of states in which the system is functioning, and let $F = S - B$ denote the states in which the system is failed. Then the average (or long-term) system availability and unavailability are $A = \sum_{j \in B} P_j$ and $\bar{A} = \sum_{j \in F} P_j = 1 - A$.
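Applied to the opening example of a two-component parallel structure, this sum can be sketched as follows; the code assumes independent, identical components with illustrative failure and repair rates (independence and the rate values are assumptions, not stated in the slides):

```python
lam, mu = 0.01, 0.1
p = mu / (lam + mu)             # steady-state probability a single component works

# Steady-state probabilities of the four system states (independence assumed):
P = {3: p * p,                  # both components functioning
     2: (1 - p) * p,            # only component 2 functioning
     1: p * (1 - p),            # only component 1 functioning
     0: (1 - p) * (1 - p)}      # both components failed

B = {1, 2, 3}                   # functioning states of a parallel structure
availability = sum(P[j] for j in B)
unavailability = P[0]
```

For a parallel structure this reduces to $A = 1 - (1-p)^2$, since only state 0 is a failed state.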