Condition State Transitions and Deterioration Models
H. Scott Matthews
March 10, 2003

Announcement
- Midterm grades will be posted today
- Based on the first 2 homework sets
- Mostly As, a few Bs, a C
- Intended to convince you to keep your effort level consistent for the rest of the semester

Recap of Last Lecture
- Threats, Vulnerabilities, and Risks
- Risk Assessment and Management
- Classic and Modern Defense Models
- Critical Infrastructure Protection
- Focused on physical attacks from terrorists

Linear Regression (in 1 slide)
- Arguably the simplest of statistical models: we have various data and want to fit an equation to it
- Y: dependent variable
- X: vector of independent variables
- β: vector of coefficients
- ε: error term
- Y = βX + ε (sketched in code below)
- Use R-squared and related metrics to test the model and show how 'robust' it is
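As a concrete illustration (not from the slides), here is a minimal Python sketch of fitting Y = βX + ε by ordinary least squares and computing R-squared; the data and coefficient values are made up.

```python
# Minimal OLS sketch for Y = beta*X + eps on synthetic data (values are illustrative).
import numpy as np

rng = np.random.default_rng(0)
n = 100
X = np.column_stack([np.ones(n), rng.uniform(0, 10, size=n)])  # intercept + one regressor
beta_true = np.array([2.0, 0.5])
Y = X @ beta_true + rng.normal(0, 1.0, size=n)                 # add the error term eps

beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)               # least-squares estimate of beta
Y_hat = X @ beta_hat
r_squared = 1 - np.sum((Y - Y_hat) ** 2) / np.sum((Y - Y.mean()) ** 2)
print("beta_hat:", beta_hat, "R-squared:", round(r_squared, 3))
```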

Markov Processes
- Markov chain: a stochastic process with what is called the Markov property
- Discrete and continuous versions
- Discrete: a sequence X_1, X_2, X_3, ... of random variables taking values in a "state space", with X_n being "the state of the system at time n"
- Markov property: the conditional distribution of the "future" (X_{n+1}, X_{n+2}, X_{n+3}, ...) given the "past" (X_1, X_2, X_3, ..., X_n) depends on the past only through X_n
- i.e., 'no memory' of how X_n was reached
- Famous example: random walk
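A small sketch (not from the slides) of the random-walk example: each new position depends only on the current position, which illustrates the 'no memory' property.

```python
# Sketch of the random walk, the classic Markov chain: the next position
# depends only on the current position, not on how it was reached.
import random

random.seed(1)
position = 0
path = [position]
for step in range(20):
    position += random.choice([-1, +1])   # transition uses only the current state
    path.append(position)
print(path)
```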

Markov (cont.)
- i.e., knowledge of the most recent past state of the system renders knowledge of less recent history irrelevant
- A Markov chain may be identified with its matrix of "transition probabilities", often called simply its transition matrix T
- Entries in T are given by p_ij = P(X_{n+1} = j | X_n = i)
- p_ij: probability that the system is in state j "tomorrow" given that it is in state i "today"
- The (i, j) entry of the k-th power of the transition matrix is the conditional probability that k "days" in the future the system will be in state j, given that it is in state i "today"
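A short Python sketch of the k-step calculation described above; the 3-state transition matrix below is made up for illustration.

```python
# Sketch: k-step transition probabilities via powers of the transition matrix.
# The matrix values are illustrative only; each row sums to 1.
import numpy as np

T = np.array([
    [0.90, 0.10, 0.00],   # row i: p_ij = P(X_{n+1} = j | X_n = i)
    [0.00, 0.85, 0.15],
    [0.00, 0.00, 1.00],   # absorbing worst state
])

k = 5
T_k = np.linalg.matrix_power(T, k)  # (i, j) entry = P(in state j after k "days" | state i today)
print(T_k)
```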

Markov Applications
- Markov chains are used to model various processes in queuing theory and statistics, and can also be used as a signal model in entropy coding techniques such as arithmetic coding
- Note: Markov created this theory by analyzing patterns in words, syllables, etc.

Infrastructure Application
- Used to predict/estimate transitions in condition states, e.g. for bridge conditions
- Used by Bridge Management Systems, e.g. PONTIS, to help see 'portfolio effects' across the assets under their control
- Helps plan expenditures, effort, etc.
- Need empirical studies to derive the parameters
- Source for the next few slides: Chase and Gaspar, Journal of Bridge Engineering, November 2000

Sample Transition Matrix
- T = [ ]
- Thus p_ii gives the probability of staying in the same state, and 1 - p_ii the probability of getting worse
- Could 'simplify' this type of model by just describing the vector P of p_ii probabilities; the (1 - p_ii) values are easily calculated from P
- The condition distribution of a bridge originally in state i after M transitions is C_i T^M
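A sketch (not from the slides) of building T from a vector P of p_ii values and computing C_i T^M. The P values below are placeholders, since the slide's actual matrix is not reproduced here, and ordering the states best-first with the worst state absorbing is an assumption.

```python
# Sketch: build a bidiagonal T from stay-probabilities p_ii and compute C_i * T^M.
# P values are illustrative placeholders; states ordered best-first (assumption).
import numpy as np

def transition_matrix(P):
    """Stay in state i with probability P[i]; otherwise drop to the next-worse state."""
    n = len(P)
    T = np.zeros((n, n))
    for i in range(n):
        T[i, i] = P[i]
        if i + 1 < n:
            T[i, i + 1] = 1 - P[i]
    return T

P_example = [0.90, 0.92, 0.94, 0.95, 1.00]      # illustrative p_ii, worst state absorbing
T = transition_matrix(P_example)
C0 = np.array([1.0, 0.0, 0.0, 0.0, 0.0])        # bridge starts in the best state
M = 10
print(C0 @ np.linalg.matrix_power(T, M))        # condition distribution after M transitions
```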

Superstructure Condition
- NBI instructions:
  - Code 9 = Excellent
  - Code 0 = Failed/out of service
- If we assume no rehab/repair effects, then bridges 'only get worse over time'
- Thus transitions (assuming they are slow) only go from Code i to Code i-1
- Need a 10x10 matrix T
- Just an extension of the 5x5 example above

Empirical Results
- P = [0.71, 0.95, 0.96, 0.97, 0.97, 0.97, 0.93, 0.86, 1]
- Could use this kind of probabilistic model result to estimate actual transitions
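Plugging the reported P vector into the same bidiagonal construction sketched earlier gives a rough projection over time. Mapping the first entry to the best condition state and treating the last listed state as absorbing is an assumption, not something stated on the slide.

```python
# Project the condition distribution using the empirical P vector reported on the slide.
# State ordering (best state first, last state absorbing) is an assumption.
import numpy as np

P = [0.71, 0.95, 0.96, 0.97, 0.97, 0.97, 0.93, 0.86, 1.0]
n = len(P)
T = np.zeros((n, n))
for i in range(n):
    T[i, i] = P[i]                  # stay in the same condition state
    if i + 1 < n:
        T[i, i + 1] = 1 - P[i]      # otherwise drop one state

C0 = np.zeros(n)
C0[0] = 1.0                         # all structures start in the best state
for years in (5, 10, 25):
    print(years, np.round(C0 @ np.linalg.matrix_power(T, years), 3))
```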

More Complex Models
- What about using more detailed bridge parameters to guess deficiency?
- Binary: deficient or not
- What kind of random variable is this?
- What other types of variables are needed?

Logistic Models
- Want Pr(j occurs) = Pr(Y = j) = F(effects)
- Logistic distribution:
- Pr(Y = 1) = e^(βX) / (1 + e^(βX)) (sketched below)
- Where βX is our usual 'regression'-type model
- Example: sewer pipes
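A small Python sketch of the logistic probability model above; the coefficients and the pipe-age regressor are hypothetical, purely to show the functional form.

```python
# Sketch of the logistic model Pr(Y=1) = exp(beta.x) / (1 + exp(beta.x)).
# The coefficients and the pipe-age regressor are hypothetical.
import numpy as np

def logistic_prob(x, beta):
    """Probability that the binary outcome (e.g., 'deficient') equals 1."""
    z = float(np.dot(beta, x))
    return np.exp(z) / (1.0 + np.exp(z))

beta = np.array([-4.0, 0.08])                   # intercept, effect per year of pipe age
for age in (10, 30, 50, 70):
    print(age, round(logistic_prob(np.array([1.0, age]), beta), 3))
```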