Approximate methods for calculating probability of failure


Approximate methods for calculating probability of failure
- Working with normal distributions is appealing
- First-order second-moment method (FOSM)
- Most probable point
- First order reliability method (FORM)
The material for this lecture relies on Choi, Grandhi and Canfield's Reliability Based Structural Design, Chapter 4, Section 4.1.

The normal distribution is attractive
It has the nice property that linear functions of normal variables are normally distributed. Also, the sum of many random variables tends to be normally distributed. The probability of failure varies over many orders of magnitude; the reliability index, which is the number of standard deviations separating the mean of the limit state from the failure boundary, solves this problem.
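For reference, the standard relation between the reliability index and the probability of failure for a normally distributed limit state (the tabulated values are routine and included only to illustrate the orders of magnitude):

$$P_f = \Phi(-\beta), \qquad \Phi(-1) \approx 1.6\times10^{-1},\quad \Phi(-2) \approx 2.3\times10^{-2},\quad \Phi(-3) \approx 1.3\times10^{-3},\quad \Phi(-4) \approx 3.2\times10^{-5}.$$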

Approximation about the mean
The predecessor of FORM, called the first-order second-moment method (FOSM), approximates the limit state by a first-order (linear) expansion about the mean values of the random variables and uses only their first two moments.
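A sketch of the standard mean-value FOSM formulas for independent variables (the slide's own equations did not survive the transcript, so this is a reconstruction, not a quote):

$$g(\mathbf{X}) \approx g(\boldsymbol{\mu}) + \sum_i \left.\frac{\partial g}{\partial X_i}\right|_{\boldsymbol{\mu}} (X_i-\mu_i), \qquad \mu_g \approx g(\boldsymbol{\mu}), \qquad \sigma_g^2 \approx \sum_i \left(\left.\frac{\partial g}{\partial X_i}\right|_{\boldsymbol{\mu}}\right)^2 \sigma_i^2, \qquad \beta = \frac{\mu_g}{\sigma_g}.$$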

Beam under central load example
Probability of exceeding the plastic moment capacity. All variables are normal.

Variable                     | Mean            | Standard deviation
P                            | 10 kN           | 2 kN
L                            | 8 m             | 0.1 m
W (plastic section modulus)  | 0.0001 m^3      | 0.00002 m^3
T (yield stress)             | 600,000 kN/m^2  | 100,000 kN/m^2

Reliability index for the example
With the limit state written as g = WT - PL/4 (plastic moment capacity minus applied bending moment), the linear approximation about the mean gives a reliability index of about 2.48. Example 4.2 of CGC shows that if we change to g = T - 0.25PL/W we get 3.48 (Pf ≈ 0.00025); the exact value is 2.46, or Pf = 0.0069.
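A minimal MATLAB sketch of the mean-value calculation for both formulations, using the data in the table above (this code is a reconstruction, not part of the original slides):

muP = 10;   sP = 2;      % load P: mean and standard deviation [kN]
muL = 8;    sL = 0.1;    % span L [m]
muW = 1e-4; sW = 2e-5;   % plastic section modulus W [m^3]
muT = 6e5;  sT = 1e5;    % yield stress T [kN/m^2]

% g1 = W*T - P*L/4, linearized about the means
mug1  = muW*muT - muP*muL/4;
sg1   = sqrt((muT*sW)^2 + (muW*sT)^2 + (muL/4*sP)^2 + (muP/4*sL)^2);
beta1 = mug1/sg1     % about 2.48

% g2 = T - 0.25*P*L/W, linearized about the means
mug2  = muT - 0.25*muP*muL/muW;
sg2   = sqrt(sT^2 + (0.25*muL/muW*sP)^2 + (0.25*muP/muW*sL)^2 + (0.25*muP*muL/muW^2*sW)^2);
beta2 = mug2/sg2     % about 3.48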

Top Hat question
In the beam example, the error in estimating the reliability index was due to:
- the linear approximation
- g is not normal
- both

Most probable point (MPP)
The error due to the linear approximation is exacerbated by the fact that the expansion may be about a point far from the failure region (because of the safety margin). Hasofer and Lind suggested remedying this problem by finding the most probable point and linearizing about it. The joint distribution of all the random variables assigns a probability density to every point in the random space. The point with the highest density on the failure boundary g = 0 is the MPP.

Response minus capacity illustration

r = randn(1000,1)*1.25 + 10;   % 1000 response samples: mean 10, standard deviation 1.25
c = randn(1000,1)*1.5 + 13;    % 1000 capacity samples: mean 13, standard deviation 1.5
f = @(x) x;                    % the line c = r separates safe (c > r) from failed samples
fplot(f,[5,20])
hold on
plot(r,c,'ro')
xlabel('r')
ylabel('c')

Recipe for finding the MPP with independent normal variables
1. Transform into standard normal variables (zero mean and unit standard deviation).
2. Find the point on g = 0 of minimum distance to the origin.
That point is the MPP, and its distance to the origin is the reliability index based on a linear approximation there (see the optimization sketch below).
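A minimal sketch of step 2 for the beam example, posed as a constrained minimization in standard normal space (not code from the slides; it assumes the Optimization Toolbox function fmincon is available):

mu  = [10 8 1e-4 6e5];                 % means of [P L W T]
sig = [2 0.1 2e-5 1e5];                % standard deviations of [P L W T]
g   = @(x) x(3)*x(4) - x(1)*x(2)/4;    % limit state g = W*T - P*L/4 in physical space
gu  = @(u) g(mu + sig.*u);             % same limit state in standard normal space
nonlcon = @(u) deal([], gu(u));        % equality constraint g(u) = 0
opts = optimoptions('fmincon','Display','off');
uMPP = fmincon(@(u) u*u', zeros(1,4), [],[],[],[],[],[], nonlcon, opts);
beta = norm(uMPP)                      % Hasofer-Lind reliability index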

First order reliability method (FORM)
Limit state g(X); failure when g < 0. FORM combines a linear approximation of the limit state with the assumption that the random variables are normal, and it makes the approximation around the most probable point. The linearized limit state is then also a normal variable, and the reliability index is the distance of its mean from zero measured in standard deviations.
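A short derivation sketch in standard normal space (standard FORM reasoning; the notation is mine, not taken from the slides). Linearizing g about the MPP u* gives

$$\hat g(\mathbf{u}) = \nabla g(\mathbf{u}^*)^T(\mathbf{u}-\mathbf{u}^*), \qquad \mathbf{u} \sim N(\mathbf{0},\mathbf{I}) \;\Rightarrow\; \hat g \sim N\left(-\nabla g(\mathbf{u}^*)^T\mathbf{u}^*,\; \|\nabla g(\mathbf{u}^*)\|^2\right),$$

so

$$\beta = \frac{\mu_{\hat g}}{\sigma_{\hat g}} = \frac{-\nabla g(\mathbf{u}^*)^T\mathbf{u}^*}{\|\nabla g(\mathbf{u}^*)\|} = \|\mathbf{u}^*\|, \qquad P_f \approx \Phi(-\beta).$$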

Visual

Linear example
For the linear example, g = C - R with R ~ N(10, 1.25) and C ~ N(13, 1.5). Transforming to standard normal variables, R = 10 + 1.25u_R and C = 13 + 1.5u_C, so g = 3 + 1.5u_C - 1.25u_R. The failure boundary g = 0 is the line 1.25u_R - 1.5u_C = 3. Its distance from the origin is 3/sqrt(1.25^2 + 1.5^2) ≈ 1.54, which is the reliability index.
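A quick check of the corresponding failure probability (assumes normcdf from the Statistics Toolbox):

pf = normcdf(-3/sqrt(1.25^2 + 1.5^2))   % about 0.062, consistent with the MCS check on the next slide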

Check by MCS

r = randn(1000,1)*1.25 + 10;    % response samples
c = randn(1000,1)*1.5 + 13;     % capacity samples
s = 0.5*(sign(r-c)+1);          % indicator: 1 when r > c (failure), 0 otherwise
pf = sum(s)/1000                % one run gave 0.0550

Six repetitions gave: 0.0660, 0.0640, 0.0670, 0.0450, 0.0550, 0.0640. With a million samples we got 0.06258. So even with a million samples, the estimate is accurate only to about two digits.
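A brief aside using the standard Monte Carlo error formula (the numbers are plugged in from the runs above):

$$\sigma_{\hat P_f} = \sqrt{\frac{P_f(1-P_f)}{N}} \approx \sqrt{\frac{0.062 \times 0.938}{1000}} \approx 0.0076,$$

which is consistent with the scatter of the 1000-sample repetitions and drops only to about 0.00024 for N = 10^6.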

General case
If the random variables are normal but correlated, a linear transformation will transform them into independent variables (a sketch follows below). If the random variables are not normal, they can be transformed to normal variables that give a similar probability of failure; see Section 4.1.5 of CGC (this is called the Rosenblatt transformation). Murray Rosenblatt, "Remarks on a Multivariate Transformation," Ann. Math. Statist., Vol. 23, No. 3 (1952), pp. 470-472.
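A minimal sketch of the decorrelating linear transformation using a Cholesky factor (the two-variable mean vector and covariance matrix are illustrative, not from the slides):

mu = [10; 13];                  % illustrative means
C  = [1.25^2  0.6;              % illustrative covariance matrix with correlation
      0.6     1.5^2];
Lc = chol(C,'lower');           % C = Lc*Lc'
x  = mu + Lc*randn(2,1e5);      % correlated normal samples
u  = Lc\(x - mu);               % independent standard normal variables
corrcoef(u')                    % close to the identity matrix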

Approximate transformation
If we want the transformed standard normal variable u to have the same CDF as the original variable x at a certain value of x, we would require Φ(u) = F_X(x). Unfortunately, we cannot enforce that everywhere. If we focus on the MPP we get the following transformation (Eqs. 4.38, 4.39), which must be used iteratively, starting from some guess; x* is replaced by x_MPP in the equations.
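A sketch of what I take this equivalent-normal (Rackwitz-Fiessler) transformation at x* to be, matching the CDF and PDF of x there, applied to an illustrative lognormal variable (assumes Statistics Toolbox functions; the parameter values and x* are made up for illustration):

lam = 2;  zeta = 0.3;                          % illustrative lognormal parameters
xstar = 9;                                     % illustrative MPP coordinate
z    = norminv(logncdf(xstar, lam, zeta));     % z = Phi^{-1}(F_X(x*))
sigN = normpdf(z)/lognpdf(xstar, lam, zeta);   % equivalent normal standard deviation
muN  = xstar - z*sigN;                         % equivalent normal mean

The equivalent normal with mean muN and standard deviation sigN then stands in for x in the FORM iteration.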