
“High-Confidence” Fragility Functions

“High-Confidence” Subjective Fragility Function Estimation Suppose 19 experts give 19 opinions on the fragility median Am and (logarithmic) standard deviation β. For each (discrete) value of strength y, find the maximum of the 19 CDFs and connect the maxima with a curve F_max(y) = max over experts of P[Strength < y | expert's Am and β] (or Am and some percentile)
– Assume each maximum represents the upper 95% confidence limit (“high confidence”)
– Fit a lognormal distribution to the curve by (weighted) least squares to represent a 95% “high-confidence” fragility function, so that P[F(y) ≤ F_max(y) for all y] ≈ 0.95
– This functional confidence curve is not a French curve linking the 95th percentiles at several strength values
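The envelope-and-fit step can be sketched as follows (a minimal illustration only: the 19 expert opinions are simulated, and uniform least-squares weights stand in for whatever weighting scheme is used):

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)

# Hypothetical expert panel: 19 medians Am and log-standard deviations beta
Am = rng.lognormal(mean=np.log(2.0), sigma=0.2, size=19)
beta = rng.uniform(0.2, 0.5, size=19)

# Discrete strength grid y; pointwise maximum of the 19 lognormal CDFs
y = np.linspace(0.5, 6.0, 200)
cdfs = np.array([stats.lognorm.cdf(y, s=b, scale=a) for a, b in zip(Am, beta)])
F_max = cdfs.max(axis=0)  # treated as the 95% "high-confidence" envelope

def sse(p, w=None):
    # Weighted sum of squared CDF residuals; uniform weights by default,
    # but w could emphasize the lower tail if that region matters most
    a, b = p
    r = stats.lognorm.cdf(y, s=b, scale=a) - F_max
    if w is None:
        w = np.ones_like(r)
    return np.sum(w * r**2)

# Fit a lognormal (Am, beta) to the envelope by least squares
res = optimize.minimize(sse, x0=[1.0, 0.3],
                        bounds=[(1e-3, None), (1e-3, None)])
Am_hc, beta_hc = res.x
print(Am_hc, beta_hc)
```

Because F_max is a pointwise maximum of lognormal CDFs, the fitted median lands below the typical expert median, which is what makes the result conservative.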

By Weighted Least Squares

What if there aren’t 19 experts? Bootstrap
– Correl(X1, X2) estimation requires at least one subjective opinion on the distribution of X1|X2
– Use least squares to combine the experts’ subjective distribution information
– The sum of squared errors indicates the magnitude of the experts’ deviation from a lognormal distribution
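A minimal bootstrap sketch for a smaller panel (the 10 opinion values below are hypothetical): resample the opinions with replacement to gauge the sampling variability of the combined estimate.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical panel of only 10 expert opinions on the fragility median Am
Am = np.array([1.8, 2.1, 1.9, 2.4, 2.0, 2.2, 1.7, 2.3, 2.0, 1.9])

# Bootstrap: resample the 10 opinions with replacement and recompute
# the panel median each time
B = 1000
boot_medians = np.empty(B)
for b in range(B):
    idx = rng.integers(0, len(Am), size=len(Am))
    boot_medians[b] = np.median(Am[idx])

# Percentile interval for the panel median
lo, hi = np.percentile(boot_medians, [2.5, 97.5])
print(lo, hi)
```

The same resampling scheme applies to any statistic of the panel (e.g. a fitted β or a correlation), not just the median.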

Bootstrap 10 experts

NoFail.xlsm Spreadsheets

Spreadsheet | Contents
Freq        | MLE of lognormal fragility parameters from 19 non-failure responses, assuming the 20th would be a failure
Bayes       | MoM estimates of lognormal fragility parameters of the a-posteriori distribution of P[Failure|Eq] from 19 non-failure responses, assuming a non-informative prior
BayesCorr   | Same as Bayes, including an estimate of fragility correlations

Imagine inspections after earthquakes indicate component responses and failure or non-failure.

Freq Spreadsheet Input: iid responses for which no failures occurred
– 19 responses were simulated for the example, for convenient interpretation of the mean and standard deviation estimates as “high confidence”
– Assume ln[Stress] − ln[Strength] ~ N[muX − muY, sqrt(sigmaX^2 + sigmaY^2)]
Use Solver to maximize the log likelihood of ∏ P[Non-failure|Response] subject to a constraint
– Constrain either the CV or P[Failure]
Output is ln(Am) and β
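The Freq-sheet optimization can be sketched in Python in place of Solver (responses simulated here; β is fixed as a stand-in for the CV constraint, and the “assume the 20th response is a failure” device keeps the no-failure MLE finite):

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(3)

# Hypothetical: 19 iid responses with no failures; assume a 20th,
# slightly larger response would have been a failure (as in the Freq sheet)
responses = rng.lognormal(mean=0.5, sigma=0.1, size=19)
failed_response = 1.05 * responses.max()

beta = 0.3  # fixed log-standard deviation, standing in for the CV constraint

def neg_loglik(ln_Am):
    # P[non-failure | response y] = P[Strength > y] = 1 - Phi((ln y - ln Am)/beta)
    z_ok = (np.log(responses) - ln_Am) / beta       # survived
    z_f = (np.log(failed_response) - ln_Am) / beta  # assumed failure
    return -(np.sum(stats.norm.logsf(z_ok)) + stats.norm.logcdf(z_f))

res = optimize.minimize_scalar(neg_loglik, bounds=(-2.0, 5.0), method="bounded")
print(np.exp(res.x))  # estimated fragility median Am
```

Without the assumed 20th failure (or an equivalent constraint), the likelihood of all-non-failures increases without bound as the median grows, which is exactly why the spreadsheet imposes a constraint.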

Bayes Spreadsheets Bayes estimate of reliability r = P[ln(Strength) − ln(Stress) > 0]
– Non-informative prior distribution on r
– Same inputs as Freq: responses and non-failures
– Use MoM to find ln(Am) and β to match the posterior E[r] = n/(n+1) and Var[r] = n/((n+1)^2 (n+2))
– Ditto to find the correlation ρ from the third moment of the a-posteriori distribution of r
Bayes posterior: P[ESEL component life > 72 hours | Eq and plant test data]
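The posterior moments quoted above can be checked directly: they are exactly the mean and variance of a Beta(n, 1) distribution, so the MoM targets for n = 19 non-failures give E[r] = 19/20 = 0.95.

```python
from scipy import stats

n = 19  # non-failure observations

# Posterior moments of reliability r as quoted on the slide
E_r = n / (n + 1)
Var_r = n / ((n + 1) ** 2 * (n + 2))

# These coincide with the moments of a Beta(n, 1) distribution
b = stats.beta(n, 1)
print(E_r, b.mean())   # both 0.95
print(Var_r, b.var())
```

The third moment of the same Beta posterior is what the BayesCorr sheet uses to back out the correlation ρ.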

Parameter Estimates from 19 Non-Failure Responses Given 19 earthquake responses with ln(Median) = 0.5, β = 0.1, and reliability P[ln(Stress) < ln(Strength)] ≈ 95%
– Bayes non-informative prior on reliability
– Use the Method of Moments to estimate parameters for the a-posteriori distribution of reliability

Parameter            | Assume 20th is failure | Bayes | Bayes Correlation
median ln(strength)  | …                      | …     | …
β ln(strength)       | …                      | …     | …
ρ                    | N/A                    | …     | …
(numeric table entries were not recoverable from the transcript)

What is the correlation of fragilities? See the SubjFrag.xlsx:SubjCorr and NoFail.xlsm:BayesCorr spreadsheets to estimate correlations from subjective opinions on Y1|Y2 or from no-failure response observations
– HCLPF ignores fragility correlation
– Risk doesn’t ignore it

What if there are multiple, co-located components? Could assume responses are the same; this simplifies computations
– In series? Parallel? RBD? Fault tree?
– Using event trees, some people argue that the HCLPF for one component is representative of all like, co-located components
– Only if all like, co-located components are in the same safety system and not in any others

What if like components are dependent? Fragilities could be dependent too! But not necessarily all fail if one fails
– True, P[Response > Strength] may be the same for all like, co-located components
– But what is P[g(Stress, Strength) = failure] for the system structure function g(·,·)?
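A Monte Carlo sketch (all parameters hypothetical) of why the structure function matters: with a shared stress and correlated strengths, the series and parallel failure probabilities bracket the single-component probability, and neither equals it.

```python
import numpy as np

rng = np.random.default_rng(4)

# Two co-located components: common stress, correlated lognormal strengths
n = 100_000
rho = 0.7                       # assumed strength correlation
ln_Am, beta = np.log(2.0), 0.3  # assumed fragility median and log-sd

# Correlated log-strengths via a bivariate normal
cov = beta**2 * np.array([[1.0, rho], [rho, 1.0]])
ln_strength = rng.multivariate_normal([ln_Am, ln_Am], cov, size=n)

# Common response (stress) seen by both components
ln_stress = rng.normal(0.3, 0.2, size=n)

fail = ln_stress[:, None] > ln_strength   # componentwise failure indicator
p_one = np.mean(fail[:, 0])               # single component
p_series = np.mean(fail.any(axis=1))      # series system: any failure fails it
p_parallel = np.mean(fail.all(axis=1))    # parallel system: both must fail
print(p_one, p_series, p_parallel)
```

Treating one component’s HCLPF as representative of the system implicitly assumes g(·,·) and the correlation away; the gap between p_series and p_parallel is a direct measure of what that assumption hides.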

References NAP, “Review of Recommendations for Probabilistic Seismic Hazard Analysis: Guidance on Uncertainty and the Use of Experts (1997) / Treatment of Uncertainty,” National Academies Press.