1
JHU Workshop: 13-14 November 2003
Elements of Predictability: Robust System Analysis, Model Updating and Model Class Selection
James L. Beck
Applied Mechanics & Civil Engineering, California Institute of Technology
2
Introduction
Part I: Present a probabilistic framework for robust system analysis and its updating based on system data. Robust means that uncertainties in system modeling are explicitly addressed.
Part II: Present a probabilistic framework for model class selection based on system data.
3
Introduction: Features
Framework is based on the probability axioms and no other ad-hoc criteria or concepts (repeated use of the Total Probability Theorem and Bayes' Theorem).
Employs probabilities of models: uses the Cox-Jaynes interpretation of probability as a multi-valued logic quantifying the plausibility of statements conditional on specified information.
Careful tracking of all conditioning information, since all probabilities are conditional on probability models and other specified information.
4
Introduction: Features
Involves integrations over high-dimensional input and model parameter spaces, but computational tools are available and are improving with research.
Framework is general, but our focus is on dynamical systems in engineering, so much prior information is available.
5
Part I: Robust System Analysis
Stochastic System Modeling: probabilistic input-output model for the system, with uncertainties in system modeling addressed.
Prior System Analysis: uncertainties in system excitation addressed; reliability analysis.
Posterior System Analysis: Bayesian updating based on system data; updated reliability analysis.
6
Stochastic System Modeling
Predictive model: gives the probabilistic input-output relation for the system, where the input (if available) and the output are time histories. [Slide diagram: Input → System → Output.]
7
Stochastic System Modeling (Continued)
Usually have a set of possible predictive models for the system.
Nominal prior predictive model: select a single model, e.g. the most plausible model in the set. But there is uncertainty in which model gives the most accurate predictions.
8
Stochastic System Modeling (Continued)
Robust prior predictive model: select a prior probability distribution to quantify the plausibility of each model in the set, then apply the Total Probability Theorem (see the reconstruction below).
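The equation referred to above did not survive transcription. A minimal reconstruction in generic notation, assuming θ indexes the models in the set, u is the input, y the output and M the model class (these symbols are not taken from the slides):

```latex
% Robust prior predictive model via the Total Probability Theorem:
% the chosen prior PDF p(theta | M) quantifies the plausibility of each model.
p(\mathbf{y} \mid \mathbf{u}, \mathcal{M})
  = \int p(\mathbf{y} \mid \mathbf{u}, \theta, \mathcal{M})\,
         p(\theta \mid \mathcal{M})\, d\theta
```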
9
Stochastic System Model: Example 1
Complete system input known: define a deterministic input-output model for the system [slide diagram: Input → System → Output]; a PDF for the prediction-error time history (measured output minus model output) then gives the predictive PDF.
Can take the prediction errors as zero-mean Gaussian and independent in time (the maximum-entropy distribution), so the predictive PDF is Gaussian, with mean given by the deterministic model output and covariance given by the prediction-error covariance matrix (a sketch follows below).
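The equations on this slide were not transcribed. A sketch of the standard prediction-error construction, assuming hypothetical symbols q_n(θ, u) for the deterministic model output at time t_n and Σ for the prediction-error covariance:

```latex
% Prediction error: measured output minus deterministic model output
e_n = y_n - q_n(\theta, \mathbf{u}), \qquad n = 1, \dots, N
% Zero-mean Gaussian, independent in time (maximum-entropy choice):
e_n \sim \mathcal{N}(0, \Sigma)
% so the predictive PDF is Gaussian with mean q_n and covariance \Sigma:
p(\mathbf{y} \mid \mathbf{u}, \theta, \mathcal{M})
  = \prod_{n=1}^{N} \mathcal{N}\!\big(y_n \,\big|\, q_n(\theta, \mathbf{u}),\, \Sigma\big)
```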
10
Stochastic System Model: Example 2
Complete system input not known: define a state-space dynamic model for the system [slide diagram: Input (unknown) → System → Output (known)].
Probability models for the initial state and for the unknown time histories (input and uncertain model terms) then define the predictive PDF (a generic form is sketched below).
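A generic discrete-time state-space form consistent with the slide's description; the functions f and g and the uncertain terms w_n and v_n are assumptions for illustration:

```latex
x_{n+1} = f(x_n, u_n, w_n; \theta), \qquad y_n = g(x_n; \theta) + v_n
% Probability models for the initial state x_0 and for the unknown time
% histories (u_n, w_n, v_n) define the predictive PDF p(y | theta, M)
% by marginalizing over these uncertain quantities.
```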
11
Prior System Analysis
Excitation uncertainty: choose a probability model over the set of possible system inputs.
Nominal prior predictive analysis: find the probability that the system output lies in a specified set, using the nominal model (see the sketch below). The reliability problem corresponds to this set defining 'failure' (= specified unacceptable performance of the system).
Primary computational tools for complex dynamical systems are advanced simulation methods (examples later); Rice's out-crossing theory is used for simple systems.
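A sketch of the nominal prior predictive failure probability in the same assumed notation, with θ̂ the nominal model, p(u) the chosen probability model for the input, and F the specified unacceptable-output set:

```latex
P(F \mid \hat\theta, \mathcal{M})
  = \int I_F(\mathbf{y})\,
         p(\mathbf{y} \mid \mathbf{u}, \hat\theta, \mathcal{M})\,
         p(\mathbf{u})\, d\mathbf{y}\, d\mathbf{u}
% I_F = indicator function of the failure set F.
```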
12
Prior System Analysis (Continued)
Robust prior predictive analysis: robust reliability if the specified output set defines failure (see the sketch below).
Primary computational tools:
Simulation, e.g. importance sampling with the ISD placed at the peak(s) of the integrand (needs optimization).
Asymptotic approximation with respect to the curvature of the peak(s) of the integrand (needs optimization).
Huge differences are possible between nominal and robust failure probabilities.
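The corresponding robust (prior) failure probability averages over all models in the class, again in the assumed notation:

```latex
P(F \mid \mathcal{M})
  = \int P(F \mid \theta, \mathcal{M})\, p(\theta \mid \mathcal{M})\, d\theta
```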
13
Prior System Analysis (Continued)
Comparisons between nominal and robust failure probabilities are available in:
Papadimitriou, Beck & Katafygiotis, "Updating Robust Reliability using Structural Test Data", Prob. Eng. Mech., April 2001.
Au, Papadimitriou & Beck, "Reliability of Uncertain Dynamical Systems with Multiple Design Points", Structural Safety, 1999.
14
Posterior System Analysis
Available system data: update by Bayes' Theorem (see the sketch below).
Optimal posterior predictive model: select the most plausible model in the set based on the data, i.e. the one that maximizes the posterior PDF (if unique).
Optimal posterior predictive analysis. Difficulties: non-convex multi-dimensional optimization ('parameter estimation'); ignores model uncertainty.
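A sketch of the Bayesian updating step and the optimal (most probable) model, assuming D denotes the system data:

```latex
% Bayes' Theorem for the model parameters:
p(\theta \mid D, \mathcal{M})
  = \frac{p(D \mid \theta, \mathcal{M})\, p(\theta \mid \mathcal{M})}
         {p(D \mid \mathcal{M})}
% Optimal posterior predictive model: the most plausible parameters
\hat\theta = \arg\max_{\theta}\, p(\theta \mid D, \mathcal{M})
```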
15
Posterior System Analysis (Continued)
Robust posterior predictive model: use all predictive models weighted by their updated probability (the exact solution based on the probability axioms; see the sketch below).
Robust posterior predictive analysis: primary computational tools are advanced simulation methods and asymptotic approximation with respect to the sample size.
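The robust posterior predictive model in the same notation (exact by the Total Probability Theorem):

```latex
p(\mathbf{y} \mid D, \mathcal{M})
  = \int p(\mathbf{y} \mid \theta, \mathcal{M})\,
         p(\theta \mid D, \mathcal{M})\, d\theta
```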
16
Posterior System Analysis (Continued)
Asymptotic approximation for large N for robust posterior predictive analysis (Beck & Katafygiotis, J. Eng. Mech., April 1998; Papadimitriou, Beck & Katafygiotis, Prob. Eng. Mech., April 2001).
Assumes the system is identifiable based on the data, i.e. there is a finite number of most probable values (MPVs) that locally maximize the posterior PDF, so optimization is needed.
Uses Laplace's method for the asymptotic approximation.
17
The weights are proportional to the volume under the peak of the posterior PDF at each MPV (a sketch follows below).
The globally identifiable case justifies using the MPV for the posterior predictive model when there is a large amount of data: this gives a rigorous justification for making predictions with the MPV or MLE model. The error in the approximation decreases as the amount of data grows.
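A sketch of the standard Laplace (large-N) asymptotic form consistent with the description above; the Hessian notation H_k is an assumption, and the exact expression in the cited papers may differ in detail:

```latex
% Identifiable case: finite set of MPVs \hat\theta_k
p(\mathbf{y} \mid D, \mathcal{M})
  \approx \sum_{k} w_k\, p(\mathbf{y} \mid \hat\theta_k, \mathcal{M}),
\qquad
 w_k \propto p(D \mid \hat\theta_k, \mathcal{M})\,
             p(\hat\theta_k \mid \mathcal{M})\,
             \big[\det H_k\big]^{-1/2}
% H_k = Hessian of -ln p(theta | D, M) at \hat\theta_k (volume under the peak).
% Globally identifiable case: a single MPV with w_1 = 1; relative error is O(1/N).
```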
18
Posterior System Analysis (Continued)
The unidentifiable case corresponds to a continuum of MPVs lying on a lower-dimensional manifold in the parameter space. Our interest in this case is driven by finite-element model updating.
The asymptotic approximation for the posterior predictive model for a large amount of data is an integral over this manifold – feasible if it is of low dimension (<4?) (Katafygiotis, Papadimitriou and Lam, Soil Dyn. Eq. Eng., 1998; Papadimitriou, Beck and Katafygiotis, Prob. Eng. Mech., April 2001).
All MPV models give similar predictions at the observed DOFs but may be quite different at unobserved DOFs.
19
Posterior System Analysis (Continued)
Simulation approaches: very challenging because most of the probability content of the posterior PDF is concentrated in a small volume of the parameter space (importance sampling does not work).
Potential for avoiding difficult non-convex multi-dimensional optimization and for handling the unidentifiable case in higher dimensions.
Markov Chain Monte Carlo simulation using the Metropolis-Hastings algorithm shows promise (Beck & Au, J. Eng. Mech., April 2002: adaptive method that works OK for up to 10 or so model parameters). A generic (non-adaptive) sketch follows below.
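A minimal random-walk Metropolis-Hastings sketch in Python for sampling a posterior PDF. This is a generic illustration, not the adaptive method of Beck & Au (2002), and the example target is a stand-in for p(θ | D, M):

```python
import numpy as np

def metropolis_hastings(log_posterior, theta0, n_samples, step=0.1, rng=None):
    """Random-walk Metropolis-Hastings sampler.

    log_posterior: function returning ln p(theta | D, M) up to an additive constant.
    theta0: starting parameter vector.
    step: standard deviation of the Gaussian random-walk proposal.
    """
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float)
    logp = log_posterior(theta)
    samples = np.empty((n_samples, theta.size))
    for i in range(n_samples):
        proposal = theta + step * rng.standard_normal(theta.size)
        logp_prop = log_posterior(proposal)
        # Accept with probability min(1, posterior ratio)
        if np.log(rng.uniform()) < logp_prop - logp:
            theta, logp = proposal, logp_prop
        samples[i] = theta
    return samples

# Example: sample a 2-D Gaussian "posterior" (hypothetical stand-in target)
if __name__ == "__main__":
    log_post = lambda th: -0.5 * np.sum((th - 1.0) ** 2)
    draws = metropolis_hastings(log_post, theta0=np.zeros(2), n_samples=5000)
    print(draws.mean(axis=0))  # should be close to [1, 1]
```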
20
Recent Simulation Tools for Dynamic Reliability Calculations
21
Dynamic Reliability: First-Excursion Problem
What is the probability that any response quantity of interest exceeds some limits within a given duration of interest?
[Slide figure: response time history with thresholds +b and -b; exceedance marked "Failure".]
22
Advanced Simulation Methods
Goal is to achieve accurate estimates of small failure probabilities using far fewer samples than standard Monte Carlo simulation, which is not efficient for small failure probabilities (a minimal illustration follows below).
The general philosophy for improving simulation is to gain more information about the failure region from samples or from prior analysis.
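A minimal sketch of standard Monte Carlo failure-probability estimation, illustrating why it needs so many samples when P_F is small; the single-degree-of-freedom oscillator, its parameters and the threshold are hypothetical choices for illustration only:

```python
import numpy as np

def simulate_response_peak(excitation, dt=0.01, wn=6.0, zeta=0.05):
    """Peak absolute response of a linear SDOF oscillator to a sampled
    white-noise excitation (semi-implicit Euler; illustrative only)."""
    x, v, peak = 0.0, 0.0, 0.0
    for a in excitation:
        v += dt * (a - 2 * zeta * wn * v - wn**2 * x)
        x += dt * v
        peak = max(peak, abs(x))
    return peak

def mcs_failure_probability(threshold, n_samples=10_000, n_steps=1000, rng=None):
    """Standard Monte Carlo estimate of P(peak response > threshold)."""
    rng = np.random.default_rng() if rng is None else rng
    fails = sum(
        simulate_response_peak(rng.standard_normal(n_steps)) > threshold
        for _ in range(n_samples)
    )
    p_f = fails / n_samples
    # c.o.v. of the estimator: sqrt((1 - P_F) / (N * P_F)) -- blows up for small P_F
    cov = np.sqrt((1 - p_f) / (n_samples * p_f)) if p_f > 0 else np.inf
    return p_f, cov
```

For P_F around 0.001, the 10,000 samples used here give a c.o.v. of roughly 30%, which is the regime where the advanced methods below pay off.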
23
Recent Advanced Simulation Methods
ISEE (Importance Sampling using Elementary Events): for linear dynamical systems with uncertain excitation. Requires fewer than 50 system analyses for any first-excursion failure probability to get a c.o.v. of 20% (Au and Beck, Prob. Eng. Mech., July 2001).
Subset Simulation: applicable to general dynamical systems with both excitation and modeling uncertainties. Substantial improvement in efficiency over standard MCS for small failure probabilities (Au and Beck, Prob. Eng. Mech., October 2001; Au and Beck, J. Eng. Mech., August 2003).
24
First-Excursion Problem: Discrete Form
For simulation, formulate the failure probability as a multi-dimensional integral over the input variable space (see the form below).
Challenge: the high dimension of the input variable space.
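The discrete-form integral in generic notation, assuming Z denotes the vector of input random variables, y_n(z) the response at the n-th time instant, and ±b the threshold:

```latex
P(F) = \int I_F(\mathbf{z})\, p(\mathbf{z})\, d\mathbf{z},
\qquad
 F = \bigcup_{n=1}^{N} \big\{ \mathbf{z} : |y_n(\mathbf{z})| > b \big\}
% I_F = indicator of the failure set; p(z) = joint PDF of the input variables.
```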
25
Importance Sampling Simulation
Estimation of the failure probability:
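A generic form of the importance sampling estimator referred to here, with q denoting the importance sampling density (symbols assumed):

```latex
\hat P_F = \frac{1}{N_s} \sum_{k=1}^{N_s}
           I_F(\mathbf{z}_k)\, \frac{p(\mathbf{z}_k)}{q(\mathbf{z}_k)},
\qquad \mathbf{z}_k \sim q \ \text{(i.i.d.)}
```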
26
Importance Sampling Density
Goal is to choose the importance sampling density (ISD) to reduce the variance of the estimator of the failure probability.
The theoretically optimal choice for the ISD is the conditional PDF given failure, but this is not feasible since its normalizing constant is the unknown failure probability!
The ISD should therefore be chosen to approximate this conditional PDF, which requires information about the failure region.
The usual choice is a weighted sum of Gaussian PDFs located at "design points", i.e. points of maximum probability density within the failure region (a generic sketch follows below).
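A minimal Python sketch of this construction: an ISD built as an equally weighted mixture of unit-covariance Gaussians centred at user-supplied design points in a standard Gaussian input space. The function name and the equal weighting are assumptions for illustration:

```python
import numpy as np
from scipy.special import logsumexp

def importance_sampling(indicator, design_points, n_samples=1000, rng=None):
    """Estimate P(F) = E[indicator(Z)] for Z ~ N(0, I) with an ISD that is an
    equally weighted mixture of unit-covariance Gaussians at the design points."""
    rng = np.random.default_rng() if rng is None else rng
    dps = np.atleast_2d(np.asarray(design_points, dtype=float))
    m, dim = dps.shape
    weights = np.empty(n_samples)
    for k in range(n_samples):
        z = dps[rng.integers(m)] + rng.standard_normal(dim)      # sample from ISD
        log_p = -0.5 * z @ z                                      # ln p(z) + const
        log_q = logsumexp([-0.5 * (z - dp) @ (z - dp) for dp in dps]) - np.log(m)
        weights[k] = indicator(z) * np.exp(log_p - log_q)         # I_F(z) p(z)/q(z)
    p_hat = weights.mean()
    cov = weights.std(ddof=1) / (np.sqrt(n_samples) * p_hat) if p_hat > 0 else np.inf
    return p_hat, cov
```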
27
Solution of the First-Excursion Problem by Importance Sampling
Challenges:
The dimension of the uncertain input parameter space is extremely large (e.g. 1000).
Calculation of the design points is, in general, a difficult non-convex optimization problem.
The geometry of the failure region is difficult to visualize.
Solving the first-excursion problem by importance sampling for general dynamical systems has not yet been successful.
28
ISEE: Key Aspects
ISEE focuses on linear dynamical systems, so:
Information about the failure region can be obtained analytically.
Design points can be readily obtained using unit impulse response functions.
29
ISEE: Elementary failure events/regions
Consider one output response.
30
ISEE: Elementary failure regions
Analytical study of the elementary failure regions in the standard Gaussian input space.
Linear dynamics, so each failure surface consists of two parallel hyperplanes.
31
ISEE: Elementary failure region
How do we obtain each design point?
[Slide figure: an elementary failure region in the standard Gaussian space.]
32
ISEE: Design point for each elementary failure region
The design point is the 'critical excitation' (Drenick 1970), i.e. the excitation with the least 'energy' (Euclidean norm) that drives the response to the threshold level at the given time instant.
It is readily obtained from the unit impulse response functions (see the sketch below).
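A sketch of the design-point formula implied by the slide, assuming a discretized convolution with unit impulse response ordinates h_k and a standard Gaussian input vector z (symbols assumed):

```latex
% Linear response at time t_n as a convolution of the input:
y_n(\mathbf{z}) = \sum_{k=1}^{n} h_{n-k}\, z_k
% Design point for the elementary failure event at time t_n and level b:
% the minimum-norm excitation that reaches the threshold,
\mathbf{z}^{*(n)} = \arg\min\big\{ \|\mathbf{z}\| : y_n(\mathbf{z}) = b \big\}
\;\;\Longrightarrow\;\;
 z^{*(n)}_k = \frac{b\, h_{n-k}}{\sum_{j=1}^{n} h_{n-j}^{2}}, \qquad k \le n
```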
33
ISEE: Elementary failure regions
We know a lot about each elementary failure region:
Its design point.
Its probability content.
Efficient generation of random samples according to the conditional PDF.
34
ISEE: Union of elementary failure regions
The complexity of the first-excursion problem stems from the union of the elementary failure events.
In addition to the 'global' design point, a large number of neighboring design points are important in accounting for the failure probability, and hence should be used in constructing the importance sampling density.
35
ISEE: Conventional choice of ISD
36
ISEE: Optimal choice of ISD
37
ISEE: Failure Probability Estimator
Counting the number of time instants at which the response fails:
an estimate assuming the elementary failure events are mutually exclusive (analytical), times
a correction term to be estimated by simulation (equal to 1 if the elementary failure events really are mutually exclusive). A schematic form is given below.
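A schematic of the estimator structure described above, with E_n denoting the elementary failure event at the n-th time instant; this follows the slide's wording and is not the exact expression from Au & Beck (2001):

```latex
P(F) = P\Big(\textstyle\bigcup_n E_n\Big) = c \sum_{n} P(E_n)
% \sum_n P(E_n): analytical estimate, exact if the E_n are mutually exclusive;
% c: correction factor estimated by simulation, with c = 1 when the E_n
%    really are mutually exclusive.
```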
38
ISEE: Applicability
Linear dynamics with Gaussian input.
Non-stationary input, e.g. white noise modulated by an envelope function.
Multiple input excitations.
Multiple output responses.
Non-white input, e.g. filtered white noise.
39
ISEE: Linear 3-bay-by-6-story steel frame
Atkinson-Silva model for the ground motion.
40
ISEE: Linear 3-bay-by-6-story steel frame
Failure probability estimates.
41
ISEE: Concluding Remarks
ISEE is built upon the following observations:
Analysis of the failure region as a union of elementary failure regions corresponding to failure at different time instants.
For linear systems, the design points can be obtained in terms of unit impulse response functions.
In addition to the global design point, a large number of design points are important in contributing to the failure probability.
The importance sampling density is constructed as a weighted sum of conditional failure PDFs.
42
ISEE: Concluding Remarks (Continued)
It is remarkably efficient for the first-excursion problem with linear dynamics and Gaussian excitation.
The failure probability estimator for N samples has a c.o.v. proportional to 1/sqrt(N), with a coefficient that decreases slightly with decreasing failure probability; in contrast, Monte Carlo simulation has a c.o.v. of sqrt((1 - P_F)/(N P_F)), which grows as the failure probability P_F decreases.
43
Subset Simulation: Basic Idea
For a c.o.v. of 30%, standard Monte Carlo simulation needs on the order of 10/P_F samples, which becomes prohibitive for small failure probabilities.
Can we avoid direct simulation of rare events?
44
Subset Simulation: Basic Idea
Express the failure probability as a product of conditional failure probabilities, e.g. for the interstory drift ratio (IDR):
P(IDR > 1.5%) = P(IDR > 1.5% | IDR > 1%) × P(IDR > 1% | IDR > 0.5%) × P(IDR > 0.5%)
With each factor roughly 0.1, the product is roughly 0.001: estimating 0.001 directly by MCS needs about 10,000 samples, whereas each conditional probability of about 0.1 needs only about 100 samples.
How do we efficiently simulate conditional samples to estimate the conditional probabilities?
45
Subset Simulation with Markov Chain Monte Carlo (MCMC)
Generate conditional samples by MCMC simulation for each nested failure region: use a specially designed Markov chain so that at each level the conditional samples are distributed as the conditional PDF given the intermediate failure event.
MH algorithm: Metropolis et al. (1953), Hastings (1970).
The samples are dependent, but the same failure probability estimator as in MCS can be used. A minimal sketch follows below.
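A minimal Python sketch of subset simulation in a standard Gaussian input space. It uses a plain random-walk Metropolis move for the conditional sampling, which is a simplification of the component-wise modified Metropolis algorithm used by Au & Beck; all function names and parameter values are illustrative assumptions:

```python
import numpy as np

def subset_simulation(response, dim, threshold, p0=0.1, n_per_level=500,
                      max_levels=10, step=1.0, rng=None):
    """Estimate P(response(Z) > threshold) for Z ~ N(0, I)."""
    rng = np.random.default_rng() if rng is None else rng
    # Level 0: direct Monte Carlo
    z = rng.standard_normal((n_per_level, dim))
    y = np.array([response(zi) for zi in z])
    p_f = 1.0
    n_keep = int(p0 * n_per_level)
    chain_len = n_per_level // n_keep
    for _ in range(max_levels):
        order = np.argsort(y)[::-1]                 # largest responses first
        b = y[order[n_keep - 1]]                    # intermediate threshold
        if b >= threshold:                          # target level reached
            return p_f * np.mean(y > threshold)
        p_f *= p0
        seeds_z, seeds_y = z[order[:n_keep]], y[order[:n_keep]]
        z_list, y_list = [], []
        # Grow each seed into a short Markov chain conditional on {response > b}
        for zi, yi in zip(seeds_z, seeds_y):
            zi = zi.copy()
            for _ in range(chain_len):
                cand = zi + step * rng.standard_normal(dim)
                # Metropolis accept w.r.t. the standard Gaussian target ...
                if np.log(rng.uniform()) < 0.5 * (zi @ zi - cand @ cand):
                    yc = response(cand)
                    # ... then keep the move only if it stays in {response > b}
                    if yc > b:
                        zi, yi = cand, yc
                z_list.append(zi.copy())
                y_list.append(yi)
        z, y = np.array(z_list), np.array(y_list)
    return p_f * np.mean(y > threshold)

# Example: hypothetical linear "response" y = h.z, so P(y > 4) is about 3e-5
if __name__ == "__main__":
    h = np.ones(100) / 10.0
    resp = lambda zz: float(h @ zz)
    print(subset_simulation(resp, dim=100, threshold=4.0))
```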
46
Subset Simulation: Concluding Remarks
Efficient failure probability estimation for general dynamical systems.
Expresses a small failure probability as a product of larger conditional failure probabilities.
Substantial improvement over standard Monte Carlo for small failure probabilities (1,400 samples at 0.001 compared with 10,000 for the same c.o.v.).
Applicable in high dimensions (e.g. 1000 or more).
47
Subset Simulation: Concluding Remarks (Continued)
The failure probability estimator for N samples has a c.o.v. proportional to 1/sqrt(N), with a coefficient that grows only slowly (roughly like a power of |ln P_F|) as the failure probability decreases, whereas Monte Carlo simulation has a c.o.v. of sqrt((1 - P_F)/(N P_F)).
48
Appendix to Part I: Foundations of plausibility logic and probability
49
Plausibility Logic and Probability Axioms
Axiomatic foundations: [R. T. Cox, 1946, 1961, "The Algebra of Probable Inference", Johns Hopkins Press; E. T. Jaynes, 1957, 2003, "Probability Theory: The Logic of Science", C.U.P.]
Consider an extension of binary Boolean logic: a plausibility measure that ranks a above b, given c, whenever a is more plausible than b given c.
What is the calculus of plausibility logic? How do we get probability from it?
50
Plausibility Logic and Probability Axioms
Impose conventions, and impose consistency with Boolean logic: the plausibility of a given c takes its largest possible value if c logically implies that a is true, and its smallest possible value if c logically implies that a is false.
51
Plausibility Logic and Probability Axioms
Theorem (Cox 1946 + Aczél 1966): under a strict monotonicity condition on the plausibility measure, it can be rescaled so that it satisfies the product and sum rules of probability. The scale of plausibility is arbitrary, so set the rescaled measure to be the probability.
52
Plausibility Logic and Probability Axioms
This gives axioms for the plausibility measure; De Morgan's Law supplies the remaining relation, so the plausibility-logic axioms = the probability-logic axioms! (A generic statement of the resulting axioms is given below.)
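A generic statement of the resulting axioms (standard Cox-Jaynes form; the notation is an assumption, not transcribed from the slide):

```latex
% For statements a, b and conditioning information c:
P(a \mid c) \ge 0, \qquad P(a \mid c) = 1 \ \text{if } c \text{ logically implies } a
P(\neg a \mid c) = 1 - P(a \mid c)                 % negation (sum) rule
P(a\, b \mid c) = P(a \mid b\, c)\, P(b \mid c)    % product rule
% With De Morgan's Law this yields
P(a \lor b \mid c) = P(a \mid c) + P(b \mid c) - P(a\, b \mid c)
```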
53
Plausibility Logic and Probability Axioms
Conclusion: interpret a probability defined by a PDF over model parameters as a measure of the plausibility of the model specified by those parameters, given the stated information.
54
References: R. T. Cox (1946, 1961), The Algebra of Probable Inference, Johns Hopkins Press; E. T. Jaynes (1957, 2003), Probability Theory: The Logic of Science, Cambridge University Press.