EGGN 598 – Probabilistic Biomechanics Ch.7 – First Order Reliability Methods Anthony J Petrella, PhD.

Review: Reliability Index
To address the limitations of risk-based reliability with greater efficiency than Monte Carlo, we introduce the safety index or reliability index, β
Consider the familiar limit state, Z = R – S, where R and S are independent normal variables
Then we can write β in terms of the moments of Z, and POF = P(Z ≤ 0), which can be found as follows…
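As a brief reminder of the relations behind this slide (standard results for the normal R – S limit state, written here in LaTeX since the slide's own equation images are not reproduced in the transcript):

```latex
% Standard relations for Z = R - S with independent normal R and S
% (reconstructed from standard theory, not copied from the slide images).
\begin{aligned}
\mu_Z &= \mu_R - \mu_S, \qquad
\sigma_Z = \sqrt{\sigma_R^{2} + \sigma_S^{2}},\\[4pt]
\beta &= \frac{\mu_Z}{\sigma_Z}
       = \frac{\mu_R - \mu_S}{\sqrt{\sigma_R^{2} + \sigma_S^{2}}},\\[4pt]
\mathrm{POF} &= P(Z \le 0)
  = \Phi\!\left(\frac{0 - \mu_Z}{\sigma_Z}\right)
  = \Phi(-\beta).
\end{aligned}
```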

Review: MPP
[Figure: limit state in the reduced variate space separating the failure and safe regions]

Geometry of MPP
Recognize that the point on any curve or n-dimensional surface that is closest to the origin is the point at which the function gradient passes through the origin
The distance from the origin is the radius of a circle tangent to the curve/surface at that point (the tangent and the gradient are perpendicular)
The MPP is the point closest to the origin, which is the point of highest likelihood on the joint PDF in the reduced coordinate space
[Figure: limit state with the gradient direction perpendicular to the tangent direction at the MPP]
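In symbols, a compact sketch of this geometry in standard Hasofer-Lind (FORM) notation; the notation is assumed rather than copied from the slide:

```latex
% Standard Hasofer-Lind definitions in the reduced (standard normal) space;
% notation assumed, not copied from the slide.
\begin{aligned}
\mathbf{x}'^{*} &= \arg\min_{g(\mathbf{x}')=0} \lVert \mathbf{x}' \rVert
  && \text{MPP: closest point on the limit state to the origin}\\
\beta_{HL} &= \lVert \mathbf{x}'^{*} \rVert
  && \text{reliability index: that minimum distance (radius of the tangent circle)}\\
\boldsymbol{\alpha} &= \frac{\nabla g(\mathbf{x}'^{*})}{\lVert \nabla g(\mathbf{x}'^{*}) \rVert},
  \qquad \mathbf{x}'^{*} = -\beta_{HL}\,\boldsymbol{\alpha}
  && \text{the gradient at the MPP lies on the line through the origin}
\end{aligned}
```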

Review: AMV Example
For example, consider the non-linear limit state g(X), a function of the random variables l_fem and h_hip, where…

Review: AMV Geometry
Recall the plot depicts (1) the joint PDF of l_fem and h_hip, and (2) limit state curves in the reduced variate space (l_fem', h_hip')
To find g(X) at a certain probability level, we wish to find the g(X) curve that is tangent to a certain probability contour of the joint PDF; in other words, the curve that is tangent to a circle of a certain radius β
We start with the linearization of g(X) and compute its gradient
We look outward along the gradient until we reach the desired probability level
This is the MPP because the linear g(X) is guaranteed to be tangent to the probability contour at that point
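A small sketch of the linearization referred to above (a first-order Taylor expansion in the reduced variates; the expansion point and notation are assumptions, not taken from the slide):

```latex
% First-order (linear) approximation of the limit state in reduced coordinates,
% expanded about a point x0' (initially the means, i.e. the origin of the reduced space).
g_{\mathrm{linear}}(\mathbf{X}') \approx g(\mathbf{x}_0')
  + \nabla g(\mathbf{x}_0') \cdot \left( \mathbf{X}' - \mathbf{x}_0' \right)
```

Because this approximation is linear, each of its level curves is a straight line in the reduced space, so the one tangent to the circle of radius β is reached by stepping a distance β along the unit gradient direction.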

Review: AMV Geometry
The red dot is (l_fem'*, h_hip'*), the tangency point for g_linear,90%
When we recalculate g_90% at (l_fem'*, h_hip'*) we obtain an updated value of g(X), and the curve naturally passes through (l_fem'*, h_hip'*)
Note, however, that the updated curve may not be exactly tangent to the 90% probability circle, so there may still be a small bit of error (see figure below)

AMV+ Method
The purpose of AMV+ is to reduce the error exhibited by AMV
AMV+ simply translates to… "AMV plus iterations"
Recall Step 3 of AMV: assume an initial value for the MPP, usually at the means of the inputs
Recall Step 5 of AMV: compute the new value of the MPP
AMV+ simply involves reapplying the AMV method again at the new MPP from Step 5
AMV+ iterations may be continued until the change in g(X) falls below some convergence threshold

AMV+ Method (NESSUS)
Assuming one is seeking values of the performance function (limit state) at various P-levels, the steps in the AMV+ method are:
1. Define the limit state equation
2. Complete the MV method to estimate g(X) at each P-level of interest; if the limit state is non-linear these estimates will be poor
3. Assume an initial value of the MPP, usually the means
4. Compute the partial derivatives and find alpha (the unit vector in the direction of the function gradient)
5. If you are seeking the performance (value of the limit state) at various P-levels, then there will be a different value of the reliability index β_HL at each P-level. It will be some known value, and you can estimate the MPP for each P-level as… (see the sketch below)
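A sketch of the relation elided at the end of Step 5, written in standard first-order reliability notation; the slide's own equation is not shown in the transcript, and sign conventions can differ between references:

```latex
% Assumed standard AMV relations for Step 5 (not copied from the slide).
% p = probability level, X_i' = reduced variates, alpha_i = unit gradient components.
\begin{aligned}
\alpha_i &= \frac{\partial g/\partial X_i'}
                 {\sqrt{\sum_j \left(\partial g/\partial X_j'\right)^{2}}},
\qquad
\beta_{HL} = \Phi^{-1}(p),\\[4pt]
X_i'^{*} &= -\,\alpha_i\,\beta_{HL}
\qquad \text{(estimated MPP in reduced coordinates at that P-level)}
\end{aligned}
```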

AMV+ Method (NESSUS)
The steps in the AMV+ method (continued):
6. Convert the MPP from reduced coordinates back to original coordinates
7. Obtain an updated estimate of g(X) for each P-level using the relevant MPPs computed in Step 6
8. Check for convergence by comparing g(X) from Step 7 to g(X) from Step 2. If the difference is greater than the convergence criterion, return to Step 3 and use the new MPP found in Step 5
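Below is a minimal Python sketch of the AMV+ loop described above, assuming two normal random variables standing in for the femur/hip example. The limit state function, means, and standard deviations are illustrative assumptions, not the values from the lecture, and the sign convention in Step 5 may differ from the slides:

```python
# Minimal AMV+ sketch (illustrative only; the limit state and parameters are
# hypothetical stand-ins, not the values from the lecture example).
import numpy as np
from scipy.stats import norm

# Hypothetical inputs: X = [l_fem, h_hip], both assumed normal.
mu = np.array([45.0, 10.0])      # assumed means
sigma = np.array([2.0, 1.0])     # assumed standard deviations

def g(x):
    """Hypothetical non-linear limit state in original coordinates."""
    l_fem, h_hip = x
    return l_fem - 0.05 * h_hip**2 - 40.0

def grad_reduced(x_red, h=1e-6):
    """Central-difference gradient of g with respect to the reduced variates."""
    grad = np.zeros_like(x_red)
    for i in range(len(x_red)):
        xp, xm = x_red.copy(), x_red.copy()
        xp[i] += h
        xm[i] -= h
        grad[i] = (g(mu + sigma * xp) - g(mu + sigma * xm)) / (2 * h)
    return grad

def amv_plus(p_level, tol=1e-4, max_iter=20):
    """Estimate g at a given probability level using AMV plus iterations."""
    beta = norm.ppf(p_level)          # reliability index for this P-level
    x_red = np.zeros(len(mu))         # Step 3: start the MPP search at the means
    g_old = g(mu)                     # Step 2 stand-in: mean-value estimate of g
    for _ in range(max_iter):
        grad = grad_reduced(x_red)                 # Step 4: gradient in reduced coords
        alpha = grad / np.linalg.norm(grad)        # Step 4: unit gradient vector
        x_red = -alpha * beta                      # Step 5: MPP estimate (sign convention may differ)
        x_orig = mu + sigma * x_red                # Step 6: back to original coordinates
        g_new = g(x_orig)                          # Step 7: updated g at the MPP
        if abs(g_new - g_old) < tol:               # Step 8: convergence check
            break
        g_old = g_new                              # otherwise reapply AMV at the new MPP
    return g_new, x_orig

if __name__ == "__main__":
    g90, mpp90 = amv_plus(0.90)
    print("g at the 90% level ~", g90, "MPP ~", mpp90)
```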

AMV+ Example
We will continue with the AMV example already started and extend it with the AMV+ method
[Tables: Mean Value Method results and AMV Method – Iteration 1 results, showing X = MPP-1 and g(X)]

AMV+ Example
[Tables: AMV Method – Iteration 1 and AMV Method – Iteration 2 results, showing X = MPP-2 and g(X)]

AMV+ Example
[Figure: comparison of AMV Method – Iteration 1 and Iteration 2 results]

AMV+ Example
[Figure: AMV Method – Iteration 2 results]

AMV+ Example