
1 5. Combination of random variables Objectives: understand why we need a bottom-up approach for reliability analysis; learn how to compute the probability density function, mean value, and standard deviation of functions of random variables; and learn how to approximate the mean value and standard deviation of such functions. We will assume static reliability models for the rest of the course.

2 Bottom-up approach for reliability analysis The flow of the analysis: –Select primitive random variables (based on data and judgment) –Determine the probability distributions of the primitive random variables –Establish the relation between performance and the primitive random variables –Apply probability calculus to obtain the reliability or failure probability

3 Why a bottom-up approach for reliability analysis Sometimes we do not have enough failure data to estimate the reliability of a system. Examples: buildings, bridges, nuclear power plants, offshore platforms, ships. Solution: bottom-up approach to reliability assessment: start with the probability distributions of the primitive (generic) random variables and derive the probability distribution of the performance variables (e.g., failure time). Advantages: –Estimate the probability distributions of input random variables (e.g., yield stress of steel, wind speed) once, and reuse the same distributions of the generic random variables in many different problems. –Identify and reduce important sources of uncertainty and variability.

4 Transformation of random variables, y=g(x) Objective: given the probability distribution of X and the function g(·), derive the probability distribution of Y=g(X).

5 Transformation of random variables: one-to-one case For a one-to-one transformation y=g(x), the interval [x, x+Δx] maps onto [y, y+Δy], and the probability content is preserved: f_X(x)|Δx| = f_Y(y)|Δy|. Taking the limit Δx→0, f_Y(y) = f_X(x) / |dg/dx|, evaluated at x = g⁻¹(y).

6 General transformation When g is not one-to-one, the inverse is multiple-valued: several roots x_1, x_2, … satisfy g(x_i)=y. Each root contributes to the density of Y: f_Y(y) = Σ_i f_X(x_i) / |dg/dx|, with the derivative evaluated at each x_i.
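
The multiple-valued formula can be checked numerically. The sketch below (not from the slides; the example Y = X² is my own choice) simulates Y = X² for standard normal X, where the two roots ±√y make f_Y(y) = f_X(√y)/√y, and compares an empirical density to the formula.

```python
import math
import random

# Y = X^2 with X ~ N(0,1): for y > 0 the roots are x = ±sqrt(y) and
# |dg/dx| = 2|x|, so f_Y(y) = 2 * f_X(sqrt(y)) / (2*sqrt(y)) = f_X(sqrt(y)) / sqrt(y).

def f_X(x):
    # Standard normal density
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def f_Y(y):
    return f_X(math.sqrt(y)) / math.sqrt(y)

random.seed(0)
n = 200_000
samples = [random.gauss(0, 1) ** 2 for _ in range(n)]

# Empirical density of Y on the bin [0.9, 1.1) vs. the formula at y = 1.
lo, hi = 0.9, 1.1
empirical = sum(lo <= y < hi for y in samples) / (n * (hi - lo))
print(empirical, f_Y(1.0))  # the two values should agree closely (≈ 0.24)
```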

7 Functions of many variables For a transformation (Y1, Y2) = g(X1, X2), a region A_x in the (X1, X2) plane maps onto a region A_y in the (Y1, Y2) plane, and the probability content is preserved: P((X1, X2) ∈ A_x) = P((Y1, Y2) ∈ A_y). For a one-to-one transformation the joint densities are related through the Jacobian determinant J = det[∂(y1, y2)/∂(x1, x2)]: f_Y(y1, y2) = f_X(x1, x2) / |J|.

8 Expectation (mean value) and variance In many problems it is impractical to estimate probability density functions, so we work with mean values (expectations) and variances instead. Expectation: –E(aX)=aE(X) –E(X+Y)=E(X)+E(Y) –If X and Y are independent, then E(XY)=E(X)E(Y)
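
These three rules are easy to verify empirically. A minimal sketch with simulated independent uniform variables (the distributions and sample size are arbitrary choices, not from the slides):

```python
import random

random.seed(1)
n = 100_000
xs = [random.uniform(0, 2) for _ in range(n)]  # E(X) = 1
ys = [random.uniform(0, 4) for _ in range(n)]  # E(Y) = 2, independent of X

def mean(v):
    return sum(v) / len(v)

print(mean([3 * x for x in xs]), 3 * mean(xs))   # E(3X) = 3E(X)
print(mean([x + y for x, y in zip(xs, ys)]))     # ≈ E(X) + E(Y) = 3
print(mean([x * y for x, y in zip(xs, ys)]))     # ≈ E(X)E(Y) = 2 (independence)
```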

9 Variance Var(X) = E[(X − E(X))²] = E(X²) − [E(X)]². The standard deviation is the positive square root of the variance and has the same units as X.

10 Covariance Covariance measures the degree to which two variables tend to increase or decrease together: Cov(X, Y) = E[(X − E(X))(Y − E(Y))]. Positive covariance: X and Y tend to move in the same direction; negative covariance: they tend to move in opposite directions.
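
A short sketch of both signs of covariance on simulated data (the linear models for y_pos and y_neg are illustrative choices, not from the slides):

```python
import random

random.seed(2)
n = 100_000

def cov(xs, ys):
    # Sample covariance (population form, dividing by n)
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

xs = [random.gauss(0, 1) for _ in range(n)]
noise = [random.gauss(0, 1) for _ in range(n)]
y_pos = [x + e for x, e in zip(xs, noise)]    # tends to increase with X
y_neg = [-x + e for x, e in zip(xs, noise)]   # tends to decrease as X increases

print(cov(xs, y_pos))  # ≈ +1 (positive covariance)
print(cov(xs, y_neg))  # ≈ -1 (negative covariance)
```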

11 Correlation coefficient The correlation coefficient, ρ, is the covariance normalized by the product of the standard deviations: ρ = Cov(X, Y) / (σ_X σ_Y). It ranges from −1 to +1. Uncorrelated variables have correlation coefficient 0.

12 Relation between correlation and statistical dependence If X and Y are independent, then they are uncorrelated. If X and Y are uncorrelated, they may be either dependent or independent. (Independent pairs of variables form a subset of the uncorrelated pairs.)
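
The classic counterexample makes the one-way implication concrete: with X symmetric about zero, Y = X² is a deterministic function of X (fully dependent) yet uncorrelated with it, since E(XY) = E(X³) = 0 = E(X)E(Y). A sketch:

```python
import random

random.seed(3)
n = 200_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [x * x for x in xs]  # Y depends completely on X

mx, my = sum(xs) / n, sum(ys) / n
c = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
print(c)  # ≈ 0: uncorrelated, even though Y is a function of X
```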

13 Variance of uncorrelated variables If X1, …, Xn are uncorrelated, then Var(X1 + … + Xn) = Var(X1) + … + Var(Xn); the covariance cross-terms vanish.

14 Chebyshev’s inequality Upper bound on the probability that a random variable deviates more than k standard deviations from its mean value: P(|Y − E(Y)| ≥ kσ) ≤ 1/k². The bound is often too large (too conservative) to be useful in practice.
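
To see how conservative the bound is, the sketch below compares Chebyshev's 1/k² with the actual two-sided tail probability of a normal variable at k = 2 (the mean, standard deviation, and k are arbitrary illustrative values):

```python
import random

random.seed(4)
n = 200_000
k, mu, sigma = 2.0, 5.0, 3.0
samples = [random.gauss(mu, sigma) for _ in range(n)]

tail = sum(abs(y - mu) >= k * sigma for y in samples) / n
bound = 1 / k ** 2
print(tail, bound)  # actual ≈ 0.046 for a normal, bound = 0.25: valid but loose
```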


16 Approximations for the mean and variance of a function of random variables Function of one variable, g(X): E(g(X)) ≈ g(E(X)). Standard deviation of g(X) ≈ |dg(X)/dX| × (standard deviation of X). The derivative of g(X) is evaluated at the mean value of X (a first-order Taylor expansion about the mean).
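
A quick check of the one-variable approximation against Monte Carlo, using g(X) = X² with a standard deviation that is small relative to the mean (the distribution and parameter values are illustrative choices, not from the slides):

```python
import math
import random

random.seed(5)
mu, sigma = 10.0, 0.5  # sigma small relative to mu, so the approximation should be good
n = 200_000
gs = [random.gauss(mu, sigma) ** 2 for _ in range(n)]

mc_mean = sum(gs) / n
mc_std = math.sqrt(sum((g - mc_mean) ** 2 for g in gs) / n)

approx_mean = mu ** 2              # g(E(X)) = 100
approx_std = abs(2 * mu) * sigma   # |dg/dX| at the mean, times sigma = 10

print(mc_mean, approx_mean)  # ≈ 100.25 vs 100
print(mc_std, approx_std)    # ≈ 10.0 vs 10.0
```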

17 Approximations for the mean and variance of a function of random variables Function of many variables, g(X1, …, Xn): E(g(X1, …, Xn)) ≈ g(E(X1), …, E(Xn)). Var(g) ≈ Σ_i [∂g/∂Xi]² × Var(Xi) + 2 Σ_i Σ_{j>i} [∂g/∂Xi] × [∂g/∂Xj] × Cov(Xi, Xj). The partial derivatives of g are evaluated at the mean values of X1, …, Xn.
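
A worked sketch of the multivariate formula for g(I, R) = I·R (Ohm's law, V = IR) with independent inputs, so the covariance term drops out. The numerical values are illustrative assumptions:

```python
import math
import random

random.seed(6)
mu_i, sd_i = 2.0, 0.05   # current: mean 2 A, std 0.05 A
mu_r, sd_r = 50.0, 1.0   # resistance: mean 50 ohm, std 1 ohm

# Partial derivatives of g = I*R at the means: dg/dI = R, dg/dR = I.
approx_mean = mu_i * mu_r
approx_var = (mu_r * sd_i) ** 2 + (mu_i * sd_r) ** 2  # covariance term is 0

n = 200_000
vs = [random.gauss(mu_i, sd_i) * random.gauss(mu_r, sd_r) for _ in range(n)]
mc_mean = sum(vs) / n
mc_var = sum((v - mc_mean) ** 2 for v in vs) / n

print(approx_mean, mc_mean)                      # both ≈ 100 V
print(math.sqrt(approx_var), math.sqrt(mc_var))  # both ≈ 3.20 V
```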

18 When are the above approximations good? When the standard deviations of the input variables are small compared to their mean values, and when g is mildly nonlinear, i.e., the derivatives do not change substantially when the input variables change.