ECE 313 Probability with Engineering Applications Lecture 19

Presentation transcript:

Joint Distribution Functions, Independence of Random Variables, Covariance, and Correlation
ECE 313 Probability with Engineering Applications, Lecture 19
Ravi K. Iyer
Dept. of Electrical and Computer Engineering
University of Illinois at Urbana-Champaign

Today's Topics and Announcements
Topic: Joint distribution functions
Announcements:
- Group activity in class next week.
- The final project will be released today. Concepts covered: hypothesis testing, joint distributions, independence, covariance, and correlation.
- Project schedules are posted on Compass and Piazza.

Standby Redundancy: Joint and Conditional Probability Distributions
A standby system is one in which two components are connected in parallel, but only one component needs to be operative for the system to function properly. Initially, power is applied to only one component; the other is kept in a powered-off (de-energized) state. When the energized component fails, it is de-energized and removed from operation, and the second component is energized and connected in its place. If the first component fails at some time τ, the second component's lifetime starts at τ; if the second component then fails at time t (with t > τ), its lifetime is t - τ.
(Figure: a timeline running from t = 0 to τ for the first component, and from τ to t for the second, whose lifetime is t - τ.)
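To make the setup concrete, here is a minimal Monte Carlo sketch of a standby pair, assuming exponentially distributed component lifetimes, perfect failure detection, and instantaneous switchover; the rates lam1 and lam2 are illustrative values, not taken from the slides.

# Minimal sketch of a two-component standby system (illustrative rates).
import numpy as np

rng = np.random.default_rng(0)
lam1, lam2 = 1.0, 0.5               # assumed failure rates, per hour
n = 100_000                         # number of simulated systems

t1 = rng.exponential(1 / lam1, n)   # first component fails at tau = t1
t2 = rng.exponential(1 / lam2, n)   # second component then survives t - tau = t2
t_sys = t1 + t2                     # system failure time t

print("estimated MTTF:", t_sys.mean())        # close to 1/lam1 + 1/lam2 = 3.0
print("estimated R(2):", (t_sys > 2).mean())  # probability the system survives past t = 2

The simulated mean lifetime and survival probabilities can be compared against the closed-form density and reliability function derived on the following slides.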

Standby Redundancy (Cont'd)
If we assume that the times to failure of the two components are exponentially distributed with parameters λ1 and λ2, then the probability density function for the failure time τ of the first component is
    f(τ) = λ1 e^(-λ1 τ),  τ ≥ 0.
Given that the first component must fail before the lifetime of the second component starts, the density of the second component's failure time t is conditional on τ:
    f(t | τ) = λ2 e^(-λ2 (t - τ)),  t > τ.
Then the joint density of the system failure time t and the switchover time τ follows from the definition of conditional probability:
    f(t, τ) = f(t | τ) f(τ) = λ1 λ2 e^(-λ1 τ) e^(-λ2 (t - τ)),  0 ≤ τ < t.
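As a quick sanity check, the joint density above should integrate to 1 over the region 0 ≤ τ < t < ∞. The numerical sketch below verifies this; the rates are arbitrary placeholder values, and the infinite t range is truncated at 200 hours.

# Sanity check: the joint density f(t, tau) integrates to 1.
from math import exp
from scipy.integrate import dblquad

lam1, lam2 = 1.0, 0.5                # placeholder rates for the check

def f_joint(tau, t):                 # dblquad integrates the first argument (tau) innermost
    return lam1 * lam2 * exp(-lam1 * tau) * exp(-lam2 * (t - tau))

total, _ = dblquad(f_joint, 0, 200, lambda t: 0, lambda t: t)  # t in (0, 200), tau in (0, t)
print(round(total, 4))               # -> 1.0, up to truncation of the t range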

Standby Redundancy (Cont'd)
The associated marginal density function of the system failure time t is obtained by integrating the joint density over τ:
    f(t) = ∫_0^t λ1 λ2 e^(-λ1 τ) e^(-λ2 (t - τ)) dτ.
So, for λ1 ≠ λ2, the density of the system failure time is
    f(t) = (λ1 λ2 / (λ1 - λ2)) (e^(-λ2 t) - e^(-λ1 t)),  t ≥ 0,
and the reliability function is
    R(t) = ∫_t^∞ f(s) ds = (λ1 e^(-λ2 t) - λ2 e^(-λ1 t)) / (λ1 - λ2).
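These closed forms can be checked symbolically. The short sketch below confirms that integrating the reliability function from 0 to infinity gives the mean time to failure 1/λ1 + 1/λ2, as expected for a standby pair; lam1 and lam2 are symbolic stand-ins for λ1 and λ2.

# Symbolic check: MTTF = integral of R(t) over [0, inf) = 1/lam1 + 1/lam2.
import sympy as sp

t = sp.symbols('t', nonnegative=True)
lam1, lam2 = sp.symbols('lam1 lam2', positive=True)

R = (lam1 * sp.exp(-lam2 * t) - lam2 * sp.exp(-lam1 * t)) / (lam1 - lam2)

mttf = sp.integrate(R, (t, 0, sp.oo))
print(sp.simplify(mttf - (1 / lam1 + 1 / lam2)))   # -> 0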

Standby Redundancy (Cont’d)

Joint Distribution Functions
So far we have concerned ourselves with the probability distribution of a single random variable, but we are often interested in probability statements concerning two or more random variables. Define, for any two random variables X and Y, the joint cumulative probability distribution function of X and Y by
    F(x, y) = P(X ≤ x, Y ≤ y),  -∞ < x, y < ∞.
The distribution of X (the marginal distribution) can be obtained from the joint distribution of X and Y as follows:
    F_X(x) = P(X ≤ x) = P(X ≤ x, Y < ∞) = F(x, ∞).
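As an empirical illustration of the marginal relation, the sketch below estimates a joint CDF from samples of two dependent random variables (an arbitrary choice of distributions) and checks that evaluating it at a very large y recovers the marginal CDF of X.

# Empirical check that F_X(x) = F(x, infinity).
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
x_s = rng.normal(size=n)            # X ~ N(0, 1), an arbitrary choice
y_s = x_s + rng.normal(size=n)      # Y depends on X

def F(x, y):                        # empirical joint CDF P(X <= x, Y <= y)
    return np.mean((x_s <= x) & (y_s <= y))

print(F(0.5, 1e9))                  # ~ P(X <= 0.5) = 0.69
print(np.mean(x_s <= 0.5))          # empirical marginal F_X(0.5); should agree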

Joint Distribution Functions (Cont'd)
Similarly, the marginal distribution of Y is
    F_Y(y) = P(Y ≤ y) = F(∞, y).
When X and Y are both discrete random variables, it is convenient to define the joint probability mass function of X and Y by
    p(x, y) = P(X = x, Y = y).
The probability mass function of X is recovered by summing over the values of Y:
    p_X(x) = P(X = x) = Σ_{y: p(x,y) > 0} p(x, y).
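A small numerical sketch of these definitions: store the joint pmf as a table and obtain the marginal pmfs as row and column sums. The table entries below are made-up values for illustration.

# Marginals of a discrete joint pmf, stored as a table.
import numpy as np

# Rows index values of X, columns index values of Y; entries are P(X = x, Y = y).
p_xy = np.array([[0.10, 0.20, 0.10],
                 [0.05, 0.25, 0.30]])
assert np.isclose(p_xy.sum(), 1.0)   # a valid joint pmf sums to 1

p_x = p_xy.sum(axis=1)               # marginal pmf of X: sum over y
p_y = p_xy.sum(axis=0)               # marginal pmf of Y: sum over x
print(p_x)                           # [0.4 0.6]
print(p_y)                           # [0.15 0.45 0.4]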

Joint Distribution Functions (Cont'd)
We say that X and Y are jointly continuous if there exists a function f(x, y), defined for all real x and y, such that for every set C of pairs of real numbers
    P((X, Y) ∈ C) = ∬_{(x,y) ∈ C} f(x, y) dx dy.
The function f(x, y) is called the joint probability density function of X and Y. The probability density function of X is
    f_X(x) = ∫_{-∞}^{∞} f(x, y) dy,
because P(X ∈ A) = P(X ∈ A, Y ∈ (-∞, ∞)) = ∫_A ∫_{-∞}^{∞} f(x, y) dy dx. By the same argument, the probability density function of Y is
    f_Y(y) = ∫_{-∞}^{∞} f(x, y) dx.
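For a concrete continuous example, take the valid joint density f(x, y) = x + y on the unit square (chosen only for illustration); the sketch below integrates out y numerically and matches the analytical marginal f_X(x) = x + 1/2.

# Marginal density of X by numerically integrating out y.
from scipy.integrate import quad

def f_joint(x, y):
    # Joint pdf f(x, y) = x + y on 0 <= x, y <= 1; integrates to 1.
    return x + y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def f_X(x):
    val, _ = quad(lambda y: f_joint(x, y), 0, 1)   # integrate over y
    return val

print(f_X(0.3))                      # -> 0.8, i.e. x + 1/2 at x = 0.3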

Joint Distribution Functions (Cont'd)
Proposition: if X and Y are random variables and g is a function of two variables, then
    E[g(X, Y)] = Σ_y Σ_x g(x, y) p(x, y)                          in the discrete case,
    E[g(X, Y)] = ∫_{-∞}^{∞} ∫_{-∞}^{∞} g(x, y) f(x, y) dx dy      in the continuous case.
For example, if g(X, Y) = X + Y, then, in the continuous case,
    E[X + Y] = ∬ (x + y) f(x, y) dx dy = ∬ x f(x, y) dx dy + ∬ y f(x, y) dx dy = E[X] + E[Y],

Joint Distribution Functions (Cont'd)
where the first integral is evaluated using the foregoing proposition with g(x, y) = x and the second with g(x, y) = y. In the discrete case, the same argument with sums gives E[X + Y] = E[X] + E[Y].
Joint probability distributions may also be defined for n random variables. If X_1, X_2, ..., X_n are n random variables, then for any n constants a_1, a_2, ..., a_n the joint cumulative distribution function is
    F(a_1, a_2, ..., a_n) = P(X_1 ≤ a_1, X_2 ≤ a_2, ..., X_n ≤ a_n).
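As a numerical illustration of the proposition with g(x, y) = x + y, the sketch below samples a pair of dependent random variables (an arbitrary construction) and checks that the sample mean of X + Y matches E[X] + E[Y].

# E[X + Y] = E[X] + E[Y], even when X and Y are dependent.
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
x = rng.exponential(2.0, n)             # E[X] = 2 (arbitrary choice)
y = 0.5 * x + rng.normal(3.0, 1.0, n)   # Y depends on X; E[Y] = 0.5*2 + 3 = 4

print(np.mean(x + y))                   # ~ 6
print(np.mean(x) + np.mean(y))          # ~ 6, matching E[X] + E[Y]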

Example 1
A batch of 1M RAM chips is purchased from two different semiconductor houses. Let X and Y denote the times to failure of the chips purchased from the two suppliers. The joint probability density of X and Y is estimated by:
Assume failure rates of λ_X per hour and λ_Y per hour. Determine the probability that the time to failure is greater for chips characterized by X than for chips characterized by Y.

Example 1 (Cont’d)
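A sketch of the solution, under the assumption (not confirmed by the transcript) that X and Y are independent and exponentially distributed with rates λ_X and λ_Y: the joint density would be f(x, y) = λ_X λ_Y e^(-λ_X x - λ_Y y), and
    P(X > Y) = ∫_0^∞ ∫_y^∞ λ_X λ_Y e^(-λ_X x - λ_Y y) dx dy = ∫_0^∞ λ_Y e^(-(λ_X + λ_Y) y) dy = λ_Y / (λ_X + λ_Y).
The simulation below checks this closed form for illustrative rate values, not the values used in the lecture.

# Monte Carlo check of P(X > Y) under assumed independent exponential lifetimes.
import numpy as np

rng = np.random.default_rng(3)
lam_x, lam_y = 1 / 20_000, 1 / 30_000   # hypothetical failure rates, per hour
n = 1_000_000

x = rng.exponential(1 / lam_x, n)       # times to failure for supplier X's chips
y = rng.exponential(1 / lam_y, n)       # times to failure for supplier Y's chips

print(np.mean(x > y))                   # estimated P(X > Y)
print(lam_y / (lam_x + lam_y))          # closed form = 0.4 for these rates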