IE 360: Design and Control of Industrial Systems I

Similar presentations
Lecture 2 Today: Statistical Review cont’d:
Lecture note 6 Continuous Random Variables and Probability distribution.
Independence of random variables
Chapter 2: Probability Random Variable (r.v.) is a variable whose value is unknown until it is observed. The value of a random variable results from an.
Probability theory 2010 Main topics in the course on probability theory  Multivariate random variables  Conditional distributions  Transforms  Order.
Class notes for ISE 201 San Jose State University
Probability theory 2011 Main topics in the course on probability theory  The concept of probability – Repetition of basic skills  Multivariate random.
Section 5.3 Suppose X 1, X 2, …, X n are independent random variables (which may be either of the discrete type or of the continuous type) each from the.
Sections 4.1, 4.2, 4.3 Important Definitions in the Text:
SIMPLE LINEAR REGRESSION
Continuous Random Variables and Probability Distributions
Chapter 4: Joint and Conditional Distributions
The joint probability distribution function of X and Y is denoted by f XY (x,y). The marginal probability distribution function of X, f X (x) is obtained.
1A.1 Copyright© 1977 John Wiley & Son, Inc. All rights reserved Review Some Basic Statistical Concepts Appendix 1A.
Random Variable and Probability Distribution
5-1 Two Discrete Random Variables Example Two Discrete Random Variables Figure 5-1 Joint probability distribution of X and Y in Example 5-1.
Joint Probability distribution
Joint Probability Distributions
NIPRL Chapter 2. Random Variables 2.1 Discrete Random Variables 2.2 Continuous Random Variables 2.3 The Expectation of a Random Variable 2.4 The Variance.
Lecture 28 Dr. MUMTAZ AHMED MTH 161: Introduction To Statistics.
Definition of Covariance The covariance of X & Y, denoted Cov(X,Y), is the number where  X = E(X) and  Y = E(Y). Computational Formula:
4.2 Variances of random variables. A useful further characteristic to consider is the degree of dispersion in the distribution, i.e. the spread of the.
Variance and Covariance
Chapter 4 DeGroot & Schervish. Variance Although the mean of a distribution is a useful summary, it does not convey very much information about the distribution.
Statistics for Business & Economics
1 Regression & Correlation (1) 1.A relationship between 2 variables X and Y 2.The relationship seen as a straight line 3.Two problems 4.How can we tell.
Conditional Probability Mass Function. Introduction P[A|B] is the probability of an event A, giving that we know that some other event B has occurred.
Measures of Association: Pairwise Correlation
Chapter 5 Joint Probability Distributions The adventure continues as we consider two or more random variables all at the same time. Chapter 5B Discrete.
Math 4030 – 6a Joint Distributions (Discrete)
Lecture 3 1 Recap Random variables Continuous random variable Sample space has infinitely many elements The density function f(x) is a continuous function.
Section 10.5 Let X be any random variable with (finite) mean  and (finite) variance  2. We shall assume X is a continuous type random variable with p.d.f.
The Erik Jonsson School of Engineering and Computer Science Chapter 4 pp William J. Pervin The University of Texas at Dallas Richardson, Texas.
Virtual University of Pakistan Lecture No. 26 Statistics and Probability Miss Saleha Naghmi Habibullah.
1 Two Discrete Random Variables The probability mass function (pmf) of a single discrete rv X specifies how much probability mass is placed on each possible.
Copyright © Cengage Learning. All rights reserved. 5 Joint Probability Distributions and Random Samples.
Chapter 5 Joint Probability Distributions and Random Samples  Jointly Distributed Random Variables.2 - Expected Values, Covariance, and Correlation.3.
1 Chapter 4 Mathematical Expectation  4.1 Mean of Random Variables  4.2 Variance and Covariance  4.3 Means and Variances of Linear Combinations of Random.
Chapter 5 Joint Probability Distributions and Random Samples
Expected values, covariance, and correlation
Inequalities, Covariance, examples
Statistics Lecture 19.
Variance and Covariance
Chapter 4 Using Probability and Probability Distributions
Cumulative distribution functions and expected values
Main topics in the course on probability theory
PRODUCT MOMENTS OF BIVARIATE RANDOM VARIABLES
Two Discrete Random Variables
Keller: Stats for Mgmt & Econ, 7th Ed
Chapter 4: Mathematical Expectation:
Review of Probability Concepts
MEGN 537 – Probabilistic Biomechanics Ch.3 – Quantifying Uncertainty
Multinomial Distribution
Random Variable X, with pmf p(x) or pdf f(x)
How accurately can you (1) predict Y from X, and (2) predict X from Y?
EMIS 7300 SYSTEMS ANALYSIS METHODS FALL 2005
CS723 - Probability and Stochastic Processes
Karl’s Pearson Correlation
X is Uniform(0, 1). What is the PDF of Y = √X?
Handout Ch 4 實習.
Chapter 5 Applied Statistics and Probability for Engineers
Chapter 2. Random Variables
Lectures prepared by: Elchanan Mossel Yelena Shvets
Further Topics on Random Variables: Covariance and Correlation
IE 360: Design and Control of Industrial Systems I
Mathematical Expectation
Presentation transcript:

IE 360: Design and Control of Industrial Systems I
Lecture 10: Joint Random Variables, Part 2
References: Montgomery and Runger, Section 5-2
Copyright © 2010 by Joel Greenstein

Expected Value of Joint RVs
The expected value for joint rvs follows the same ideas as "regular" (aka "univariate", i.e., one-variable) rvs, but you need to use the joint pmf/pdf.
Let X and Y be discrete random variables with joint pmf f(x, y). The expected value of a function of X and Y, denoted g(X, Y), is
E[g(X, Y)] = Σx Σy g(x, y) f(x, y)
Let X and Y be continuous random variables with joint pdf f(x, y). The expected value of a function of X and Y, denoted g(X, Y), is
E[g(X, Y)] = ∫∫ g(x, y) f(x, y) dx dy, integrating over the joint support
g(X, Y) could be something like XY, or X/Y (X divided by Y), or another nonlinear function, or even just X or 3Y + 2. The rules of expectation still apply in this setting.
If two rvs are independent (their joint pmf/pdf equals the product of the marginals), then E[XY] = E[X]E[Y].
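
To make the discrete formula concrete, here is a minimal Python sketch; the joint pmf values below are made up for illustration and are not from the course.

# E[g(X, Y)] = sum over all (x, y) of g(x, y) * f(x, y)
# Hypothetical joint pmf stored as a dict mapping (x, y) -> f(x, y); values sum to 1
joint_pmf = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def expected_value(g, pmf):
    # Sum g(x, y) * f(x, y) over the entire support
    return sum(g(x, y) * p for (x, y), p in pmf.items())

print(expected_value(lambda x, y: x, joint_pmf))          # E[X]
print(expected_value(lambda x, y: x * y, joint_pmf))      # E[XY]
print(expected_value(lambda x, y: 3 * y + 2, joint_pmf))  # E[3Y + 2]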

Discrete Joint RV Expectation Example
I am notoriously clumsy. Let X be the number of times I drop the chalk and Y be the number of times I drop the eraser in a one-hour class. Observant students in past classes have developed the following joint pmf to describe how often I drop the chalk and the eraser when I teach.
(The slide shows a joint pmf table with columns X = 0, 1, 2, 3 and rows Y = 0, 1, 2, 3; only a few of its entries, 0.1, 0.3, 0.26, 0.03, 0.02, and 0.01, survive in this transcript.)
What is the expected number of times I drop the chalk? This is E[X], so we have g(X, Y) = X. It is really the same as computing the expected value of X using the marginal distribution of X, since summing the joint pmf over the Y-values gives the marginal pmf of X. A sketch of both routes follows below.
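
Here is a Python sketch of both routes to E[X]. Since the slide's full table is not recoverable from this transcript, the pmf below is a stand-in that merely sums to 1.

# Stand-in joint pmf as a dict mapping (x, y) -> f(x, y)
joint_pmf = {(0, 0): 0.10, (1, 0): 0.30, (2, 0): 0.26,
             (0, 1): 0.03, (1, 1): 0.15, (2, 1): 0.08,
             (0, 2): 0.02, (1, 2): 0.05, (0, 3): 0.01}
assert abs(sum(joint_pmf.values()) - 1.0) < 1e-9

# Marginal pmf of X: sum the joint pmf over all values of y
marginal_x = {}
for (x, y), p in joint_pmf.items():
    marginal_x[x] = marginal_x.get(x, 0.0) + p

e_x_joint = sum(x * p for (x, y), p in joint_pmf.items())  # g(X, Y) = X on the joint pmf
e_x_marginal = sum(x * p for x, p in marginal_x.items())   # E[X] from the marginal
print(e_x_joint, e_x_marginal)                             # the two agree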

Continuous Joint RVs Example
Consider joint continuous random variables X and Y with pdf f(x, y) = e^(−x−y), x > 0, y > 0.
Find E[X]; we have g(X, Y) = X. More integration by parts!
E[X] = ∫₀^∞ ∫₀^∞ x e^(−x−y) dy dx = ∫₀^∞ x e^(−x) dx = 1
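
As a sanity check, this short sketch evaluates the double integral symbolically with sympy (a tool assumed here, not part of the course materials):

import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = sp.exp(-x - y)  # joint pdf on x > 0, y > 0

# E[X]: integrate x * f(x, y) over y first, then over x
e_x = sp.integrate(x * f, (y, 0, sp.oo), (x, 0, sp.oo))
print(e_x)  # 1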

Continuous Joint RVs Example 2
Consider joint continuous random variables X and Y with pdf f(x, y) = e^(−x−y), x > 0, y > 0.
Find E[Y]; we have g(X, Y) = Y. This is the same as E[X] with X and Y swapped, so E[Y] = 1.
Find E[XY]; we have g(X, Y) = XY. Yikes, a double dose of integration by parts!
E[XY] = ∫₀^∞ ∫₀^∞ xy e^(−x−y) dy dx = (∫₀^∞ x e^(−x) dx)(∫₀^∞ y e^(−y) dy) = (1)(1) = 1
(In fact, the pdf factors as e^(−x) · e^(−y), so X and Y are independent and E[XY] = E[X]E[Y].)
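
The same symbolic check (again a sympy sketch, with sympy an assumed tool) confirms that E[XY] equals E[X]E[Y], as it must since the pdf factors:

import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = sp.exp(-x - y)  # joint pdf on x > 0, y > 0

e_xy = sp.integrate(x * y * f, (y, 0, sp.oo), (x, 0, sp.oo))
e_x = sp.integrate(x * f, (y, 0, sp.oo), (x, 0, sp.oo))
e_y = sp.integrate(y * f, (y, 0, sp.oo), (x, 0, sp.oo))
print(e_xy, e_x * e_y)  # both print 1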

Covariance
If two rvs are related through a joint pmf/pdf, we may ask: if one goes up, does the other go down? Do large changes in one correspond to large changes in the other? These questions are answered by the covariance, Cov(X, Y) = σXY, which describes the nature of the linear relationship between two rvs. The same idea applies to discrete and continuous joint rvs.
Definition: Cov(X, Y) = σXY = E[(X − μX)(Y − μY)], which may be positive or negative!!!
Easier computation: Cov(X, Y) = σXY = E[XY] − μXμY
Interpreting the covariance
If two rvs are independent, then Cov(X, Y) = 0. The reverse is not true: if the covariance between two random variables is zero, we cannot conclude that they are independent. Independence is a property of the pmf/pdf relationship described in the previous lecture.
An important factor is the sign of the covariance. Positive indicates that an increase in one rv is associated with an increase in the other; negative indicates that an increase in one rv is associated with a decrease in the other. The simulation sketch below illustrates both points.
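
A small simulation sketch (using numpy, an assumed tool) illustrates the sign interpretation and the computational formula:

import numpy as np

rng = np.random.default_rng(0)
n = 100_000

x = rng.normal(size=n)
z = rng.normal(size=n)        # generated independently of x
print(np.cov(x, z)[0, 1])     # near 0: independent rvs have zero covariance

y = 2 * x + rng.normal(size=n)  # y tends to rise when x rises
print(np.cov(x, y)[0, 1])       # near 2: positive covariance

# The computational formula E[XY] - E[X]E[Y] gives essentially the same number
# (np.cov divides by n - 1 rather than n, a negligible difference here)
print(np.mean(x * y) - np.mean(x) * np.mean(y))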

Covariance Example
If E[X] = 1, E[Y] = −1, and E[XY] = 4, what is the covariance of X and Y?
Cov(X, Y) = E[XY] − μXμY = E[XY] − E[X]E[Y] = 4 − (1)(−1) = 5
X and Y have positive covariance.
If E[W] = 10, E[Z] = −10, and E[WZ] = 400, what is the covariance of W and Z?
Cov(W, Z) = E[WZ] − μWμZ = E[WZ] − E[W]E[Z] = 400 − (10)(−10) = 500
W and Z have positive covariance.
Is the relationship between X and Y very different from the relationship between W and Z? Maybe, maybe not. Covariance is not unitless, so it is not meaningful to compare the covariances of different pairs of rvs.
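
The unit-dependence is easy to see in a sketch (numpy assumed again): rescaling the same data inflates the covariance but leaves the unitless correlation of the next slide untouched.

import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=50_000)
y = x + rng.normal(size=50_000)

print(np.cov(x, y)[0, 1])            # covariance in the original units
print(np.cov(10 * x, 10 * y)[0, 1])  # 100 times larger after rescaling both rvs
print(np.corrcoef(x, y)[0, 1], np.corrcoef(10 * x, 10 * y)[0, 1])  # identical correlations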

Correlation
The correlation, a unitless extension of the covariance, can also be interpreted as a description of the linear relationship between two random variables.
Let X and Y be rvs with Cov(X, Y) = σXY and standard deviations σX and σY. The correlation, ρXY (pronounced "rho of X and Y"), is defined as follows:
ρXY = Cov(X, Y) / (σX σY) = σXY / (σX σY)
Now we have a unitless number that ranges from −1 to +1: −1 ≤ ρXY ≤ 1.
If the correlation is −1, we have perfectly negative correlation; if it is +1, we have perfectly positive correlation; if it is 0, we have no linear correlation.
(The slide shows four scatterplots of y versus x illustrating ρXY = −1, ρXY = +1, and two different patterns with ρXY = 0.)
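
The fourth scatterplot's point, that zero correlation does not mean no relationship, can be reproduced in a short sketch (numpy assumed):

import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-1, 1, size=100_000)
y = x ** 2  # y is completely determined by x, but not linearly

print(np.corrcoef(x, y)[0, 1])  # near 0: correlation only detects linear association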

Correlation Example
Assume E[X] = 1/2, E[Y] = 1/3, E[XY] = 1/4, Var[X] = 1/4, and Var[Y] = 1/12. What is the correlation of X and Y?
First, compute the covariance:
Cov(X, Y) = E[XY] − μXμY = E[XY] − E[X]E[Y] = 1/4 − (1/2)(1/3) = 1/12
Now, compute the correlation:
ρXY = Cov(X, Y) / (σX σY) = (1/12) / (√(1/4) √(1/12)) = (1/12) / (1/(4√3)) = √3/3 ≈ 0.577
X and Y are somewhat positively correlated.
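
A quick numeric check of the arithmetic in Python:

from math import sqrt

e_x, e_y, e_xy = 0.5, 1/3, 0.25
var_x, var_y = 0.25, 1/12

cov = e_xy - e_x * e_y                   # 1/12
rho = cov / (sqrt(var_x) * sqrt(var_y))  # sqrt(3)/3
print(cov, rho)                          # 0.0833..., 0.5773...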

Related reading
Montgomery and Runger, Section 5-2. Note: the captions of Figures 5-14, 5-15, and 5-16 should refer to Examples 5-21, 5-22, and 5-23, respectively.
Some links that might help (maybe): nice pictures of correlations are at http://en.wikipedia.org/wiki/Correlation and http://cnx.org/content/m11248/latest/
Now you are ready to do HW 10.