Multivariate Distributions

Multivariate Distributions CIVL 7012/8012

Multivariate Distributions Engineers are often interested in more than one measurement from a single item. Multivariate distributions describe the probability of events defined in terms of multiple random variables.

Joint Probability Distributions Some random variables are not independent of each other, i.e., they tend to be related. Urban atmospheric ozone and airborne particulate matter tend to vary together. Urban vehicle speeds and fuel consumption rates tend to vary inversely. A joint probability distribution will describe the behavior of several random variables, say, X and Y. The graph of the distribution is 3-dimensional: x, y, and f(x,y).

Let X denote the number of bars of signal strength on your cell phone. You use your cell phone to check your airline reservation. The airline system requires that you speak the name of your departure city to the voice recognition system. Let Y denote the number of times that you have to state your departure city. Figure 5-1 Joint probability distribution of X and Y. The table cells are the probabilities. Observe that more bars relate to less repeating.

Joint Probability Distributions
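For reference, the joint probability mass function of two discrete random variables X and Y is commonly defined as fXY(x, y) = P(X = x, Y = y), with fXY(x, y) ≥ 0 for every pair (x, y) and Σx Σy fXY(x, y) = 1.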

Joint Probability Density Function Defined The joint probability density function for the continuous random variables X and Y, denoted as fXY(x,y), satisfies the following properties: Figure 5-2 Joint probability density function for the random variables X and Y. Probability that (X, Y) is in the region R is determined by the volume of fXY(x,y) over the region R.
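The properties referred to above are the standard ones for a joint density: (1) fXY(x, y) ≥ 0 for all x and y; (2) the total volume under fXY is 1, that is, ∫∫ fXY(x, y) dx dy = 1 over the whole plane; and (3) P[(X, Y) ∈ R] = ∫∫R fXY(x, y) dx dy for any region R of the plane.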

Figure 5-3 Joint probability density function for the continuous random variables X and Y, which are two different dimensions of an injection-molded part. Note the asymmetric, narrow ridge shape of the PDF, indicating that small values in the X dimension are more likely to occur when small values in the Y dimension occur.

Marginal Probability Distributions The individual probability distribution of a random variable is referred to as its marginal probability distribution. In general, the marginal probability distribution of X can be determined from the joint probability distribution of X and other random variables. For example, to determine P(X = x), we sum P(X = x, Y = y) over all points in the range of (X, Y) for which X = x. Subscripts on the probability mass functions distinguish between the random variables.

Marginal Probability Distributions (discrete) For a discrete joint PMF, there are marginal distributions for each random variable, formed by summing the joint PMF over the other variable. Figure 5-6 From the prior example, the joint PMF is shown in green while the two marginal PMFs are shown in blue.
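In symbols, the marginal PMFs are obtained by summing out the other variable: fX(x) = Σy fXY(x, y) and fY(y) = Σx fXY(x, y).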

Marginal Probability Distributions (continuous) Rather than summing, as for a discrete joint PMF, we integrate a continuous joint PDF. The marginal PDFs are used to make probability statements about one variable. If the joint probability density function of random variables X and Y is fXY(x,y), the marginal probability density functions of X and Y are:
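fX(x) = ∫ fXY(x, y) dy (integrating over all y) and fY(y) = ∫ fXY(x, y) dx (integrating over all x).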

Example

Example

Mean & Variance of a Marginal Distribution Means E(X) and E(Y) are calculated from the discrete and continuous marginal distributions.
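In symbols, with fX and fY denoting the marginal distributions: E(X) = Σx x fX(x) in the discrete case, or ∫ x fX(x) dx in the continuous case, and V(X) = E(X²) − [E(X)]²; E(Y) and V(Y) are defined the same way from fY(y).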

Example E(X) = 2.35; V(X) = E(X²) − [E(X)]² = 6.15 − 2.35² = 6.15 − 5.5225 = 0.6275. E(Y) = 2.49; V(Y) = E(Y²) − [E(Y)]² = 7.61 − 2.49² = 7.61 − 6.2001 = 1.4099.
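A short Python sketch of this kind of calculation, using a hypothetical marginal PMF (the values below are illustrative only, not the ones behind the numbers above), might look like:

    # Hypothetical marginal PMF of X: values and their probabilities.
    x_vals = [0, 1, 2, 3]
    p_x    = [0.1, 0.2, 0.3, 0.4]   # probabilities must sum to 1

    mean_x  = sum(x * p for x, p in zip(x_vals, p_x))       # E(X)
    mean_x2 = sum(x**2 * p for x, p in zip(x_vals, p_x))    # E(X^2)
    var_x   = mean_x2 - mean_x**2                           # V(X) = E(X^2) - [E(X)]^2

    print(mean_x, var_x)   # approximately 2.0 and 1.0 for these illustrative values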

Conditional Probability Distributions
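For reference, the conditional distribution of Y given X = x is built from the joint and marginal distributions as fY|x(y) = fXY(x, y) / fX(x), for fX(x) > 0; the same form applies to both conditional PMFs and conditional PDFs.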

Conditional discrete PMFs can be shown as tables.

Conditional Probability Distributions
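The conditional mean and variance follow the same pattern as ordinary means and variances, with the conditional distribution used in place of the marginal: E(Y|x) = Σy y fY|x(y) and V(Y|x) = Σy y² fY|x(y) − [E(Y|x)]², with sums replaced by integrals in the continuous case.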

Covariance and Correlation Coefficient
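For reference, the defining formulas are Cov(X, Y) = E[(X − μX)(Y − μY)] = E(XY) − μX μY and ρXY = Cov(X, Y) / (σX σY); the correlation coefficient always lies between −1 and +1.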

Covariance and Correlation Coefficient The probability distribution of Example 5-1 is shown. By inspection, note that the larger probabilities occur as X and Y move in opposite directions. This indicates a negative covariance.

Covariance and Correlation Coefficient
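As a computational sketch, covariance and correlation can be obtained directly from a joint PMF table; the table in the Python code below is hypothetical, chosen only to illustrate the calculation:

    # Hypothetical joint PMF of (X, Y); rows index x values, columns index y values.
    x_vals = [1, 2, 3]
    y_vals = [10, 20, 30]
    p_xy = [
        [0.20, 0.10, 0.00],   # P(X=1, Y=y)
        [0.10, 0.20, 0.10],   # P(X=2, Y=y)
        [0.00, 0.10, 0.20],   # P(X=3, Y=y)
    ]                         # entries sum to 1

    # Marginal PMFs by summing over the other variable.
    p_x = [sum(row) for row in p_xy]
    p_y = [sum(p_xy[i][j] for i in range(len(x_vals))) for j in range(len(y_vals))]

    mean_x = sum(x * p for x, p in zip(x_vals, p_x))
    mean_y = sum(y * p for y, p in zip(y_vals, p_y))
    var_x = sum(x**2 * p for x, p in zip(x_vals, p_x)) - mean_x**2
    var_y = sum(y**2 * p for y, p in zip(y_vals, p_y)) - mean_y**2

    # Cov(X, Y) = E(XY) - E(X)E(Y), then rho = Cov / (sigma_x * sigma_y)
    e_xy = sum(x_vals[i] * y_vals[j] * p_xy[i][j]
               for i in range(len(x_vals)) for j in range(len(y_vals)))
    cov_xy = e_xy - mean_x * mean_y
    rho = cov_xy / (var_x**0.5 * var_y**0.5)

    print(cov_xy, rho)   # positive here, since large X tends to occur with large Y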

Example The joint PMF of precipitation, X (in.), and runoff, Y (cfs), discretized here for simplicity, due to storms at a given location is shown in the table below.
a. What is the probability that the next storm will bring a precipitation of 2 or more inches and a runoff of more than 20 cfs?
b. After a storm, the rain gauge indicates a precipitation of 2 in. What is the probability that the runoff in this storm is 20 cfs or more?
c. Are X and Y statistically independent? (See the note following the table.)
d. Determine and plot the marginal PMF of runoff.
e. Determine and plot the PMF of runoff for a storm whose precipitation is 2 in.
f. Determine the correlation coefficient between precipitation and runoff.

         X = 1   X = 2   X = 3
Y = 10   0.05    0.15    0.0
Y = 20   0.10    0.25     ?
Y = 30     ?       ?      ?
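For part (c), recall that X and Y are statistically independent only if the joint PMF factors into the product of the marginals for every cell, pXY(x, y) = pX(x) pY(y) for all (x, y); a single cell that violates this equality is enough to conclude dependence.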

Example The daily water levels (normalized to the respective full condition) of two reservoirs, A and B, are denoted by the random variables X and Y, which have the following joint PDF.
a. Determine the marginal density function of the daily water level of Reservoir A.
b. If Reservoir A is half full on a given day, what is the probability that the water level in Reservoir B is more than half full?
c. Is there any statistical correlation between the water levels in the two reservoirs?
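A sketch of the quantities needed here, stated generally since the joint PDF itself is not reproduced above: the marginal density of the level of Reservoir A is fX(x) = ∫ fXY(x, y) dy; the conditional probability in part (b) is P(Y > 0.5 | X = 0.5) = ∫ fY|x(y | 0.5) dy taken over y from 0.5 to 1, with fY|x(y | x) = fXY(x, y) / fX(x); and the correlation in part (c) is ρXY = Cov(X, Y) / (σX σY), which is zero when the two levels vary independently.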