Joint Probability Distributions


Joint Probability Distributions: Outline
- Two Discrete/Continuous Random Variables: Joint Probability Distributions, Marginal Probability Distributions, Conditional Probability Distributions, Independence
- Multiple Discrete/Continuous Random Variables
- Multinomial Probability Distribution
- Covariance and Correlation
- Bivariate Normal Distribution
- Linear Combination of Random Variables

Joint Probability Distributions In general, if X and Y are two random variables, the probability distribution that defines their simultaneous behavior is called a joint probability distribution. For example: X: the length of one dimension of an injection-molded part, and Y: the length of another dimension. We might be interested in P(2.95 ≤ X ≤ 3.05 and 7.60 ≤ Y ≤ 7.80).

Two Discrete Random Variables Joint Probability Distributions Marginal Probability Distributions Conditional Probability Distributions Independence

Joint Probability Distributions The joint probability distribution of two random variables is also called a bivariate probability distribution. For two discrete random variables it is usually written as a joint probability mass function, P(X = x, Y = y).

Marginal Probability Distributions Marginal Probability Distribution: the individual probability distribution of a random variable.

Marginal Probability Distributions Example: The marginal probability distributions of X and Y, where x = number of bars of signal strength and y = number of times the city name is stated. (The table was garbled in transcription; the cells below are reconstructed to match the marginals that survived.)

               x = 1   x = 2   x = 3   | marginal of Y
  y = 4        0.15    0.10    0.05    |     0.30
  y = 3        0.02    0.10    0.05    |     0.17
  y = 2        0.02    0.03    0.20    |     0.25
  y = 1        0.01    0.02    0.25    |     0.28
  marginal
  of X         0.20    0.25    0.55

For instance, P(X = 3) = 0.25 + 0.20 + 0.05 + 0.05 = 0.55.
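The marginal computation can be sketched in code. The joint pmf values below follow the reconstruction of the example's table (the slide image did not survive cleanly), so treat them as illustrative; the marginal-summing logic is the point.

```python
# Computing marginal distributions by summing a joint pmf.
# Joint probabilities reconstructed from the example's table; illustrative only.
joint = {
    (1, 1): 0.01, (2, 1): 0.02, (3, 1): 0.25,
    (1, 2): 0.02, (2, 2): 0.03, (3, 2): 0.20,
    (1, 3): 0.02, (2, 3): 0.10, (3, 3): 0.05,
    (1, 4): 0.15, (2, 4): 0.10, (3, 4): 0.05,
}

def marginal(joint, axis):
    """Sum the joint pmf over the other variable; axis 0 -> X, axis 1 -> Y."""
    out = {}
    for key, p in joint.items():
        out[key[axis]] = out.get(key[axis], 0.0) + p
    return out

fx = marginal(joint, 0)   # P(X = x) = sum over y of P(X = x, Y = y)
fy = marginal(joint, 1)   # P(Y = y) = sum over x of P(X = x, Y = y)
print(fx)                 # P(X = 3) comes out to 0.55
```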

Conditional Probability Distributions When two random variables are defined in a random experiment, knowledge of one can change the probabilities of the other.

Conditional Mean and Variance

Conditional Mean and Variance Example: From the previous example, calculate P(Y = 1 | X = 3), E(Y | X = 1), and V(Y | X = 1).
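A sketch of the conditional calculation, reusing the joint pmf values from the signal-strength example (reconstructed; treat the numbers as illustrative). Here the conditioning value x = 3 is used; any other value works the same way.

```python
# Conditional pmf: P(Y = y | X = x) = P(X = x, Y = y) / P(X = x).
# Joint pmf values follow the signal-strength example; illustrative only.
joint = {
    (1, 1): 0.01, (2, 1): 0.02, (3, 1): 0.25,
    (1, 2): 0.02, (2, 2): 0.03, (3, 2): 0.20,
    (1, 3): 0.02, (2, 3): 0.10, (3, 3): 0.05,
    (1, 4): 0.15, (2, 4): 0.10, (3, 4): 0.05,
}

def conditional_y_given_x(joint, x):
    px = sum(p for (xi, y), p in joint.items() if xi == x)   # marginal P(X = x)
    return {y: p / px for (xi, y), p in joint.items() if xi == x}

cond = conditional_y_given_x(joint, 3)               # P(Y = y | X = 3)
ey = sum(y * p for y, p in cond.items())             # E(Y | X = 3)
vy = sum(y * y * p for y, p in cond.items()) - ey ** 2   # V(Y | X = 3)
print(cond[1], ey, vy)
```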

Independence In some random experiments, knowledge of the values of X does not change any of the probabilities associated with the values for Y. Two discrete random variables X and Y are independent if and only if P(X = x, Y = y) = P(X = x) P(Y = y) for all pairs (x, y).
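Independence can be checked cell by cell against the product of the marginals. The joint pmf below is hypothetical, built as a product of marginals (0.3, 0.7) and (0.4, 0.6) so the check succeeds.

```python
# X and Y are independent iff P(X=x, Y=y) = P(X=x) * P(Y=y) for every cell.
# Hypothetical pmf, constructed as a product of marginals for illustration.
joint = {(0, 0): 0.12, (0, 1): 0.18, (1, 0): 0.28, (1, 1): 0.42}

fx, fy = {}, {}
for (x, y), p in joint.items():
    fx[x] = fx.get(x, 0.0) + p   # marginal of X
    fy[y] = fy.get(y, 0.0) + p   # marginal of Y

independent = all(abs(p - fx[x] * fy[y]) < 1e-12
                  for (x, y), p in joint.items())
print(independent)  # True for this pmf
```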

Multiple Discrete Random Variables Joint Probability Distributions Multinomial Probability Distribution

Joint Probability Distributions In some cases, more than two random variables are defined in a random experiment. The marginal probability mass function of any one variable is then obtained by summing the joint probability mass function over all values of the remaining variables.

Joint Probability Distributions Mean and Variance

Joint Probability Distributions Conditional Probability Distributions Independence

Multinomial Probability Distribution A quite useful joint probability distribution for multiple discrete random variables: it extends the binomial distribution to experiments with more than two outcome classes.

Multinomial Probability Distribution Example: Of the 20 bits received, what is the probability that 14 are Excellent, 3 are Good, 2 are Fair, and 1 is Poor? Assume that the classifications of individual bits are independent events and that the probabilities of E, G, F, and P are 0.6, 0.3, 0.08, and 0.02, respectively. One sequence of 20 bits that produces the specified numbers of bits in each class can be represented as: EEEEEEEEEEEEEEGGGFFP, with P(EEEEEEEEEEEEEEGGGFFP) = 0.6^14 × 0.3^3 × 0.08^2 × 0.02^1. The number of such sequences (permutations of similar objects) is 20!/(14! 3! 2! 1!) = 2,325,600, so the requested probability is 2,325,600 × 0.6^14 × 0.3^3 × 0.08^2 × 0.02 ≈ 0.0063.
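The multinomial computation generalizes directly; a minimal sketch using only the standard library:

```python
from math import factorial, prod

def multinomial_pmf(counts, probs):
    """P(X1=n1, ..., Xk=nk) = n!/(n1!...nk!) * p1^n1 * ... * pk^nk."""
    coef = factorial(sum(counts))
    for c in counts:
        coef //= factorial(c)   # divide out the repeated-object permutations
    return coef * prod(p ** c for p, c in zip(probs, counts))

# The bit-classification example: 14 Excellent, 3 Good, 2 Fair, 1 Poor.
p = multinomial_pmf([14, 3, 2, 1], [0.6, 0.3, 0.08, 0.02])
print(round(p, 4))  # ≈ 0.0063
```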

Two Continuous Random Variables Joint Probability Distributions Marginal Probability Distributions Conditional Probability Distributions Independence

Joint Probability Distributions

Joint Probability Distributions Example: X: the time until a computer server connects to your machine, Y: the time until the server authorizes you as a valid user. Each of these random variables measures the wait from a common starting time, and X < Y. Assume that the joint probability density function for X and Y is the one shown on the slide. The probability that X < 1000 and Y < 2000 is then the integral of that density over the region x < 1000, y < 2000 (with x < y).
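The slide's density was lost in transcription; a common version of this server-connection example uses f(x, y) = 6×10⁻⁶ e^(−0.001x − 0.002y) for 0 < x < y. Under that assumed density, the probability can be approximated numerically:

```python
import math

# Assumed joint pdf (the slide's density did not survive transcription):
# f(x, y) = 6e-6 * exp(-0.001x - 0.002y) for 0 < x < y, else 0.
def f(x, y):
    return 6e-6 * math.exp(-0.001 * x - 0.002 * y) if 0 < x < y else 0.0

def prob(nx=500, ny=1000):
    """Midpoint Riemann sum for P(X < 1000, Y < 2000)."""
    dx, dy = 1000 / nx, 2000 / ny
    total = 0.0
    for i in range(nx):
        x = (i + 0.5) * dx
        for j in range(ny):
            total += f(x, (j + 0.5) * dy) * dx * dy
    return total

print(prob())  # numerical estimate of P(X < 1000, Y < 2000)
```

Under this assumed density the analytic answer is (1 - e^-3) - 3e^-4(1 - e^-1) ≈ 0.915, and the grid estimate lands close to it.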

Marginal Probability Distributions Just as with jointly distributed discrete random variables, we can find the marginal probability distributions of X and Y from the joint probability distribution, here by integrating out the other variable.

Marginal Probability Distributions Example: For the random variables in the previous example, calculate the probability that Y exceeds 2000 milliseconds.

Conditional Probability Distributions

Conditional Probability Distributions Example: For the random variables in the previous example, determine the conditional probability density function for Y given that X = x, and determine P(Y > 2000 | X = 1500).

Conditional Probability Distributions Mean and Variance

Conditional Probability Distributions Example: For the random variables in the previous example, determine the conditional mean for Y given that X = 1500.
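Assuming, as a stand-in for the lost slide density, the joint pdf f(x, y) = 6×10⁻⁶ e^(−0.001x − 0.002y) for 0 < x < y, dividing by the marginal f_X(x) = 0.003 e^(−0.003x) gives the conditional density f(y | x) = 0.002 e^(−0.002(y − x)) for y > x. A numerical sketch of the conditional tail probability and conditional mean:

```python
import math

# Conditional density of Y given X = x under the assumed joint pdf:
# f(y | x) = 0.002 * exp(-0.002 * (y - x)) for y > x.
def cond_pdf(y, x):
    return 0.002 * math.exp(-0.002 * (y - x)) if y > x else 0.0

def cond_tail(t, x, n=200_000, upper=20_000):
    """P(Y > t | X = x) by a midpoint sum up to a large cutoff."""
    dy = (upper - t) / n
    return sum(cond_pdf(t + (k + 0.5) * dy, x) * dy for k in range(n))

def cond_mean(x, n=200_000, upper=20_000):
    """E(Y | X = x) = integral of y * f(y | x) dy, truncated at the cutoff."""
    dy = (upper - x) / n
    return sum((x + (k + 0.5) * dy) * cond_pdf(x + (k + 0.5) * dy, x) * dy
               for k in range(n))

print(cond_tail(2000, 1500))  # ≈ exp(-1) ≈ 0.368
print(cond_mean(1500))        # ≈ 1500 + 1/0.002 = 2000
```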

Independence

Independence Example: Let the random variables X and Y denote the lengths of two dimensions of a machined part, respectively. Assume that X and Y are independent random variables, that the distribution of X is normal with mean 10.5 mm and variance 0.0025 mm², and that the distribution of Y is normal with mean 3.2 mm and variance 0.0036 mm². Determine the probability that 10.4 < X < 10.6 and 3.15 < Y < 3.25. Because X and Y are independent, this probability factors as P(10.4 < X < 10.6) × P(3.15 < Y < 3.25).
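This factored normal probability can be checked with Python's standard-library `statistics.NormalDist`:

```python
from statistics import NormalDist

# X ~ N(10.5, 0.05^2) and Y ~ N(3.2, 0.06^2), i.e. variances 0.0025 and
# 0.0036 mm^2 as in the example; independence lets the joint probability factor.
X = NormalDist(mu=10.5, sigma=0.05)
Y = NormalDist(mu=3.2, sigma=0.06)

p_x = X.cdf(10.6) - X.cdf(10.4)   # P(10.4 < X < 10.6) = P(-2 < Z < 2)
p_y = Y.cdf(3.25) - Y.cdf(3.15)   # P(3.15 < Y < 3.25)
p = p_x * p_y                     # joint probability, by independence
print(round(p, 4))                # ≈ 0.568
```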

Multiple Continuous Random Variables

Multiple Continuous Random Variables Marginal Probability

Multiple Continuous Random Variables Mean and Variance Independence

Covariance and Correlation When two or more random variables are defined on a probability space, it is useful to describe how they vary together, that is, to measure the relationship between the variables.

Covariance Covariance is a measure of the linear relationship between two random variables. It is defined through the expected value of a function of two random variables, h(X, Y): cov(X, Y) = E[(X − μX)(Y − μY)] = E(XY) − μX μY.

Covariance

Covariance

Covariance Example: For the discrete random variables X, Y with the joint distribution shown in Fig. Determine
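The figure's joint distribution did not survive transcription, so the sketch below uses a made-up 2×2 joint pmf to show the covariance (and correlation) computation:

```python
# Covariance and correlation from a joint pmf. These probabilities are
# hypothetical, chosen only to illustrate the calculation.
joint = {(1, 1): 0.2, (1, 2): 0.1, (2, 1): 0.1, (2, 2): 0.6}

def cov_corr(joint):
    ex = sum(x * p for (x, y), p in joint.items())          # E(X)
    ey = sum(y * p for (x, y), p in joint.items())          # E(Y)
    exy = sum(x * y * p for (x, y), p in joint.items())     # E(XY)
    vx = sum(x * x * p for (x, y), p in joint.items()) - ex ** 2
    vy = sum(y * y * p for (x, y), p in joint.items()) - ey ** 2
    cov = exy - ex * ey                                     # E(XY) - E(X)E(Y)
    rho = cov / (vx ** 0.5 * vy ** 0.5)                     # correlation
    return cov, rho

cov, rho = cov_corr(joint)
print(cov, rho)
```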

Correlation The correlation is a measure of the linear relationship between random variables that is easier to interpret than the covariance, because it is dimensionless and always lies between −1 and +1: ρXY = cov(X, Y)/(σX σY).

Correlation For independent random variables, cov(X, Y) = 0 and therefore ρXY = 0. (The converse does not hold in general: zero correlation does not imply independence.)

Correlation Example: For the two random variables shown on the slide, calculate the covariance and correlation between X and Y.

Bivariate Normal Distribution Correlation

Bivariate Normal Distribution Marginal distributions Dependence

Bivariate Normal Distribution Conditional probability

Bivariate Normal Distribution Ex. Suppose that the X and Y dimensions of an injection-molded part have a bivariate normal distribution with the parameters given on the slide. Find P(2.95 < X < 3.05, 7.60 < Y < 7.80).
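A Monte Carlo sketch of such a rectangle probability. The slide's parameters did not survive, so the means, standard deviations, and correlation below are assumptions made purely for illustration:

```python
import numpy as np

# Monte Carlo estimate of a bivariate normal rectangle probability.
# All parameters below are assumed for illustration (slide values were lost).
mu = np.array([3.00, 7.70])            # assumed means of X and Y
sx, sy, rho = 0.04, 0.08, 0.8          # assumed std devs and correlation
cov = np.array([[sx ** 2, rho * sx * sy],
                [rho * sx * sy, sy ** 2]])

rng = np.random.default_rng(0)
xy = rng.multivariate_normal(mu, cov, size=400_000)
inside = ((xy[:, 0] > 2.95) & (xy[:, 0] < 3.05)
          & (xy[:, 1] > 7.60) & (xy[:, 1] < 7.80))
p_hat = inside.mean()
print(p_hat)  # estimate of P(2.95 < X < 3.05, 7.60 < Y < 7.80)
```

Note that with positive correlation the joint probability exceeds the product of the two marginal interval probabilities.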

Bivariate Normal Distribution Ex. Let X and Y denote the milliliters of acid and base needed for equivalence, respectively. Assume X and Y have a bivariate normal distribution with the parameters given on the slide. Find: the covariance between X and Y; the marginal probability distribution of X; P(X < 116); the conditional distribution of X given Y = 102; and P(X < 116 | Y = 102).

Linear Combination of random variables

Linear Combination of random variables Mean and Variance

Linear Combination of random variables Ex. A semiconductor product consists of 3 layers. The variances in thickness of the first, second, and third layers are 25, 40, and 30 nm², and the layers are assumed to vary independently. What is the variance of the thickness of the final product? Let X1, X2, X3, and X be random variables that denote the thicknesses of the respective layers and of the final product. V(X) = V(X1) + V(X2) + V(X3) = 25 + 40 + 30 = 95 nm².
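A quick check of this variance calculation, simulating the layers as independent normal random variables. The layer means (100, 80, 60 nm) are arbitrary assumptions; only the variances matter for the result.

```python
import random

# Variances of the three layers, in nm^2 (from the example).
v1, v2, v3 = 25, 40, 30
var_total = v1 + v2 + v3   # independent layers: variances add
print(var_total)           # 95 nm^2

# Simulation check; the assumed means do not affect the variance of the sum.
random.seed(1)
n = 200_000
totals = [random.gauss(100, v1 ** 0.5)
          + random.gauss(80, v2 ** 0.5)
          + random.gauss(60, v3 ** 0.5) for _ in range(n)]
mean = sum(totals) / n
var_hat = sum((t - mean) ** 2 for t in totals) / (n - 1)
print(round(var_hat, 1))   # close to 95
```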

Homework
1. The time between surface finish problems in a galvanizing process is exponentially distributed with a mean of 40 hours. A single plant operates three galvanizing lines that are assumed to operate independently.
   a) What is the probability that none of the lines experiences a surface finish problem in 40 hours of operation?
   b) What is the probability that all three lines experience two surface finish problems between 20 and 40 hours after starting the operation?
2. Suppose X and Y have a bivariate normal distribution with the parameters given on the slide. Determine the following:
   a) P(2.95 < X < 3.05)
   b) P(7.60 < Y < 7.80)
   c) P(2.95 < X < 3.05, 7.60 < Y < 7.80)