Probability Theory STAT 312, Dr. Zakeia AlSaiary

Theorem 7.1 (p. 191). A real-valued function f of two variables is a joint probability density function of a pair of discrete random variables X and Y if and only if: (i) f(x, y) ≥ 0 for every pair (x, y), and (ii) the sum of f(x, y) over all pairs (x, y) equals 1.
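As a quick computational check of Theorem 7.1, the sketch below verifies both conditions for a small joint table. The table itself is hypothetical, chosen only for illustration.

```python
# Hypothetical joint pmf table for a discrete pair (X, Y),
# stored as a dict mapping (x, y) -> f(x, y).
joint = {
    (1, 1): 0.2, (1, 2): 0.3,
    (2, 1): 0.1, (2, 2): 0.4,
}

def is_joint_pmf(f):
    """Theorem 7.1: f is a joint pmf iff every value is nonnegative
    and all values sum to 1."""
    nonnegative = all(p >= 0 for p in f.values())
    sums_to_one = abs(sum(f.values()) - 1.0) < 1e-9
    return nonnegative and sums_to_one

print(is_joint_pmf(joint))  # True for this table
```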

Example 7.1 (p. 191). For what value of the constant k is the given function a joint probability density function of some random variables X and Y?

Marginal probability density function. For discrete X and Y with joint density f(x, y), the marginals are f1(x) = Σ_y f(x, y) and f2(y) = Σ_x f(x, y). Example:
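The discrete marginal can be sketched directly by summing the joint pmf over the other variable; the joint table below is hypothetical, for illustration only.

```python
# Hypothetical joint pmf table for a discrete pair (X, Y).
joint = {
    (1, 1): 0.2, (1, 2): 0.3,
    (2, 1): 0.1, (2, 2): 0.4,
}

def marginal_x(x):
    """f1(x) = sum over y of f(x, y)."""
    return sum(p for (xi, _), p in joint.items() if xi == x)

print(marginal_x(1))  # close to 0.2 + 0.3 = 0.5
```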

7.2. Bivariate Continuous Random Variables (Lecture 3, Monday 10/4/1435 AH)

7.2. Bivariate Continuous Random Variables In this section, we shall extend the idea of probability density functions of one random variable to that of two random variables. Definition 7.5. The joint probability density function of the random variables X and Y is an integrable function f(x, y) such that: (i) f(x, y) ≥ 0 for all (x, y), and (ii) the double integral of f(x, y) over the entire plane equals 1.
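A numerical sanity check of Definition 7.5: any joint density must integrate to 1 over the plane. The density f(x, y) = 4xy on the unit square is a hypothetical example used only to illustrate; a midpoint Riemann sum approximates the double integral.

```python
def f(x, y):
    """Hypothetical joint density: 4xy on the unit square, 0 elsewhere."""
    return 4 * x * y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def double_integral(f, n=400):
    """Midpoint Riemann sum of f over [0,1] x [0,1] on an n x n grid."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += f((i + 0.5) * h, (j + 0.5) * h)
    return total * h * h

print(round(double_integral(f), 4))  # approximately 1.0
```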

7.2. Bivariate Continuous Random Variables Example 7.6. Let the joint density function of X and Y be given by What is the value of the constant k?

REMARK: If we know the joint probability density function f of the random variables X and Y, then we can compute the probability of an event A from P((X, Y) ∈ A) = the double integral of f(x, y) over A.

Bivariate Continuous Random Variables Example 7.7. Let the joint density of the continuous random variables X and Y be What is the probability of the event

Marginal probability density function: Definition 7.6. Let (X, Y) be a continuous bivariate random variable and let f(x, y) be the joint probability density function of X and Y. The function f1(x) = ∫ f(x, y) dy, integrating over all y, is called the marginal probability density function of X.

Marginal probability density function: Similarly, the function f2(y) = ∫ f(x, y) dx, integrating over all x, is called the marginal probability density function of Y.
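The continuous marginal can be checked numerically in the same way: integrate the joint density over y for a fixed x. The density f(x, y) = 4xy on the unit square is again a hypothetical example, whose X-marginal works out to f1(x) = 2x.

```python
def f(x, y):
    """Hypothetical joint density: 4xy on the unit square, 0 elsewhere."""
    return 4 * x * y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def marginal_density_x(x, n=1000):
    """Midpoint-rule approximation of f1(x) = integral of f(x, y) dy."""
    h = 1.0 / n
    return sum(f(x, (j + 0.5) * h) for j in range(n)) * h

print(round(marginal_density_x(0.3), 4))  # close to 2 * 0.3 = 0.6
```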

Marginal probability density function: Example 7.9. If the joint density function for X and Y is given by: What is the marginal density of X where nonzero?

Definition 7.7. Let X and Y be continuous random variables with joint probability density function f(x, y). The joint cumulative distribution function F(x, y) of X and Y is defined as F(x, y) = P(X ≤ x, Y ≤ y), the double integral of f(u, v) over u ≤ x and v ≤ y.

The joint cumulative distribution function F(x, y): From the fundamental theorem of calculus, we again obtain f(x, y) = ∂²F(x, y)/∂x∂y.
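As a worked illustration of this relationship, assuming the hypothetical density f(x, y) = 4xy on the unit square (and 0 elsewhere):

```latex
F(x,y) = \int_0^x \!\! \int_0^y 4st \, dt \, ds = x^2 y^2,
\qquad 0 \le x \le 1,\ 0 \le y \le 1,
\qquad\text{and}\qquad
\frac{\partial^2 F}{\partial x \, \partial y}(x,y)
  = \frac{\partial^2}{\partial x \, \partial y}\, x^2 y^2
  = 4xy = f(x,y).
```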

The joint cumulative distribution function F(x, y): Example 7.11. If the joint cumulative distribution function of X and Y is given by then what is the joint density of X and Y ?

EXERCISES: Pages 208-209: 1, 2, 3, 4, 7, 8, 10, 11.

7.3. Conditional Distributions First, we motivate the definition of the conditional distribution using discrete random variables and then, based on this motivation, give a general definition. Let X and Y be two discrete random variables with joint probability density f(x, y).

7.3. Conditional Distributions Then, by the definition of the joint probability density, we have f(x, y) = P(X = x, Y = y). If A = {X = x}, B = {Y = y}, and f2(y) = P(Y = y), then from the above equation we have P({X = x} / {Y = y}) = P(A/B) = P(A ∩ B)/P(B) = f(x, y)/f2(y).

7.3. Conditional Distributions If we write P({X = x} / {Y = y}) as g(x / y), then we have g(x / y) = f(x, y)/f2(y).

7.3. Conditional Distributions Definition 7.8. Let X and Y be any two random variables with joint density f(x, y) and marginals f1(x) and f2(y). The conditional probability density function g of X, given (the event) Y = y, is defined as g(x / y) = f(x, y)/f2(y), provided f2(y) > 0.

7.3. Conditional Distributions Similarly, the conditional probability density function h of Y, given (the event) X = x, is defined as h(y / x) = f(x, y)/f1(x), provided f1(x) > 0.
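For the discrete case, the definitions above can be sketched directly from a joint table: g(x / y) is the joint probability divided by the Y-marginal. The table is hypothetical, for illustration only.

```python
# Hypothetical joint pmf table for a discrete pair (X, Y).
joint = {
    (1, 1): 0.2, (1, 2): 0.3,
    (2, 1): 0.1, (2, 2): 0.4,
}

def marginal_y(y):
    """f2(y) = sum over x of f(x, y)."""
    return sum(p for (_, yi), p in joint.items() if yi == y)

def conditional_x_given_y(x, y):
    """g(x / y) = f(x, y) / f2(y), defined where f2(y) > 0."""
    return joint.get((x, y), 0.0) / marginal_y(y)

print(conditional_x_given_y(1, 1))  # 0.2 / 0.3, about 0.667
```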

7.3. Conditional Distributions Example 7.14. Let X and Y be discrete random variables with joint probability function What is the conditional probability density function of Y, given X = 2 ?

7.3. Conditional Distributions Example 7.15. Let X and Y be discrete random variables with joint probability function What is the conditional probability density function of Y, given X = x ?

7.3. Conditional Distributions Example 7.16. Let X and Y be continuous random variables with joint pdf What is the conditional probability density function of Y, given X = x?

7.3. Conditional Distributions Example 7.17. Let X and Y be random variables such that X has pdf and the conditional density of Y given X = x is

7.3. Conditional Distributions What is the conditional density of X given Y = y over the appropriate domain?

7.4. Independence of Random Variables In this section, we define the concept of stochastic independence of two random variables X and Y. The conditional probability density function g of X given Y = y usually depends on y. If g is independent of y, then the random variables X and Y are said to be independent. This motivates the following definition.

7.4. Independence of Random Variables Definition 7.8. Let X and Y be any two random variables with joint density f(x, y) and marginals f1(x) and f2(y). The random variables X and Y are (stochastically) independent if and only if f(x, y) = f1(x) f2(y) for all (x, y).
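The factorization condition is easy to check mechanically for a finite joint table: compute both marginals and compare f1(x)·f2(y) with f(x, y) at every point. Both tables below are hypothetical; the first factors, the second does not.

```python
def is_independent(joint, tol=1e-9):
    """Check f(x, y) == f1(x) * f2(y) at every point of a finite joint table."""
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    f1 = {x: sum(joint.get((x, y), 0.0) for y in ys) for x in xs}
    f2 = {y: sum(joint.get((x, y), 0.0) for x in xs) for y in ys}
    return all(abs(joint.get((x, y), 0.0) - f1[x] * f2[y]) <= tol
               for x in xs for y in ys)

# Product table: X and Y are independent fair bits.
independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
# Perfectly correlated pair: X = Y, so the joint does not factor.
dependent = {(0, 0): 0.5, (1, 1): 0.5}

print(is_independent(independent), is_independent(dependent))  # True False
```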

7.4. Independence of Random Variables Example 7.18. Let X and Y be discrete random variables with joint density Are X and Y stochastically independent?

7.4. Independence of Random Variables Example 7.19. Let X and Y have the joint density Are X and Y stochastically independent?

7.4. Independence of Random Variables Example 7.20. Let X and Y have the joint density Are X and Y stochastically independent?

7.4. Independence of Random Variables Definition 7.9. The random variables X and Y are said to be independent and identically distributed (IID) if and only if they are independent and have the same distribution.

EXERCISES: Pages 210-211: 14, 16, 21.