Chapter 3-2 Discrete Random Variables. Speaker: 虞台文

Content: Functions of a Single Discrete Random Variable; Discrete Random Vectors; Independent Random Variables; Multinomial Distributions; Sums of Independent Variables: Generating Functions; Functions of Multiple Random Variables.

Functions of a Single Discrete Random Variable

A taxi driver's two questions. Once this passenger gets in, how many kilometers will the trip run (X)? X is a random variable. And how much money will I collect from his pocket (Y)? Y is also a random variable, and Y = g(X): a function of a random variable is itself a random variable. If p_X(x) is known, what is p_Y(y)?

The Problem: Y = g(X) and p_X(x) is available. Then p_Y(y) = P(g(X) = y) = Σ_{x: g(x) = y} p_X(x).
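
As a minimal sketch of this recipe (my own code, not from the slides; the pmf and the function g below are made-up placeholders), p_Y is obtained by summing p_X(x) over all x that g maps to the same y:

```python
from collections import defaultdict

def pmf_of_function(p_X, g):
    """Given a pmf as a dict {x: P(X = x)} and a function g, return the pmf of Y = g(X):
    p_Y(y) = sum of p_X(x) over all x with g(x) = y."""
    p_Y = defaultdict(float)
    for x, px in p_X.items():
        p_Y[g(x)] += px
    return dict(p_Y)

# Hypothetical example: X uniform on {-2, -1, 0, 1, 2} and Y = X**2.
p_X = {x: 0.2 for x in (-2, -1, 0, 1, 2)}
print(pmf_of_function(p_X, lambda x: x * x))   # {4: 0.4, 1: 0.4, 0: 0.2}
```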

Example 17: This bottle costs ten dollars; this bottle costs only five dollars. What luck!

Example 18: n = 10, p = 0.2. Paying 100 dollars, how many bottles (X3) are obtained? Let Y denote the number of lucky bottles obtained.

Discrete Random Vectors

Definition (Random Vectors): A discrete r-dimensional random vector X is a function X: Ω → R^r with a finite or countably infinite image {x1, x2, …}.

Example 19

Definition (Joint Pmf): Let X = (X1, X2, …, Xr) be a random vector. The joint pmf (jpmf) of X is defined as p_X(x) = P(X1 = x1, X2 = x2, …, Xr = xr), where x = (x1, x2, …, xr).

Example 20: There are three cards numbered 1, 2 and 3. Randomly draw two of them without replacement, and let X and Y be the numbers on the 1st and 2nd cards, respectively. Find the jpmf of X and Y.
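
A small sketch (my own, not from the slides) that enumerates the jpmf of Example 20: each of the 3 × 2 = 6 ordered pairs of distinct cards is equally likely, so every pair gets probability 1/6:

```python
from itertools import permutations

# Example 20: draw 2 of the cards {1, 2, 3} without replacement; X = 1st card, Y = 2nd card.
cards = [1, 2, 3]
pairs = list(permutations(cards, 2))                 # 6 equally likely ordered outcomes
p_XY = {(x, y): 1 / len(pairs) for (x, y) in pairs}

for (x, y), p in sorted(p_XY.items()):
    print(f"p_XY({x}, {y}) = {p:.4f}")               # 1/6 for every pair with x != y
```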

Properties of jpmf's: 1. p(x) ≥ 0 for all x ∈ R^r; 2. {x | p(x) ≠ 0} is a finite or countably infinite subset of R^r; 3. Σ_x p(x) = 1.

Definition (Marginal Probability Mass Functions): Let X = (X1, …, Xi, …, Xr) be an r-dimensional random vector. The ith marginal probability mass function is defined by p_{Xi}(xi) = P(Xi = xi) = Σ over all xj, j ≠ i, of p_X(x1, …, xr).

Example 21: Find p_X(x) and p_Y(y) for Example 20.
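
Continuing in the same spirit (again my own sketch), the marginals asked for in Example 21 are obtained by summing the joint pmf over the other coordinate:

```python
from collections import defaultdict
from itertools import permutations

# Joint pmf of Example 20: 1/6 on each ordered pair of distinct cards from {1, 2, 3}.
p_XY = {pair: 1 / 6 for pair in permutations([1, 2, 3], 2)}

def marginals(p_XY):
    """Return (p_X, p_Y) from a joint pmf given as {(x, y): probability}."""
    p_X, p_Y = defaultdict(float), defaultdict(float)
    for (x, y), p in p_XY.items():
        p_X[x] += p
        p_Y[y] += p
    return dict(p_X), dict(p_Y)

p_X, p_Y = marginals(p_XY)
print(p_X)   # each of 1, 2, 3 has marginal probability 1/3
print(p_Y)   # likewise 1/3 each, by symmetry
```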

Example 22: With X = #… and Y = #… as defined by the figure on the original slide, find: 1. p_{X,Y}(x, y); 2. p_X(x) and p_Y(y); 3. P(X < 3); 4. P(X + Y < 4).
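
The joint table for Example 22 is not preserved in this transcript, so the sketch below uses a made-up joint pmf purely to show how the four requested quantities would be computed from any dictionary of the form {(x, y): probability}:

```python
# Hypothetical joint pmf (NOT the one from the original slide), just to illustrate the recipe.
p_XY = {(1, 1): 0.10, (1, 2): 0.20, (2, 1): 0.25, (2, 2): 0.15, (3, 1): 0.20, (3, 2): 0.10}

# 1-2. marginals: sum the joint pmf over the other coordinate
p_X = {x: sum(p for (xx, y), p in p_XY.items() if xx == x) for x in {x for x, _ in p_XY}}
p_Y = {y: sum(p for (x, yy), p in p_XY.items() if yy == y) for y in {y for _, y in p_XY}}

# 3-4. event probabilities: sum the joint pmf over the pairs belonging to the event
P_X_lt_3 = sum(p for (x, y), p in p_XY.items() if x < 3)
P_sum_lt_4 = sum(p for (x, y), p in p_XY.items() if x + y < 4)

print(p_X, p_Y, P_X_lt_3, P_sum_lt_4)
```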

Independent Random Variables

Definition: Let X1, X2, …, Xr be r discrete random variables with pmf's p_{X1}, p_{X2}, …, p_{Xr}, respectively. These random variables are said to be mutually independent if their jpmf satisfies p(x1, x2, …, xr) = p_{X1}(x1) p_{X2}(x2) ⋯ p_{Xr}(xr) for all (x1, x2, …, xr).

Example 23: Toss two dice and let X and Y be the face values of the 1st and 2nd die, respectively. 1. p_{X,Y}(x, y) = ? 2. Are X and Y independent?
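
A small numerical sketch (my own) that builds the joint pmf of Example 23 and verifies the factorization condition of the definition above:

```python
from itertools import product

# Two fair dice: 36 equally likely ordered outcomes.
p_XY = {(x, y): 1 / 36 for x, y in product(range(1, 7), repeat=2)}
p_X = {x: sum(p for (xx, _), p in p_XY.items() if xx == x) for x in range(1, 7)}
p_Y = {y: sum(p for (_, yy), p in p_XY.items() if yy == y) for y in range(1, 7)}

independent = all(abs(p_XY[(x, y)] - p_X[x] * p_Y[y]) < 1e-12
                  for x, y in p_XY)
print(independent)   # True: the joint pmf factors, so X and Y are independent
```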

Fact: if X and Y are independent, then P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B) for any sets A and B of values.

Example 24: Consider Example 23. Find P(X ≤ 2, Y ≤ 4).

What is the meaning of Z1 here?

Fact: independence can be checked equivalently through the cdf or the pmf: X and Y are independent if and only if F_{X,Y}(x, y) = F_X(x) F_Y(y) for all x and y, equivalently p_{X,Y}(x, y) = p_X(x) p_Y(y) for all x and y.

Multinomial Distributions

Generalized Bernoulli Trials: a sequence of n independent trials, where each trial has r distinct outcomes with probabilities p1, p2, …, pr such that p1 + p2 + ⋯ + pr = 1.

Multinomial Distributions: Define X = (X1, X2, …, Xr) such that Xi is the number of trials that resulted in the ith outcome. Then X1 + X2 + ⋯ + Xr = n, and the jpmf satisfies p_X(x1, x2, …, xr) = n! / (x1! x2! ⋯ xr!) · p1^{x1} p2^{x2} ⋯ pr^{xr} for nonnegative integers with x1 + x2 + ⋯ + xr = n.

Example 26: If a pair of dice is tossed 6 times, what is the probability of obtaining a total of 7 or 11 twice, a matching pair once, and any other combination three times? There are three outcomes per toss: 1. a total of 7 or 11; 2. a matching pair; 3. anything else. Let X1 = # of tosses giving 7 or 11, X2 = # of matching pairs, and X3 = # of others.
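
A sketch of the multinomial calculation for Example 26 (my own code; the single-trial probabilities 8/36, 6/36 and 22/36 come from counting the 36 equally likely outcomes of a pair of fair dice):

```python
from math import factorial

def multinomial_pmf(counts, probs):
    """P(X1 = counts[0], ..., Xr = counts[r-1]) for a multinomial with n = sum(counts)."""
    n = sum(counts)
    coef = factorial(n)
    for x in counts:
        coef //= factorial(x)        # multinomial coefficient n! / (x1! x2! ... xr!)
    p = float(coef)
    for x, q in zip(counts, probs):
        p *= q ** x
    return p

p1 = 8 / 36    # total of 7 or 11
p2 = 6 / 36    # matching pair (doubles)
p3 = 22 / 36   # any other combination
print(multinomial_pmf([2, 1, 3], [p1, p2, p3]))   # about 0.1127
```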

Sums of Independent Variables: Generating Functions

The Sum of Independent Random Variables: if X and Y are independent discrete random variables, then p_{X+Y}(z) = P(X + Y = z) = Σ_x p_X(x) p_Y(z - x), the discrete convolution of p_X and p_Y.

Example 27: Let X and Y be two independent random variables, each uniformly distributed over {0, 1, 2, …, n}. Find P(X + Y = z). Case 1: for z ∈ {0, 1, …, n}, P(X + Y = z) = (z + 1)/(n + 1)^2. Case 2: for z ∈ {n+1, n+2, …, 2n}, P(X + Y = z) = (2n - z + 1)/(n + 1)^2.
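
A numerical cross-check of Example 27 (my own sketch, using n = 4): compute P(X + Y = z) by the convolution sum and compare it with the two closed-form cases above:

```python
n = 4
p = 1 / (n + 1)                       # uniform pmf on {0, 1, ..., n}

def p_sum(z):
    """Convolution: P(X + Y = z) = sum over x of P(X = x) P(Y = z - x)."""
    return sum(p * p for x in range(0, n + 1) if 0 <= z - x <= n)

for z in range(0, 2 * n + 1):
    closed = (z + 1) / (n + 1) ** 2 if z <= n else (2 * n - z + 1) / (n + 1) ** 2
    print(z, p_sum(z), closed)        # the two columns agree for every z
```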

Probability Generating Functions (機率母函數): the coefficients of the generating function are probabilities.

Probability Generating Functions: Let X be a nonnegative integer-valued random variable. Its probability generating function (pgf) G_X(t) is defined as G_X(t) = Σ_{x=0}^{∞} p_X(x) t^x.

Compute the pgf's for the following distributions: 1. X ~ B(n, p); 2. Y ~ P(λ); 3. Z ~ G(p); 4. U ~ NB(r, p) (exercise).
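
A numerical sanity check (my own sketch) of the closed-form pgf's usually obtained for the first three cases, assuming the geometric distribution G(p) is supported on {1, 2, ...}; the negative binomial case is left out here:

```python
import math

def pgf(pmf, t, terms=60):
    """Numerically evaluate G(t) = sum_x p(x) t^x by truncating the series."""
    return sum(pmf(x) * t ** x for x in range(terms))

n, p, lam, t = 10, 0.2, 3.0, 0.7

binom = lambda x: math.comb(n, x) * p ** x * (1 - p) ** (n - x) if x <= n else 0.0
poisson = lambda x: math.exp(-lam) * lam ** x / math.factorial(x)
geometric = lambda x: p * (1 - p) ** (x - 1) if x >= 1 else 0.0   # support {1, 2, ...}

print(pgf(binom, t),     (1 - p + p * t) ** n)          # binomial:  (1 - p + p t)^n
print(pgf(poisson, t),   math.exp(lam * (t - 1)))       # Poisson:   e^{lam (t - 1)}
print(pgf(geometric, t), p * t / (1 - (1 - p) * t))     # geometric: p t / (1 - (1 - p) t)
```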

Important Generating Functions

Theorem 2 (Sums of Independent Random Variables): Let X and Y be two independent, nonnegative integer-valued random variables with pgf's G_X(t) and G_Y(t). Then G_{X+Y}(t) = G_X(t) G_Y(t).

Proof sketch: let Z = X + Y. By independence, p_Z(z) = Σ_{x=0}^{z} p_X(x) p_Y(z - x), which is exactly the coefficient of t^z in the product G_X(t) G_Y(t); hence G_Z(t) = G_X(t) G_Y(t).

Example 29: Use the pgf to recompute Example 27, where X and Y are two independent random variables, each uniformly distributed over {0, 1, 2, …, n}, and P(X + Y = z) is required.
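
A sketch (my own) of the pgf route for Example 29: the pgf of a uniform variable on {0, 1, ..., n} is the polynomial (1 + t + ... + t^n)/(n + 1); by Theorem 2 the pgf of X + Y is its square, and the coefficient of t^z is P(X + Y = z):

```python
import numpy as np

n = 4
g = np.ones(n + 1) / (n + 1)          # coefficients of G_X(t) = (1 + t + ... + t^n)/(n + 1)
g_sum = np.convolve(g, g)             # polynomial product = pgf of X + Y (Theorem 2)

for z, prob in enumerate(g_sum):
    print(z, prob)                    # matches Example 27: (z+1)/(n+1)^2, then (2n-z+1)/(n+1)^2
```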

Theorem 3

Memorize these!!! Apply them flexibly when solving problems.

Functions of Multiple Random Variables

Functions of Multiple Random Variables: Let X and Y be two random variables with jpmf p_{X,Y}(x, y), and suppose (U, V) is obtained from (X, Y) by a 1-1 transformation. What is p_{U,V}(u, v)? (Example: X $/month and Y $/month, with p_{X,Y}(x, y) known; find p_{U,V}(u, v).) Being 1-1, the transformation is invertible, so each (u, v) corresponds to exactly one point (x(u, v), y(u, v)), and p_{U,V}(u, v) = p_{X,Y}(x(u, v), y(u, v)).

Example 30: Let X ~ B(n, p1) and Y ~ B(m, p2) be two independent random variables, and let U = X + Y and V = X - Y. Find p_{U,V}(u, v). The inverse transformation is X = (U + V)/2 and Y = (U - V)/2.
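
A sketch of Example 30's change of variables (my own code, assuming V = X - Y as reconstructed above): p_{U,V}(u, v) is p_X(x) p_Y(y) evaluated at the inverse point x = (u + v)/2, y = (u - v)/2, and the direct enumeration below confirms it for small n and m:

```python
from math import comb
from itertools import product

n, p1 = 3, 0.4
m, p2 = 2, 0.6
bX = lambda x: comb(n, x) * p1 ** x * (1 - p1) ** (n - x)   # pmf of X ~ B(n, p1)
bY = lambda y: comb(m, y) * p2 ** y * (1 - p2) ** (m - y)   # pmf of Y ~ B(m, p2)

def p_UV(u, v):
    """p_{U,V}(u, v) via the inverse map x = (u + v) / 2, y = (u - v) / 2."""
    if (u + v) % 2:                       # x and y must be integers
        return 0.0
    x, y = (u + v) // 2, (u - v) // 2
    if 0 <= x <= n and 0 <= y <= m:
        return bX(x) * bY(y)              # independence: joint pmf factors
    return 0.0

# Cross-check against direct enumeration of (X, Y).
for x, y in product(range(n + 1), range(m + 1)):
    u, v = x + y, x - y
    assert abs(p_UV(u, v) - bX(x) * bY(y)) < 1e-12
print("change-of-variables formula verified")
```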