Section 5.3 Suppose X_1, X_2, …, X_n are independent random variables (each of which may be of the discrete type or of the continuous type), all from the same distribution. Then we say that the collection of independent and identically distributed (i.i.d.) random variables X_1, X_2, …, X_n is a random sample of size n from the common distribution.

Important Theorems in the Text:
Theorem (See Class Exercises #1(a)(b) in Section 5.2 and #1(e) in Section 5.3.)
Theorem (See Class Exercise #10 in Sections 4.1, 4.2, 4.3.)
Theorem (See Class Exercises #13 & #15 in Sections 4.1, 4.2, 4.3.)
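As a quick illustration (ours, not from the text): drawing a random sample of size n amounts to making n independent draws from one common distribution. A minimal Python sketch, where the pmf dictionary and the random_sample helper are illustrative names of our own:

```python
# Minimal sketch: a random sample of size n is n i.i.d. draws from
# one common distribution, here given as a {value: probability} dict.
import random

def random_sample(n, pmf):
    values, weights = zip(*pmf.items())
    return [random.choices(values, weights=weights)[0] for _ in range(n)]

# e.g., a sample of size 5 from the chip distribution of Exercise 1 below
print(random_sample(5, {1: 3/6, 2: 2/6, 3: 1/6}))
```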

1. (a) An urn contains six chips: one $3 chip, two $2 chips, and three $1 chips. Two chips are selected at random and with replacement. The following random variables are defined: X_1 = dollar value of the first chip selected, X_2 = dollar value of the second chip selected. Explain why X_1, X_2 can be treated as a random sample of size 2 from a common distribution, and find the p.m.f. for that common distribution.

Since the selection is done with replacement, X_1 and X_2 are independent and identically distributed, so they form a random sample of size 2. The p.m.f. for the common distribution is

f(x) = (4 - x)/6 if x = 1, 2, 3.
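A hedged simulation check (ours, not the text's): sampling the urn with replacement should reproduce f(x) = (4 - x)/6 as relative frequencies.

```python
# Simulate with-replacement draws from the urn and compare the relative
# frequency of each dollar value with the claimed p.m.f. f(x) = (4 - x)/6.
import random
from collections import Counter

urn = [3, 2, 2, 1, 1, 1]   # one $3 chip, two $2 chips, three $1 chips
n = 60_000
freq = Counter(random.choice(urn) for _ in range(n))
for x in (1, 2, 3):
    print(x, freq[x] / n, (4 - x) / 6)   # empirical vs. theoretical
```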

1. - continued
(b) Find the joint p.m.f. of (X_1, X_2).

The joint p.m.f. is

f(x_1) f(x_2) = (4 - x_1)(4 - x_2)/36 if x_1 = 1, 2, 3 and x_2 = 1, 2, 3.

(c) Find the mean and variance for each of X_1 and X_2.

E(X_1) = E(X_2) = E(X) = (1)(3/6) + (2)(2/6) + (3)(1/6) = 5/3
E(X_1^2) = E(X_2^2) = E(X^2) = (1)(3/6) + (4)(2/6) + (9)(1/6) = 10/3
Var(X_1) = Var(X_2) = Var(X) = 10/3 - (5/3)^2 = 5/9
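These moments are easy to confirm numerically; the following sketch (ours) computes them exactly from the p.m.f.

```python
# Compute the mean, second moment, and variance from f(x) = (4 - x)/6.
from fractions import Fraction

pmf = {x: Fraction(4 - x, 6) for x in (1, 2, 3)}
mean = sum(x * p for x, p in pmf.items())        # 5/3
second = sum(x**2 * p for x, p in pmf.items())   # 10/3
print(mean, second, second - mean**2)            # variance is 5/9
```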

(d) Find P(X_1 + X_2 < 4) two ways: from the p.m.f. of the common distribution and also from the joint p.m.f.

P(X_1 + X_2 < 4) = P(X_1 = 1, X_2 = 1) + P(X_1 = 1, X_2 = 2) + P(X_1 = 2, X_2 = 1)
= (3/6)(3/6) + (3/6)(2/6) + (2/6)(3/6) = 21/36 = 7/12 (from the common p.m.f.)
or
= (3)(3)/36 + (3)(2)/36 + (2)(3)/36 = 21/36 = 7/12 (from the joint p.m.f.)

(e) Find E(X_1 + X_2) two ways: from the p.m.f. of the common distribution and also from the joint p.m.f.

E(X_1 + X_2) = E(X_1) + E(X_2) = 5/3 + 5/3 = 10/3
or, from the joint p.m.f.,
E(X_1 + X_2) = (1+1)(9/36) + (1+2)(6/36) + (1+3)(3/36) + (2+1)(6/36) + (2+2)(4/36) + (2+3)(2/36) + (3+1)(3/36) + (3+2)(2/36) + (3+3)(1/36) = 120/36 = 10/3
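Both computations can be verified by enumerating the nine points of the joint p.m.f.; a short sketch (ours):

```python
# Enumerate f(x1)f(x2) over all nine points to confirm
# P(X1 + X2 < 4) = 7/12 and E(X1 + X2) = 10/3.
from fractions import Fraction

pmf = {x: Fraction(4 - x, 6) for x in (1, 2, 3)}
joint = {(x1, x2): pmf[x1] * pmf[x2] for x1 in pmf for x2 in pmf}
print(sum(p for (x1, x2), p in joint.items() if x1 + x2 < 4))  # 7/12
print(sum((x1 + x2) * p for (x1, x2), p in joint.items()))     # 10/3
```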

1. - continued
(f) Find E[1 / (X_1 + X_2)].

E[1 / (X_1 + X_2)] = (1/2)(9/36) + (1/3)(6/36) + (1/4)(3/36) + (1/3)(6/36) + (1/4)(4/36) + (1/5)(2/36) + (1/4)(3/36) + (1/5)(2/36) + (1/6)(1/36) = 359/1080
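An exact-arithmetic check of this sum (our sketch, using Python's fractions module):

```python
# Exact check of E[1/(X1 + X2)] summed over the joint p.m.f.
from fractions import Fraction

pmf = {x: Fraction(4 - x, 6) for x in (1, 2, 3)}
print(sum(pmf[x1] * pmf[x2] / (x1 + x2)
          for x1 in pmf for x2 in pmf))   # 359/1080
```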

(g) Let Y = X_1 + X_2. Find the m.g.f. of Y, and find the p.m.f. of Y.

M_Y(t) = E(e^{tY}) = E[e^{t(X_1 + X_2)}] = E[e^{tX_1} e^{tX_2}] = E[e^{tX_1}] E[e^{tX_2}]
= [(1/2)e^t + (1/3)e^{2t} + (1/6)e^{3t}]^2
= (1/4)e^{2t} + (1/3)e^{3t} + (5/18)e^{4t} + (1/9)e^{5t} + (1/36)e^{6t}

The p.m.f. of Y is
g(y) = 1/4 if y = 2
       1/3 if y = 3
       5/18 if y = 4
       1/9 if y = 5
       1/36 if y = 6
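Squaring the m.g.f. is the same computation as convolving the common p.m.f. with itself; this sketch (ours) recovers g(y) by direct convolution:

```python
# Convolve f with itself to get g(y) = P(X1 + X2 = y); the coefficient
# of e^{yt} in M_Y(t) is exactly this probability.
from fractions import Fraction
from collections import defaultdict

pmf = {x: Fraction(4 - x, 6) for x in (1, 2, 3)}
g = defaultdict(Fraction)
for x1, p1 in pmf.items():
    for x2, p2 in pmf.items():
        g[x1 + x2] += p1 * p2
print(dict(g))   # {2: 1/4, 3: 1/3, 4: 5/18, 5: 1/9, 6: 1/36}
```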

2. Suppose W, X, and Y are mutually independent random variables, each having an exponential distribution with E(W) = 1, E(X) = 5, and E(Y) = 10.

(a) Explain why W, X, and Y cannot be treated as a random sample of size 3.

Since W, X, and Y are not identically distributed (i.e., they do not share a common distribution), these three random variables cannot be treated as a random sample.

(b) Find the joint p.d.f. of (W, X, Y).

By independence, the joint p.d.f. is the product of the marginal p.d.f.s:

f(w, x, y) = e^{-w} (e^{-x/5}/5) (e^{-y/10}/10) = e^{-w - x/5 - y/10}/50 if w > 0, x > 0, y > 0.
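For reference, a small sketch (ours) of this product density; joint_pdf is an illustrative name of our own:

```python
# Joint p.d.f. of independent exponentials with means 1, 5, and 10:
# the product of the three marginal densities.
import math

def joint_pdf(w, x, y):
    if w <= 0 or x <= 0 or y <= 0:
        return 0.0
    return math.exp(-w - x / 5 - y / 10) / 50

print(joint_pdf(1.0, 2.0, 3.0))
```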

(c) Find P[max(W, X, Y) < 2].

P[max(W, X, Y) < 2] = P(W < 2 ∩ X < 2 ∩ Y < 2) = P(W < 2) P(X < 2) P(Y < 2)
= [∫_0^2 e^{-w} dw] [∫_0^2 (e^{-x/5}/5) dx] [∫_0^2 (e^{-y/10}/10) dy]
= (1 - e^{-2})(1 - e^{-2/5})(1 - e^{-2/10})
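A Monte Carlo sanity check (ours; note random.expovariate takes the rate, i.e., 1/mean):

```python
# Estimate P[max(W, X, Y) < 2] by simulation and compare to the closed form.
import math, random

trials = 200_000
hits = sum(max(random.expovariate(1.0),     # W, mean 1
               random.expovariate(1 / 5),   # X, mean 5
               random.expovariate(1 / 10))  # Y, mean 10
           < 2 for _ in range(trials))
exact = (1 - math.exp(-2)) * (1 - math.exp(-2/5)) * (1 - math.exp(-2/10))
print(hits / trials, exact)
```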

2. - continued
(d) Find the mean and variance of 4X + 3Y - 5W.

E(4X + 3Y - 5W) = 4E(X) + 3E(Y) - 5E(W) = 4(5) + 3(10) - 5(1) = 45

Since W, X, and Y are independent, and an exponential distribution has variance equal to its squared mean,
Var(4X + 3Y - 5W) = 4^2 Var(X) + 3^2 Var(Y) + (-5)^2 Var(W) = 16(25) + 9(100) + 25(1) = 1325
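A simulation sketch (ours) to confirm both values:

```python
# Simulate 4X + 3Y - 5W and compare against E = 45 and Var = 1325.
import random, statistics

vals = [4 * random.expovariate(1 / 5) + 3 * random.expovariate(1 / 10)
        - 5 * random.expovariate(1.0) for _ in range(200_000)]
print(statistics.fmean(vals), statistics.variance(vals))   # ~45, ~1325
```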

(e) Find E(XY - W^2).

By independence, E(XY) = E(X)E(Y), so
E(XY - W^2) = E(X)E(Y) - E(W^2) = E(X)E(Y) - {Var(W) + [E(W)]^2} = (5)(10) - {1 + 1} = 48
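One last simulation check (ours), leaning on the independence of X, Y, and W:

```python
# Estimate E(XY - W^2); independence gives the exact value 48.
import random, statistics

vals = [random.expovariate(1 / 5) * random.expovariate(1 / 10)
        - random.expovariate(1.0) ** 2 for _ in range(200_000)]
print(statistics.fmean(vals))   # ~48
```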