Section 10.5

Let X be any random variable with (finite) mean μ and (finite) variance σ². We shall assume X is a continuous-type random variable with p.d.f. f(x), but what follows applies to a discrete-type random variable if integral signs are replaced by summation signs and the p.d.f. is replaced by a p.m.f. For any k ≥ 1, we observe that

σ² = E[(X − μ)²] = ∫₋∞^∞ (x − μ)² f(x) dx

= ∫{x : |x−μ| ≥ kσ} (x − μ)² f(x) dx + ∫{x : |x−μ| < kσ} (x − μ)² f(x) dx

≥ ∫{x : |x−μ| ≥ kσ} (x − μ)² f(x) dx

≥ ∫{x : |x−μ| ≥ kσ} k²σ² f(x) dx = k²σ² ∫{x : |x−μ| ≥ kσ} f(x) dx

We now have that

σ² ≥ k²σ² P(|X − μ| ≥ kσ),

which implies that

P(|X − μ| ≥ kσ) ≤ 1/k².

This is Chebyshev's inequality, stated as a theorem in the text. We may also write

P(|X − μ| < kσ) ≥ 1 − 1/k².
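The inequality above holds for any distribution with finite variance, however far from normal. A quick Monte Carlo sketch (assuming Python; the exponential(1) distribution, with μ = σ = 1, is an arbitrary choice for illustration) compares the empirical tail probability to the bound 1/k²:

```python
import math
import random

# Monte Carlo sketch of Chebyshev's inequality P(|X - mu| >= k*sigma) <= 1/k^2.
# The exponential(1) distribution (mu = sigma = 1) is an arbitrary choice;
# the bound holds for any distribution with finite variance.
random.seed(0)
n = 100_000
mu, sigma = 1.0, 1.0
samples = [random.expovariate(1.0) for _ in range(n)]

for k in (1.5, 2.0, 3.0):
    tail = sum(abs(x - mu) >= k * sigma for x in samples) / n
    bound = 1.0 / k**2
    print(f"k = {k}: empirical tail = {tail:.4f} <= bound = {bound:.4f}")
    assert tail <= bound  # Chebyshev is loose here, so this holds comfortably
```

For the exponential tail the exact probability P(X ≥ 1 + kσ) = e^(−(1+k)) is far below 1/k², which is typical: Chebyshev trades tightness for complete generality.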

1. Let X be a random selection from one of the first 9 positive integers.

(a) Find the mean and variance of X.

μ = E(X) = 5, σ² = Var(X) = (9² − 1)/12 = 20/3

(b) Find P(|X − 5| ≥ 4).

P(|X − 5| ≥ 4) = P(X = 1 ∪ X = 9) = 2/9

(c) If Y is any random variable with the same mean and variance as X, find the upper bound on P(|Y − 5| ≥ 4) that we get from Chebyshev's inequality.

P(|Y − 5| ≥ 4) = P[|Y − 5| ≥ 4√(3/20) · √(20/3)] ≤ 1/k² = (20/3)/16 = 5/12, where k = 4√(3/20)
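The arithmetic in Exercise 1 can be verified exactly with rational arithmetic (a sketch assuming Python; `fractions.Fraction` keeps every quantity exact):

```python
from fractions import Fraction

# Check of Exercise 1 above: X is equally likely to be any of 1, ..., 9.
values = range(1, 10)
mu = sum(Fraction(x, 9) for x in values)          # E(X) = 5
var = sum((x - mu) ** 2 for x in values) / 9      # Var(X) = 20/3
print(mu, var)                                    # 5 20/3

# (b) Exact tail: |X - 5| >= 4 only for X = 1 or X = 9.
exact = sum(Fraction(1, 9) for x in values if abs(x - mu) >= 4)
print(exact)                                      # 2/9

# (c) Chebyshev bound with k*sigma = 4, so 1/k^2 = sigma^2/16.
bound = var / 16
print(bound)                                      # 5/12
```

Note that the true probability 2/9 ≈ 0.22 is well below the bound 5/12 ≈ 0.42, as Chebyshev's inequality only guarantees.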

2. Let X be a random variable with mean 100 and variance 75.

(a) Find the lower bound on P(|X − 100| < 10) that we get from Chebyshev's inequality.

P(|X − 100| < 10) = P[|X − 100| < (2/√3)(5√3)] ≥ 1 − 1/(2/√3)² = 1 − 3/4 = 1/4, where k = 2/√3

(b) Find what the value of P(|X − 100| < 10) would be, if X had a U(85, 115) distribution.

P(|X − 100| < 10) = P(90 < X < 110) = 20/30 = 2/3
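Both parts of Exercise 2 reduce to one-line computations; a sketch (assuming Python) checks them numerically:

```python
import math

# Check of Exercise 2 above: mean 100, variance 75, so sigma = 5*sqrt(3).
sigma = math.sqrt(75)
k = 10 / sigma                         # 10 = k*sigma gives k = 2/sqrt(3)
chebyshev_lower = 1 - 1 / k**2
print(round(chebyshev_lower, 4))       # 0.25

# (b) Under U(85, 115) the probability of (90, 110) is its relative length.
uniform_prob = (110 - 90) / (115 - 85)
print(round(uniform_prob, 4))          # 0.6667
```

The uniform case illustrates how conservative the bound is: the actual probability 2/3 is well above the guaranteed minimum of 1/4.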

3. Let Y have a b(n, 0.75) distribution.

(a) Find the lower bound on P(|Y / n − 0.75| < 0.05) that we get from Chebyshev's inequality when n = 12.

When n = 12, E(Y) = 9 and Var(Y) = 2.25, and

P(|Y / n − 0.75| < 0.05) = P(|Y − 9| < 0.6) = P[|Y − 9| < (0.4)(1.5)]

(No useful lower bound can be found, since k = 0.4 < 1.)

(b) Find the exact value of P(|Y / n − 0.75| < 0.05) when n = 12.

When n = 12, P(|Y / n − 0.75| < 0.05) = P(|Y − 9| < 0.6) = P(Y = 9) = 0.6488 − 0.3907 = 0.2581
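The exact value in part (b) can be computed directly from the binomial p.m.f. rather than read as a difference of tabled c.d.f. values (a sketch assuming Python):

```python
from math import comb

# Check of Exercise 3(b) above: Y ~ b(12, 0.75), and the event
# |Y/12 - 0.75| < 0.05 is 8.4 < Y < 9.6, i.e. Y = 9 exactly.
n, p = 12, 0.75
p_exact = comb(n, 9) * p**9 * (1 - p) ** (n - 9)
print(round(p_exact, 4))               # 0.2581
```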

(c) Find the lower bound on P(|Y / n − 0.75| < 0.05) that we get from Chebyshev's inequality when n = 300.

When n = 300, E(Y) = 225 and Var(Y) = 56.25, and

P(|Y / n − 0.75| < 0.05) = P(|Y − 225| < 15) = P[|Y − 225| < (2)(7.5)] ≥ 1 − 1/2² = 3/4

(d) By using the normal approximation, show that when n = 300, P(|Y / n − 0.75| < 0.05) is approximately 0.95.
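Parts (c) and (d) can be checked together (a sketch assuming Python; `phi` is a helper built from `math.erf`, not a library routine, and the exact binomial sum is included only for comparison):

```python
import math
from math import comb

# Check of Exercise 3(c)/(d) above: Y ~ b(300, 0.75), so E(Y) = 225,
# Var(Y) = 56.25, sigma = 7.5, and |Y - 225| < 15 means 211 <= Y <= 239.
n, p = 300, 0.75
mu = n * p
sigma = math.sqrt(n * p * (1 - p))

# (c) Chebyshev lower bound with k = 15/sigma = 2.
k = 15 / sigma
print(1 - 1 / k**2)                    # 0.75

def phi(z):
    """Standard normal c.d.f. via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# (d) Normal approximation with the usual continuity correction.
approx = phi((239.5 - mu) / sigma) - phi((210.5 - mu) / sigma)
print(round(approx, 4))

# Exact binomial probability, for comparison with the approximation.
exact = sum(comb(n, y) * p**y * (1 - p) ** (n - y) for y in range(211, 240))
print(round(exact, 4))
```

Both the approximation and the exact sum come out near 0.95, comfortably above the Chebyshev guarantee of 0.75, which again shows how much sharper a distribution-specific calculation is than the universal bound.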