Continuous Random Variables
L. Wang, Department of Statistics, University of South Carolina

Slide 2: Continuous Random Variable
A continuous random variable is one for which the outcome can be any value in an interval of the real number line. It is usually a measurement.
Examples:
– Let Y = length in mm
– Let Y = time in seconds
– Let Y = temperature in ºC

Slide 3: Continuous Random Variable
We don't calculate P(Y = y); we calculate P(a < Y < b), where a and b are real numbers. For a continuous random variable, P(Y = y) = 0.

Slide 4: Continuous Random Variables
The probability density function (pdf), when plotted against the possible values of Y, forms a curve. The area under the curve over an interval is equal to the probability that Y is in that interval.
[Figure: a density curve f(y) with the area between a and b shaded, representing P(a < Y < b).]

Slide 5:
The entire area under a probability density curve for a continuous random variable
A. Is always greater than 1.
B. Is always less than 1.
C. Is always equal to 1.
D. Is undeterminable.

Slide 6: Properties of a Probability Density Function (pdf)
1) f(y) ≥ 0 for all possible values of y.
2) ∫_{−∞}^{∞} f(y) dy = 1.
3) If y₀ is a specific value of interest, then the cumulative distribution function (cdf) is F(y₀) = P(Y ≤ y₀) = ∫_{−∞}^{y₀} f(y) dy.
4) If y₁ and y₂ are specific values of interest, then P(y₁ ≤ Y ≤ y₂) = ∫_{y₁}^{y₂} f(y) dy = F(y₂) − F(y₁).

Slide 7: Example
Grams of lead per liter of gasoline has the probability density function:
f(y) = 12.5y − 1.25 for 0.1 ≤ y ≤ 0.5
What is the probability that the next liter of gasoline has less than 0.3 grams of lead?
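A minimal numerical sketch of this example (not part of the original slides), assuming the density f(y) = 12.5y − 1.25 on [0.1, 0.5] as written above: it checks that the density integrates to 1 and computes P(Y < 0.3) by integration.

```python
from scipy import integrate

# Density of grams of lead per liter (assumed form: f(y) = 12.5*y - 1.25 on [0.1, 0.5])
def f(y):
    return 12.5 * y - 1.25

total, _ = integrate.quad(f, 0.1, 0.5)      # should be 1 for a valid pdf
p_less_03, _ = integrate.quad(f, 0.1, 0.3)  # P(Y < 0.3)

print(total)      # ~1.0
print(p_less_03)  # ~0.25
```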

Slide 8: Example
Suppose a random variable Y has the following probability density function:
f(y) = y if 0 < y ≤ 1
       2 − y if 1 < y < 2
       0 otherwise
Find the complete form of the cumulative distribution function F(y) for any real value y.
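One way to work this out (a Python sketch, not from the original slides) is to integrate the pieces; the closed form it reproduces is F(y) = 0 for y ≤ 0, y²/2 for 0 < y ≤ 1, 2y − y²/2 − 1 for 1 < y ≤ 2, and 1 for y > 2.

```python
from scipy import integrate

def f(y):
    """Triangular density from the slide."""
    if 0 < y <= 1:
        return y
    if 1 < y < 2:
        return 2 - y
    return 0.0

def F_closed(y):
    """Closed-form cdf obtained by integrating f piece by piece."""
    if y <= 0:
        return 0.0
    if y <= 1:
        return y**2 / 2
    if y <= 2:
        return 2*y - y**2/2 - 1
    return 1.0

# Spot-check the closed form against numerical integration of f.
for y in (0.5, 1.0, 1.5, 2.0):
    numeric, _ = integrate.quad(f, 0, y)
    print(y, F_closed(y), round(numeric, 6))
```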

Slide 9: Expected Value for a Continuous Random Variable
Recall the expected value for a discrete random variable:
E[Y] = Σ_y y p(y)
Expected value for a continuous random variable:
E[Y] = ∫_{−∞}^{∞} y f(y) dy

Slide 10: Variance for a Continuous Random Variable
Recall the variance for a discrete random variable:
Var[Y] = E[(Y − μ)²] = Σ_y (y − μ)² p(y)
Variance for a continuous random variable:
Var[Y] = E[(Y − μ)²] = ∫_{−∞}^{∞} (y − μ)² f(y) dy = E[Y²] − μ²
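As an illustration (a sketch, not from the original slides), these definitions can be checked numerically for the triangular density of Slide 8; its mean works out to 1 and its variance to 1/6.

```python
from scipy import integrate

def f(y):
    """Triangular density from Slide 8."""
    if 0 < y <= 1:
        return y
    if 1 < y < 2:
        return 2 - y
    return 0.0

mean, _ = integrate.quad(lambda y: y * f(y), 0, 2)
ey2, _ = integrate.quad(lambda y: y**2 * f(y), 0, 2)
var = ey2 - mean**2

print(mean)  # ~1.0
print(var)   # ~0.1667 (= 1/6)
```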

Slide 11: Differences between Discrete and Continuous Random Variables
– Possible values that can be assumed
– Probability distribution function
– Cumulative distribution function
– Expected value
– Variance

Slide 12: Times Between Industrial Accidents
The times between accidents for a 10-year period at a DuPont facility can be modeled by the exponential distribution:
f(y) = λ e^(−λy) for y > 0,
where λ is the accident rate (the expected number of accidents per day in this case).

Slide 13: Example of Time Between Accidents
Let Y = the number of days between two accidents.
[Figure: a timeline marking Accident #1, #2, and #3, with gaps of 12 days, 35 days, and 5 days between consecutive accidents.]

Slide 14: Times Between Industrial Accidents
Suppose in a 1000-day period there were 50 accidents.
λ = 50/1000 = 0.05 accidents per day, or
1/λ = 1000/50 = 20 days between accidents.

Slide 15: Example
What is the probability that this facility will go less than 10 days between the next two accidents?
f(y) = 0.05 e^(−0.05y)
[Figure: the exponential density with the area below 10 days shaded.]

Slide 16: Example (continued)
P(Y < 10) = ∫_0^10 0.05 e^(−0.05y) dy
Recall: ∫ e^(−ay) dy = −(1/a) e^(−ay) + C, so
P(Y < 10) = 1 − e^(−0.05·10) = 1 − e^(−0.5) ≈ 0.39.

Slide 17: In General…
For an exponential random variable with rate λ,
P(Y < y) = F(y) = 1 − e^(−λy) and P(Y > y) = e^(−λy), for y > 0.
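A short numerical check of these formulas (a sketch, not part of the original slides), using the accident rate λ = 0.05 from Slide 14; note that scipy's exponential distribution is parameterized by scale = 1/λ.

```python
import math
from scipy import stats

lam = 0.05                     # accidents per day (Slide 14)
y = 10                         # days

# Closed form from the slide: P(Y < y) = 1 - exp(-lam*y)
p_closed = 1 - math.exp(-lam * y)

# Same probability via scipy (scale is the mean 1/lam)
p_scipy = stats.expon.cdf(y, scale=1/lam)

print(p_closed, p_scipy)   # both ~0.3935
```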

Slide 18: Exponential Distribution

Slide 19: Example
If the time to failure for an electrical component follows an exponential distribution with a mean time to failure of 1000 hours, what is the probability that a randomly chosen component will fail before 750 hours?
Hint: λ is the failure rate (the expected number of failures per hour).
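A sketch of the computation (not from the original slides): with a mean of 1000 hours, λ = 1/1000 failures per hour, so P(Y < 750) = 1 − e^(−0.75) ≈ 0.53.

```python
import math

mean_time = 1000.0          # mean time to failure, hours
lam = 1 / mean_time         # failure rate per hour

p_fail_before_750 = 1 - math.exp(-lam * 750)
print(p_fail_before_750)    # ~0.528
```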

Slide 20: Mean and Variance for an Exponential Random Variable
E[Y] = 1/λ and Var[Y] = 1/λ²
Note: Mean = Standard Deviation.

Slide 21: Example
The time between accidents at a factory follows an exponential distribution with a historical average of 1 accident every 900 days. What is the probability that there will be more than 1200 days between the next two accidents?

Slide 22: Example
If the time between accidents follows an exponential distribution with a mean of 900 days, what is the probability that there will be less than 900 days between the next two accidents?
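A sketch of both answers (not from the original slides), using λ = 1/900 per day: P(Y > 1200) = e^(−1200/900) ≈ 0.26 and P(Y < 900) = 1 − e^(−1) ≈ 0.63.

```python
import math

mean_days = 900.0
lam = 1 / mean_days                      # accidents per day

p_more_1200 = math.exp(-lam * 1200)      # Slide 21: P(Y > 1200)
p_less_900 = 1 - math.exp(-lam * 900)    # Slide 22: P(Y < 900)

print(p_more_1200)  # ~0.264
print(p_less_900)   # ~0.632
```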

Slide 23: Relationship between the Exponential & Poisson Distributions
Recall that the Poisson distribution is used to compute the probability of a specific number of events occurring in a particular interval of time or space.
Instead of the number of events being the random variable, consider the time or space between events as the random variable.

Slide 24: Relationship between Exponential & Poisson
The exponential distribution models the time (or space) between Poisson events.
[Figure: a timeline with Poisson events marked and the gaps between them labeled as exponential waiting times.]

Slide 25: Exponential or Poisson Distribution?
– We model the number of industrial accidents occurring in one year.
– We model the length of time between two industrial accidents (assuming an accident occurring is a Poisson event).
– We model the time between radioactive particles passing by a counter (assuming a particle passing by is a Poisson event).
– We model the number of radioactive particles passing by a counter in one hour.

Slide 26: Recall: For a Poisson Distribution
p(y) = (λt)^y e^(−λt) / y!, for y = 0, 1, 2, …,
where λ is the mean number of events per base unit of time or space and t is the number of base units inspected.
The probability that no events occur in a span of time (or space) t is:
p(0) = (λt)^0 e^(−λt) / 0! = e^(−λt).

Slide 27:
Now let T = the time (or space) until the next Poisson event. The probability that the length of time (or space) until the next event is greater than some given time (or space) t is the same as the probability that no events will occur in time (or space) t. In other words,
P(T > t) = p(0) = e^(−λt).

Slide 28: Radioactive Particles
The arrivals of radioactive particles at a counter are Poisson events, so the number of particles in an interval of time follows a Poisson distribution. Suppose we average 2 particles per millisecond.
– What is the probability that no particles will pass the counter in the next 3 milliseconds?
– What is the probability that more than 3 milliseconds will elapse before the next particle passes?
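A sketch of both questions (not from the original slides), which also illustrates the Poisson–exponential link: both probabilities equal e^(−λt) = e^(−6).

```python
from scipy import stats

lam = 2.0   # particles per millisecond
t = 3.0     # milliseconds

# Poisson view: P(no particles in 3 ms), with mean count lam * t
p_no_particles = stats.poisson.pmf(0, lam * t)

# Exponential view: P(waiting time until next particle > 3 ms)
p_wait_longer = stats.expon.sf(t, scale=1/lam)

print(p_no_particles, p_wait_longer)   # both ~0.00248 (= e^(-6))
```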

Slide 29: Machine Failures
If the number of machine failures in a given interval of time follows a Poisson distribution with an average of 1 failure per 1000 hours:
– What is the probability that there will be no failures during the next 2000 hours?
– What is the probability that the time until the next failure is more than 2000 hours?

Slide 30:
The number of failures in an interval of time follows a Poisson distribution. If the mean time to failure is 1000 hours, what is the probability that more than 2500 hours will pass before the next failure occurs?
A. e^(−4)
B. 1 − e^(−4)
C. e^(−2.5)
D. 1 − e^(−2.5)

Slide 31: Challenging Questions
If ten of these components are used in different devices that run independently, what is the probability that at least one will still be operating at 2500 hours? What about the probability that exactly 3 of them will still be operating after 2500 hours?
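A sketch of one way to attack these questions (not from the original slides): each component survives past 2500 hours with probability p = e^(−2.5), and the number of survivors among ten independent components is Binomial(10, p).

```python
import math
from scipy import stats

p_survive = math.exp(-2.5)   # P(one component still operating at 2500 h)
n = 10

# At least one of the ten is still operating
p_at_least_one = 1 - (1 - p_survive) ** n

# Exactly three of the ten are still operating
p_exactly_three = stats.binom.pmf(3, n, p_survive)

print(p_survive)        # ~0.082
print(p_at_least_one)   # ~0.58
print(p_exactly_three)  # ~0.036
```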

Slide 32: Normal Distribution
f(y) = (1 / (σ√(2π))) e^(−(y − μ)² / (2σ²)), for −∞ < y < ∞
E[Y] = μ and Var[Y] = σ²
[Figure: the bell-shaped normal density curve.]

Slide 33: Normal Distribution
Characteristics:
– Bell-shaped curve
– −∞ < y < +∞
– μ determines the distribution's location and is the highest point on the curve
– The curve is symmetric about μ
– σ determines the distribution's spread
– The curve has its points of inflection at μ ± σ

Slide 34: Normal Distribution
[Figure: a normal curve annotated with the mean μ and the standard deviation σ.]

Slide 35: Normal Distribution
[Figure: two normal curves with the same spread but different locations, N(μ = 0, σ = 1) and N(μ = 5, σ = 1).]

Slide 36: Normal Distribution
[Figure: two normal curves with the same location but different spreads, N(μ = 0, σ = 1) and N(μ = 0, σ = 0.5).]

Slide 37: Normal Distribution
[Figure: two normal curves differing in both location and spread, N(μ = 0, σ = 1) and N(μ = 5, σ = 0.5).]

Slide 38: The 68-95-99.7 Rule
μ ± 1σ covers approximately 68% of the distribution.
μ ± 2σ covers approximately 95%.
μ ± 3σ covers approximately 99.7%.
[Figure: a normal curve marked at μ ± 1σ, μ ± 2σ, and μ ± 3σ.]
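The rule can be verified numerically (a sketch, not from the original slides) by evaluating the standard normal cdf.

```python
from scipy import stats

for k in (1, 2, 3):
    # P(mu - k*sigma < Y < mu + k*sigma) for any normal distribution
    coverage = stats.norm.cdf(k) - stats.norm.cdf(-k)
    print(k, round(coverage, 4))   # ~0.6827, 0.9545, 0.9973
```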

Slide 39: Earthquakes in a California Town
Since 1900, the magnitude of earthquakes that measure 0.1 or higher on the Richter scale in a certain location in California is distributed approximately normally, with μ = 6.2 and σ = 0.5, according to data obtained from the United States Geological Survey.

Slide 40: Earthquake Richter Scale Readings
[Figure: the N(6.2, 0.5) curve with the empirical-rule bands labeled (34%, 13.5%, and 2.5% on either side of the mean).]

Slide 41:
Approximately what percent of the earthquakes are above 5.7 on the Richter scale?
[Figure: the same N(6.2, 0.5) curve with the empirical-rule bands labeled.]

Slide 42:
The highest an earthquake can read and still be in the lowest 2.5% is ______.
[Figure: the same N(6.2, 0.5) curve with the empirical-rule bands labeled.]

Slide 43:
The approximate probability an earthquake is above 6.7 is ______.
[Figure: the same N(6.2, 0.5) curve with the empirical-rule bands labeled.]
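A sketch of these three quick checks (not from the original slides), using μ = 6.2 and σ = 0.5; the exact normal values shown here line up with the empirical-rule answers (about 84%, 5.2, and 16%).

```python
from scipy import stats

mu, sigma = 6.2, 0.5

# Slide 41: P(magnitude > 5.7); 5.7 is one sigma below the mean -> ~0.84
print(stats.norm.sf(5.7, loc=mu, scale=sigma))

# Slide 42: cutoff for the lowest 2.5%; roughly mu - 2*sigma = 5.2
print(stats.norm.ppf(0.025, loc=mu, scale=sigma))   # ~5.22 exactly

# Slide 43: P(magnitude > 6.7); 6.7 is one sigma above the mean -> ~0.16
print(stats.norm.sf(6.7, loc=mu, scale=sigma))
```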

Slide 44: Standard Normal Distribution
The standard normal distribution is the normal distribution with a mean of 0 and a standard deviation of 1: N(µ = 0, σ = 1).

Slide 45: Z Is Traditionally Used as the Symbol for a Standard Normal Random Variable
[Figure: a normal curve for Y with the corresponding standard normal Z scale beneath it.]

Slide 46: Normal → Standard Normal
We can compare observations from two different normal distributions by converting the observations to standard normal and comparing the standardized observations. Any normally distributed random variable can be converted to standard normal using the following formula:
Z = (Y − μ) / σ

Slide 47: Example
What is the standard normal value (or Z value) for a Richter reading of 6.5? Recall Y ~ N(µ = 6.2, σ = 0.5).

Slide 48: Example
Consider two towns in California. The distributions of the Richter readings over 0.1 in the two towns are:
Town 1: X ~ N(µ = 6.2, σ = 0.5)
Town 2: Y ~ N(µ = 6.2, σ = 1)
– What is the probability that Town 1 has an earthquake over 7 (on the Richter scale)?
– What is the probability that Town 2 has an earthquake over 7?

Slide 49:
Town 1: Z = (7 − 6.2)/0.5 = 1.6
Town 2: Z = (7 − 6.2)/1 = 0.8
[Figure: the densities of X and Y with the corresponding standardized Z values marked.]
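A sketch of the final step (not from the original slides): convert each reading of 7 to a Z value and look up the upper-tail probability.

```python
from scipy import stats

mu = 6.2
for town, sigma in (("Town 1", 0.5), ("Town 2", 1.0)):
    z = (7 - mu) / sigma                 # standardize the reading of 7
    p_over_7 = stats.norm.sf(z)          # upper-tail probability P(Z > z)
    print(town, round(z, 2), round(p_over_7, 4))
# Town 1: z = 1.6, P ~ 0.0548; Town 2: z = 0.8, P ~ 0.2119
```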

Slide 50: Standard Normal

Slide 51: Example
The thickness of a certain steel bolt that continuously feeds a manufacturing process is normally distributed with a mean of 10.0 mm and a standard deviation of 0.3 mm. Manufacturing becomes concerned about the process if the bolts get thicker than 10.5 mm or thinner than 9.5 mm.
Find the probability that the thickness of a randomly selected bolt is > 10.5 mm or < 9.5 mm.
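A sketch of the two-tailed calculation (not from the original slides): standardize both limits and add the two tail areas.

```python
from scipy import stats

mu, sigma = 10.0, 0.3

p_too_thick = stats.norm.sf(10.5, loc=mu, scale=sigma)   # P(Y > 10.5)
p_too_thin = stats.norm.cdf(9.5, loc=mu, scale=sigma)    # P(Y < 9.5)

print(p_too_thick + p_too_thin)   # ~0.096
```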

Slide 52: Inverse Normal Probabilities
Sometimes we want to answer a question which is the reverse situation: we know the probability and want to find the corresponding value of Y.
[Figure: a normal curve with a lower-tail area of 0.025 and the unknown cutoff y = ?]

Slide 53: Inverse Normal Probabilities
Approximately 2.5% of the bolts produced will have thicknesses less than ______.
[Figure: the bolt-thickness distribution with a lower-tail area of 0.025 and the corresponding Y and Z cutoffs marked.]

Slide 54: Inverse Normal Probabilities
Approximately 2.5% of the bolts produced will have thicknesses less than ______.
z = −1.96, so y = μ + zσ = 10.0 + (−1.96)(0.3) = 9.412 mm.

Slide 55: Inverse Normal Probabilities
Approximately 1% of the bolts produced will have thicknesses less than ______.
[Figure: the bolt-thickness distribution with a lower-tail area of 0.01 and the corresponding Y and Z cutoffs marked.]

Slide 56: Inverse Normal Probabilities
Approximately 1% of the bolts produced will have thicknesses less than ______.
z = −2.33, so y = μ + zσ = 10.0 + (−2.33)(0.3) ≈ 9.30 mm.
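A sketch of both inverse lookups (not from the original slides), using norm.ppf to find the cutoffs directly on the thickness scale.

```python
from scipy import stats

mu, sigma = 10.0, 0.3

# Thickness below which ~2.5% of bolts fall (Slides 53-54)
print(stats.norm.ppf(0.025, loc=mu, scale=sigma))   # ~9.412 mm

# Thickness below which ~1% of bolts fall (Slides 55-56)
print(stats.norm.ppf(0.01, loc=mu, scale=sigma))    # ~9.302 mm
```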