Continuous Random Variables Chapter 5 Nutan S. Mishra Department of Mathematics and Statistics University of South Alabama.

Continuous Random Variable A random variable X is continuous when it takes values on an interval. For example, the GPA of a student satisfies X ∈ [0, 4], and the high day temperature in Mobile satisfies X ∈ (−20, ∞). Recall that for a discrete variable a simple event was written (X = k), and we computed P(X = k), the probability mass function. For a continuous variable we must change the definition of an event.

Continuous Random Variable Let X ∈ [0, 4]. There are infinitely many values that X may take, so no single value can carry positive probability: P(X = k) = 0 for every k for a continuous variable. Instead we define an event as (x − Δx ≤ X ≤ x + Δx), where Δx is a very small increment in x, and assign it the probability P(x − Δx ≤ X ≤ x + Δx) = f(x) dx. The function f(x) is called the probability density function (pdf).

Properties of pdf A probability density function satisfies: (1) f(x) ≥ 0 for all x; (2) the total area under the curve is 1, that is, ∫ f(x) dx over (−∞, ∞) equals 1; (3) P(a ≤ X ≤ b) = ∫ from a to b of f(x) dx.

(Cumulative) Distribution Function The cumulative distribution function of a continuous random variable X is F(x) = P(X ≤ x) = ∫ from −∞ to x of f(t) dt, where f is the probability density function of X.

Relation between f(x) and F(x) Differentiating the distribution function recovers the density: f(x) = dF(x)/dx wherever the derivative exists.

Mean and Variance µ = E(X) = ∫ x f(x) dx, and σ² = Var(X) = E[(X − µ)²] = ∫ (x − µ)² f(x) dx = E(X²) − µ², with the integrals taken over (−∞, ∞).

Exercise 5.2 Given f(x) = kx³ for 0 < x < 1, find the value of k. Since ∫ from 0 to 1 of kx³ dx = k/4 must equal 1, we get k = 4. Thus f(x) = 4x³ for 0 < x < 1, and F(x) = x⁴. Then P(1/4 < X < 3/4) = (3/4)⁴ − (1/4)⁴ = 80/256 = 5/16, and P(X > 2/3) = 1 − (2/3)⁴ = 65/81.
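These values can be checked exactly with a short Python sketch using the CDF F(x) = x⁴ derived above:

```python
from fractions import Fraction

# Since f(x) = 4*x**3 on (0, 1), the CDF is F(x) = x**4.
def F(x):
    return Fraction(x) ** 4

# P(1/4 < X < 3/4) = F(3/4) - F(1/4)
p_middle = F(Fraction(3, 4)) - F(Fraction(1, 4))

# P(X > 2/3) = 1 - F(2/3)
p_tail = 1 - F(Fraction(2, 3))

print(p_middle, p_tail)   # 5/16 65/81
```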

Exercise 5.7
Exercise 5.13

Probability and density curves The probability P(a < Y < b) is the area under the density curve between a and b. For example, P(100 < Y < 150) = 0.42.

Normal Distribution X is a normal random variable with parameters µ and σ if its probability density function is f(x) = (1/(σ√(2π))) e^(−(x−µ)²/(2σ²)), for −∞ < x < ∞. µ and σ are called the parameters of the normal distribution.

Standard Normal Distribution The distribution of a normal random variable with mean 0 and variance 1 is called a standard normal distribution.

Standard Normal Distribution The letter Z is traditionally used to represent a standard normal random variable, and z represents a particular value of Z. The standard normal distribution has been tabulated.

Standard Normal Distribution Given a standard normal distribution, find the area under the curve
(a) to the left of z =
(b) to the left of z = 2.01
(c) to the right of z = −0.99
(d) to the right of z = 1.50
(e) between z = and z = 0.58

Standard Normal Distribution Given a standard normal distribution, find the value of k such that
(a) P(Z < k) = 0.1271
(b) P(Z < k) = 0.9495
(c) P(Z > k) = 0.8186
(d) P(Z > k) = 0.0073
(e) P(0.90 < Z < k) = 0.1806
(f) P(k < Z < 1.02) = 0.1464
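Since Python 3.8, the standard library can stand in for the printed table: `statistics.NormalDist` provides both the standard normal CDF and its inverse. A small sketch covering a few of the lookups above:

```python
from statistics import NormalDist

Z = NormalDist()                 # standard normal: mean 0, sigma 1

# Forward lookup: area to the left of z = 2.01
area_left = Z.cdf(2.01)          # about 0.9778

# Area to the right of z = 1.50 is the complement of the left area
area_right = 1 - Z.cdf(1.50)     # about 0.0668

# Inverse lookup: k with P(Z < k) = 0.9495
k = Z.inv_cdf(0.9495)            # about 1.64
```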

Normal Distribution Any normal random variable X can be converted to a standard normal random variable: z = (x − µ_X)/σ_X.

Normal Distribution Given a random variable X having a normal distribution with µ_X = 10 and σ_X = 2, find the probability that X is less than a given value x by standardizing: z = (x − 10)/2, then reading the probability from the standard normal table.
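The slide's specific cutoff value did not survive extraction, so x = 13 below is an illustrative choice; the sketch shows the standardization step and the equivalent one-step computation:

```python
from statistics import NormalDist

mu, sigma = 10, 2
x = 13                                   # illustrative cutoff, not from the slide

# Standardize, then use the standard normal CDF:
z = (x - mu) / sigma                     # (13 - 10) / 2 = 1.5
p = NormalDist().cdf(z)                  # P(X < 13) = P(Z < 1.5), about 0.9332

# Equivalent computation without standardizing by hand:
p_direct = NormalDist(mu, sigma).cdf(x)
```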

Relationship between the Normal and Binomial Distributions The normal distribution is often a good approximation to a discrete distribution when the discrete distribution takes on a symmetric bell shape. Some distributions converge to the normal as their parameters approach certain limits. Theorem 6.2: If X is a binomial random variable with mean µ = np and variance σ² = npq, then the limiting form of the distribution of Z = (X − np)/√(npq), as n → ∞, is the standard normal distribution n(z; 0, 1).
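To see Theorem 6.2 in action, compare an exact binomial probability with its normal approximation (the parameters n = 100, p = 0.5 and the cutoff 55 are illustrative; the usual continuity correction is applied):

```python
from math import comb, sqrt
from statistics import NormalDist

n, p = 100, 0.5                          # illustrative binomial parameters
mu, sigma = n * p, sqrt(n * p * (1 - p)) # np = 50, sqrt(npq) = 5

# Exact binomial probability P(X <= 55)
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(56))

# Normal approximation with continuity correction:
# P(X <= 55) is approximately P(Z < (55.5 - mu)/sigma)
approx = NormalDist(mu, sigma).cdf(55.5)
```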

Exercise 5.19

Uniform distribution The uniform distribution with parameters α and β has the density function f(x) = 1/(β − α) for α ≤ x ≤ β, and f(x) = 0 elsewhere.
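The mean and variance of the uniform distribution follow directly from this density (they are standard facts, not stated on the slide); a quick exact check with illustrative endpoints:

```python
from fractions import Fraction

alpha, beta = Fraction(2), Fraction(8)    # illustrative endpoints

density = 1 / (beta - alpha)              # f(x) = 1/(beta - alpha) = 1/6 on [2, 8]
mean = (alpha + beta) / 2                 # E(X) = (alpha + beta)/2 = 5
variance = (beta - alpha) ** 2 / 12       # Var(X) = (beta - alpha)^2 / 12 = 3
```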

Exponential Distribution: Basic Facts Density: f(x) = λe^(−λx) for x ≥ 0. CDF: F(x) = 1 − e^(−λx). Mean: E(X) = 1/λ. Variance: Var(X) = 1/λ².
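A simulation sketch confirming the mean and variance formulas (the rate λ = 0.5 is an arbitrary choice):

```python
import random
from statistics import mean, variance

random.seed(42)
lam = 0.5                                  # arbitrary rate parameter

# Simulate a large exponential sample and compare with theory
sample = [random.expovariate(lam) for _ in range(200_000)]

sample_mean = mean(sample)                 # theory: 1/lam = 2.0
sample_var = variance(sample)              # theory: 1/lam**2 = 4.0
```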

Key Property: Memorylessness
- Reliability: the amount of time a component has been in service has no effect on the amount of time until it fails.
- Inter-event times: the amount of time since the last event contains no information about the amount of time until the next event.
- Service times: the amount of remaining service time is independent of the amount of service time elapsed so far.
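Formally, memorylessness says P(X > s + t | X > s) = P(X > t). Since P(X > t) = e^(−λt) for the exponential, this follows by one line of algebra and can be checked directly (λ, s, and t below are arbitrary):

```python
from math import exp

lam = 0.25                          # arbitrary rate
def survival(t):
    return exp(-lam * t)            # P(X > t) for the exponential

s, t = 3.0, 5.0
# P(X > s + t | X > s) = P(X > s + t) / P(X > s)
conditional = survival(s + t) / survival(s)
unconditional = survival(t)         # memorylessness: these two agree
```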

Exponential Distribution The exponential distribution is a very commonly used distribution in reliability engineering. Due to its simplicity, it has been widely employed even in cases to which it does not apply. The exponential distribution is used to describe units that have a constant failure rate. The single-parameter exponential pdf is f(T) = λe^(−λT), where:
- λ = constant failure rate, in failures per unit of measurement (e.g., failures per hour, per cycle, etc.)
- λ = 1/m
- m = mean time between failures, or to a failure
- T = operating time, life, or age, in hours, cycles, miles, actuations, etc.
This distribution requires the estimation of only one parameter, λ, for its application.

Joint probabilities The joint probability density function (joint pdf) of a k-dimensional discrete random variable X = (X1, X2, …, Xk) is defined to be f(x1, x2, …, xk) = P(X1 = x1, X2 = x2, …, Xk = xk) for all possible values x = (x1, x2, …, xk) of X. Let (X, Y) have the joint probability function specified in the following table.

Joint distribution

Joint probability distribution A joint probability distribution function satisfies f(x, y) ≥ 0 for all (x, y), and the probabilities sum to 1 over all pairs. The marginal pdfs of X and Y are obtained by summing over the other variable: f_X(x) = Σ_y f(x, y) and f_Y(y) = Σ_x f(x, y). Here is an example with x = 1, 2, 3 and y = 1, 2.

Marginal pdf of X and Y Consider the following example with x = 1, 2, 3 and y = 1, 2.
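The slide's table did not survive extraction, so the joint pmf below is a made-up example on the same support (x = 1, 2, 3 and y = 1, 2); it shows how marginals are obtained by summing over the other variable:

```python
# Hypothetical joint pmf over x = 1, 2, 3 and y = 1, 2
joint = {
    (1, 1): 0.10, (1, 2): 0.15,
    (2, 1): 0.20, (2, 2): 0.25,
    (3, 1): 0.10, (3, 2): 0.20,
}

total = sum(joint.values())        # a valid joint pmf sums to 1

# Marginal pmf of X: sum over y; marginal pmf of Y: sum over x
marginal_x = {x: sum(p for (xi, y), p in joint.items() if xi == x) for x in (1, 2, 3)}
marginal_y = {y: sum(p for (x, yi), p in joint.items() if yi == y) for y in (1, 2)}
```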

Independent Random Variables X and Y are independent if f(x, y) = f_X(x) f_Y(y) for all values x and y.

Properties of expectations For a discrete pdf f(x), the expected value of a function u(X) is E[u(X)] = Σ_x u(x) f(x). Mean: µ = E[X] = Σ_x x f(x). Variance: Var(X) = σ² = E[(X − µ)²] = E[X²] − µ². For a continuous pdf f(x): E(X) = mean of X = ∫ x f(x) dx, and Var(X) = E[(X − µ)²] = E(X²) − [E(X)]² = ∫ (x − µ)² f(x) dx.

Properties of expectations E(aX + b) = aE(X) + b, and Var(aX + b) = a² Var(X).
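These two rules can be verified on any small discrete distribution; a sketch with made-up probabilities:

```python
# A small, made-up pmf for X
pmf = {0: 0.2, 1: 0.5, 2: 0.3}
a, b = 3, 7

def expect(g):
    """Expected value of g(X) under the pmf."""
    return sum(g(x) * p for x, p in pmf.items())

mean_x = expect(lambda x: x)
var_x = expect(lambda x: (x - mean_x) ** 2)

mean_y = expect(lambda x: a * x + b)                 # E(aX + b)
var_y = expect(lambda x: (a * x + b - mean_y) ** 2)  # Var(aX + b)
```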

Mean and variance of Z The quantity Z = (X − µ)/σ is called the standardized variable; E(Z) = 0 and Var(Z) = 1.

Linear combination of two independent variables Let X1 and X2 be two independent random variables. Their linear combination Y = aX1 + bX2 is itself a random variable, with E(Y) = aE(X1) + bE(X2) and Var(Y) = a² Var(X1) + b² Var(X2).

Mean and variance of the sample mean Let x1, x2, …, xn be independent, identically distributed random variables (i.e., a sample coming from a population) with common mean µ and common variance σ². The sample mean x̄ = (1/n) Σ xi is a linear combination of these i.i.d. variables and hence is itself a random variable, with E(x̄) = µ and Var(x̄) = σ²/n.
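A simulation sketch of this result: drawing many samples of size n from a population with mean µ and variance σ² should give sample means centered at µ with variance close to σ²/n (all numbers below are arbitrary choices):

```python
import random
from statistics import mean, variance

random.seed(1)
mu, sigma, n = 50.0, 10.0, 25   # arbitrary population parameters and sample size

# Draw many samples of size n and record each sample mean
xbars = [mean(random.gauss(mu, sigma) for _ in range(n)) for _ in range(20_000)]

mean_of_xbar = mean(xbars)      # theory: mu = 50
var_of_xbar = variance(xbars)   # theory: sigma**2 / n = 4.0
```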