Lecture 3 B Maysaa ELmahi.


3.3. Distribution Functions of Continuous Random Variables

Recall that a random variable X is said to be continuous if its space is either an interval or a union of intervals.

Definition 3.7. Let X be a continuous random variable whose space is the set of real numbers ℝ. A nonnegative real-valued function f : ℝ → ℝ is said to be the probability density function for the continuous random variable X if it satisfies:
(a) ∫_{-∞}^{∞} f(x) dx = 1, and
(b) if A is an event, then P(A) = ∫_A f(x) dx.

Example 3.10. Is the real-valued function f : ℝ → ℝ defined by

f(x) = 2x⁻²  if 1 < x < 2,  and 0 otherwise,

a probability density function for some random variable X?

Answer:
∫_{-∞}^{∞} f(x) dx = ∫_1^2 2x⁻² dx = −2 [1/x]_1^2

= −2 (1/2 − 1) = 1.

Thus f is a probability density function.

Example 3.11. Is the real-valued function f : ℝ → ℝ defined by

f(x) = 1 + |x|  if −1 < x < 1,  and 0 otherwise,

a probability density function for some random variable X?

Answer:
∫_{-∞}^{∞} f(x) dx = ∫_{-1}^{1} (1 + |x|) dx
= ∫_{-1}^{0} (1 − x) dx + ∫_{0}^{1} (1 + x) dx
= [x − x²/2]_{-1}^{0} + [x + x²/2]_{0}^{1}
= 1 + 1/2 + 1 + 1/2 = 3.

Since the integral equals 3 rather than 1, f is not a probability density function.
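Both checks can also be reproduced symbolically. The following is a minimal sketch (not part of the original slides, assuming SymPy is available) that simply integrates each candidate density over its support and compares the result with 1.

import sympy as sp

x = sp.symbols('x')

# Example 3.10: f(x) = 2*x**(-2) on (1, 2)
print(sp.integrate(2*x**-2, (x, 1, 2)))         # 1, so f integrates to one and is a pdf

# Example 3.11: f(x) = 1 + |x| on (-1, 1)
print(sp.integrate(1 + sp.Abs(x), (x, -1, 1)))  # 3, so f is not a pdf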

Definition 3.8. Let f(x) be the probability density function of a continuous random variable X. The cumulative distribution function F(x) of X is defined as

F(x) = P(X ≤ x) = ∫_{-∞}^{x} f(t) dt.

Theorem 3.5. If F(x) is the cumulative distribution function of a continuous random variable X, then the probability density function f(x) of X is the derivative of F(x), that is,

(d/dx) F(x) = f(x).

Theorem 3.6. Let X be a continuous random variable whose cdf is F(x). Then the following are true:
(a) P(X < x) = F(x)
(b) P(X > x) = 1 − F(x)
(c) P(X = x) = 0
(d) P(a < X < b) = F(b) − F(a)
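To make Definition 3.8 and Theorems 3.5–3.6 concrete, here is a brief sketch using the density of Example 3.10 (assuming SymPy; the interval endpoints 5/4 and 3/2 are chosen only for illustration and are not from the original slides).

import sympy as sp

x, t = sp.symbols('x t', positive=True)
f = 2*t**-2                                  # density of Example 3.10 on (1, 2)

F = sp.integrate(f, (t, 1, x))               # F(x) = ∫_1^x f(t) dt for 1 < x < 2
print(sp.simplify(F))                        # 2 - 2/x
print(sp.simplify(sp.diff(F, x)))            # 2/x**2, recovering f(x) (Theorem 3.5)

a, b = sp.Rational(5, 4), sp.Rational(3, 2)
print(F.subs(x, b) - F.subs(x, a))           # P(5/4 < X < 3/2) = F(3/2) - F(5/4) = 4/15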

Example (H.W.):

f(x) = (k + 1)x²  if 0 < x < 1,  and 0 otherwise.

a. What is the value of the constant k?
b. What is the probability that X lies between the first and third quartiles?
c. What is the cumulative distribution function?
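For part (a), one possible approach (a sketch only, assuming SymPy) is to impose condition (a) of Definition 3.7 and solve the resulting equation for k.

import sympy as sp

x, k = sp.symbols('x k')
total = sp.integrate((k + 1)*x**2, (x, 0, 1))   # (k + 1)/3
print(sp.solve(sp.Eq(total, 1), k))             # [2], i.e. k = 2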

4.2. Expected Value of Random Variables

Definition 4.2. Let X be a random variable with space R_X and probability density function f(x). The mean μ_X of the random variable X is defined as

μ_X = E(X) = Σ_{x ∈ R_X} x f(x)        if X is discrete
μ_X = E(X) = ∫_{-∞}^{∞} x f(x) dx      if X is continuous

Example: Let X have the probability mass function

x       0     1     2     3
P(x)    1/8   3/8   3/8   1/8

What is the mean of X?

Answer:
E(X) = 0 · 1/8 + 1 · 3/8 + 2 · 3/8 + 3 · 1/8
     = 0 + 3/8 + 6/8 + 3/8
     = 12/8 = 3/2

Example: Let

f(x) = 1/5  if 2 < x < 7,  and 0 otherwise.

What is the mean of X?

Answer:
E(X) = ∫_2^7 x · (1/5) dx = (1/10) [x²]_2^7 = (1/10)(49 − 4) = 45/10 = 9/2
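Both means can be checked mechanically; the sketch below (assuming SymPy, and not part of the original slides) evaluates the defining sum and integral of Definition 4.2 for the two examples above.

import sympy as sp

# Discrete example: x = 0, 1, 2, 3 with P(x) = 1/8, 3/8, 3/8, 1/8
values = [0, 1, 2, 3]
probs = [sp.Rational(1, 8), sp.Rational(3, 8), sp.Rational(3, 8), sp.Rational(1, 8)]
print(sum(v*p for v, p in zip(values, probs)))       # 3/2

# Continuous example: f(x) = 1/5 on (2, 7)
x = sp.symbols('x')
print(sp.integrate(x*sp.Rational(1, 5), (x, 2, 7)))  # 9/2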

Theorem 4.1. Let X be a random variable with pdf f(x). If a and b are any two real numbers, then
a. E(aX + b) = a E(X) + b
b. E(aX) = a E(X)
c. E(a) = a

4.3. Variance of Random Variables

Definition 4.4. Let X be a random variable with mean μ_X. The variance of X, denoted by Var(X), is defined as

Var(X) = E[(X − μ_X)²],

or equivalently

σ²_X = E(X²) − μ_X².

Example: Let

f(x) = 2(x − 1)  if 1 < x < 2,  and 0 otherwise.

a. What is the variance of X?

Answer:

μ_X = E(X) = ∫_{-∞}^{∞} x f(x) dx = ∫_1^2 x · 2(x − 1) dx
= 2 ∫_1^2 (x² − x) dx
= 2 [x³/3 − x²/2]_1^2
= 2 [(8/3 − 4/2) − (1/3 − 1/2)]
= 2 (4/6 − (−1/6))
= 2 · 5/6 = 10/6 = 5/3

E(X²) = ∫_{-∞}^{∞} x² f(x) dx = ∫_1^2 x² · 2(x − 1) dx
= 2 ∫_1^2 (x³ − x²) dx
= 2 [x⁴/4 − x³/3]_1^2
= 2 [(16/4 − 8/3) − (1/4 − 1/3)]
= 2 (16/12 − (−1/12))
= 2 · 17/12 = 17/6

Thus, the variance of X is given by

σ²_X = E(X²) − μ_X² = 17/6 − (5/3)² = 17/6 − 25/9 = 1/18.

Remark:
Var(aX + b) = a² Var(X)
Var(aX) = a² Var(X)
Var(a) = 0
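The moments and the variance of this example can be verified with a short symbolic computation (a sketch assuming SymPy, not part of the original slides).

import sympy as sp

x = sp.symbols('x')
f = 2*(x - 1)                            # density on (1, 2)

EX = sp.integrate(x*f, (x, 1, 2))        # 5/3
EX2 = sp.integrate(x**2*f, (x, 1, 2))    # 17/6
print(EX, EX2, EX2 - EX**2)              # 5/3  17/6  1/18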

4.1. Moments of Random Variables

Definition 4.1. The nth moment about the origin of a random variable X, denoted by E(Xⁿ), is defined to be

E(Xⁿ) = Σ_{x ∈ R_X} xⁿ f(x)        if X is discrete
E(Xⁿ) = ∫_{-∞}^{∞} xⁿ f(x) dx      if X is continuous

for n = 0, 1, 2, 3, ..., provided the right-hand side converges absolutely.

If n = 1, then E(X) is called the first moment of X about the origin. If n = 2, then E(X²) is called the second moment of X about the origin.

4.5. Moment Generating Functions

Definition 4.5. Let X be a random variable whose probability density function is f(x). A real-valued function M : ℝ → ℝ defined by

M(t) = E(e^{tX})

is called the moment generating function of X if this expected value exists for all t in the interval −h < t < h for some h > 0.

Using the definition of the expected value of a random variable, we obtain the explicit representation of M(t) as

M(t) = Σ_{x ∈ R_X} e^{tx} f(x)        if X is discrete
M(t) = ∫_{-∞}^{∞} e^{tx} f(x) dx      if X is continuous
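As a small illustration (a sketch assuming SymPy, not part of the original slides), the moment generating function of the discrete example from Section 4.2 (x = 0, 1, 2, 3 with probabilities 1/8, 3/8, 3/8, 1/8) can be written down from the discrete formula above, and differentiating it at t = 0 returns the first and second moments, anticipating the standard use of M(t) to generate E(Xⁿ).

import sympy as sp

t = sp.symbols('t')
# M(t) = E(e^{tX}) for the pmf x = 0, 1, 2, 3 with P = 1/8, 3/8, 3/8, 1/8
M = (1 + 3*sp.exp(t) + 3*sp.exp(2*t) + sp.exp(3*t)) / 8

print(sp.diff(M, t, 1).subs(t, 0))   # 3/2 = E(X), the first moment
print(sp.diff(M, t, 2).subs(t, 0))   # 3   = E(X^2), the second moment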