Random Variables


Random Variables

A random variable is simply a real-valued function defined on the sample space of an experiment.

Example. Three fair coins are flipped. The number of heads, Y, that appear is a random variable. Let us list the sample space, S:

Sample Point   No. of Heads, Y   Probability
(H,H,H)        3                 1/8
(H,H,T)        2                 1/8
(H,T,H)        2                 1/8
(T,H,H)        2                 1/8
(H,T,T)        1                 1/8
(T,H,T)        1                 1/8
(T,T,H)        1                 1/8
(T,T,T)        0                 1/8

Example, continued.

P{Y = 0} = P({s | Y(s) = 0}) = P{(T,T,T)} = 1/8
P{Y = 1} = P({s | Y(s) = 1}) = P{(H,T,T),(T,H,T),(T,T,H)} = 3/8
P{Y = 2} = P({s | Y(s) = 2}) = P{(H,H,T),(H,T,H),(T,H,H)} = 3/8
P{Y = 3} = P({s | Y(s) = 3}) = P{(H,H,H)} = 1/8

Since Y must take on one of the values 0, 1, 2, 3, we must have

P{Y = 0} + P{Y = 1} + P{Y = 2} + P{Y = 3} = 1,

and indeed 1/8 + 3/8 + 3/8 + 1/8 = 1, which agrees with the probabilities listed above.
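The computation above can be checked by enumerating the sample space directly. A minimal Python sketch (the variable names are illustrative):

```python
from itertools import product
from fractions import Fraction

# Enumerate the 8 equally likely outcomes of three fair coin flips
# and tally the probability of each value of Y = number of heads.
outcomes = list(product("HT", repeat=3))
pmf = {}
for s in outcomes:
    y = s.count("H")
    pmf[y] = pmf.get(y, Fraction(0)) + Fraction(1, len(outcomes))

for y in sorted(pmf):
    print(y, pmf[y])          # 0 1/8, 1 3/8, 2 3/8, 3 1/8

print(sum(pmf.values()))      # 1 -- the probabilities sum to 1
```

Exact rational arithmetic (Fraction) avoids any floating-point doubt about the total being exactly 1.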

Cumulative distribution function of a random variable

For a random variable X, the function F defined by

F(t) = P{X ≤ t},  for all real t,

is called the cumulative distribution function, or simply, the distribution function. Clearly, F is a nondecreasing function of t. All probability questions about X can be answered in terms of the cumulative distribution function F. For example,

P{a < X ≤ b} = F(b) − F(a)  for a < b.

Proof of P{a < X ≤ b} = F(b) − F(a)

For sets A and B, where B ⊆ A, P(A − B) = P(A) − P(B). Let A = {s | X(s) ≤ b} and B = {s | X(s) ≤ a}, with a < b. Then B ⊆ A and A − B = {s | a < X(s) ≤ b}. Therefore P{a < X ≤ b} = P(A − B) = P(A) − P(B) = F(b) − F(a).

Properties of the cumulative distribution function

For a random variable X, the cumulative distribution function (c.d.f.) F was defined by F(t) = P{X ≤ t}.

1. F is nondecreasing.
2. lim F(t) = 1 as t → ∞.
3. lim F(t) = 0 as t → −∞.
4. F is right continuous.

The previous properties of F imply, for example, that P{X > a} = 1 − F(a) and that P{X = a} = F(a) − F(a−), where F(a−) denotes the left-hand limit of F at a.

Example of a distribution function

Suppose that a bus arrives at a station every day between 10am and 10:30am, at random. Let X be the arrival time, measured in minutes after 10am, so X is uniformly distributed on [0, 30]. Therefore, the distribution function is

F(t) = 0 for t < 0,
F(t) = t/30 for 0 ≤ t < 30,
F(t) = 1 for t ≥ 30.
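As a sketch (assuming, as above, that time is measured in minutes after 10am), the distribution function can be written as a small Python function:

```python
def bus_cdf(t):
    """CDF of the arrival time X, uniform on [0, 30] minutes after 10am."""
    if t < 0:
        return 0.0
    if t < 30:
        return t / 30
    return 1.0

# The bus is equally likely to arrive in any sub-interval of equal length:
print(bus_cdf(15))                 # 0.5 -- half the time it arrives by 10:15
print(bus_cdf(20) - bus_cdf(10))   # probability of arriving between 10:10 and 10:20
```

Note how P{10 < X ≤ 20} is obtained as F(20) − F(10), exactly as on the earlier slide.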

Discrete vs. Continuous Random Variables

If a set is in one-to-one correspondence with the positive integers, the set is said to be countable. If the number of values taken on by a random variable is either finite or countable, then the random variable is said to be discrete. The number of heads that appear in 3 flips of a coin is a discrete random variable. If the set of values of a random variable is neither finite nor countable, we say the random variable is continuous. The random variable defined as the time that a bus arrives at a station is an example of a continuous random variable. In Chapter 5, the random variables are discrete, while in Chapter 6, they are continuous.

Probability Mass Function

For a discrete random variable X, we define the probability mass function p(a) of X by

p(a) = P{X = a}.

If X is a discrete random variable taking the values x_1, x_2, …, then

p(x_i) ≥ 0 for all i, and Σ_i p(x_i) = 1.

Example. For our coin flipping example, the plot of p(x_i) vs. x_i has bars of height 1/8, 3/8, 3/8, 1/8 at x = 0, 1, 2, 3.

Example of a probability mass function on a countable set

Suppose X is a random variable taking values in the positive integers. We define p(i) = (1/2)^i for i = 1, 2, 3, …. Since Σ_{i≥1} (1/2)^i = 1, this defines a probability mass function. By summing the geometric series over the odd and even integers separately,

P(X is odd) = p(1) + p(3) + p(5) + ⋯ = (1/2)/(1 − 1/4) = 2/3, and
P(X is even) = p(2) + p(4) + p(6) + ⋯ = (1/4)/(1 − 1/4) = 1/3.
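The geometric-series facts above are easy to confirm numerically; a quick sketch using partial sums:

```python
# p(i) = (1/2)**i for i = 1, 2, 3, ...
# Long partial sums approximate the infinite series to machine precision.
N = 200
total = sum(0.5**i for i in range(1, N))
odd = sum(0.5**i for i in range(1, N, 2))    # P(X is odd)
even = sum(0.5**i for i in range(2, N, 2))   # P(X is even)

print(total, odd, even)
```

The three printed values agree with 1, 2/3, and 1/3 to within floating-point rounding.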

Cumulative distribution function of a discrete random variable

The distribution function of a discrete random variable can be expressed as

F(a) = Σ_{x ≤ a} p(x),

where p(x) is the probability mass function. If X is a discrete random variable whose possible values are x_1, x_2, x_3, …, where x_1 < x_2 < x_3 < ⋯, then its distribution function is a step function. That is, F is constant on the intervals [x_{i−1}, x_i) and then takes a step (or jump) of size p(x_i) at x_i. (See next slide for an example.)
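A step-function cdf can be built directly from a pmf; a minimal sketch (make_cdf is a hypothetical helper name):

```python
def make_cdf(pmf):
    """Return F, where F(a) = sum of p(x) over all x <= a (a step function)."""
    xs = sorted(pmf)
    def F(a):
        return sum(pmf[x] for x in xs if x <= a)
    return F

# pmf of Y, the number of heads in three fair coin flips
F = make_cdf({0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125})
print(F(-1), F(0), F(1.5), F(3))   # 0 0.125 0.5 1.0
```

Evaluating F between jump points (e.g. at 1.5) shows it is constant there, and each jump has the size p(x_i).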

Random variable Y, number of heads, when 3 coins are tossed

[Figure: the probability mass function of Y (bars of height 1/8, 3/8, 3/8, 1/8 at y = 0, 1, 2, 3) and the corresponding cumulative distribution function, a step function with jumps of those sizes.]

Random variable with both discrete and continuous features

Define random variable X as follows: (1) Flip a fair coin. (2) If the coin is H, define X to be a randomly selected value from the interval [0, 1/2]. (3) If the coin is T, define X to be 1. The cdf for X is derived next.

For t < 0, P(X ≤ t) = 0 follows easily.
For 0 ≤ t < 1/2, P(X ≤ t) = P(X ≤ t | coin is H)·P(coin is H) = (2t)·(1/2) = t.
For 1/2 ≤ t < 1, P(X ≤ t) = P(X ≤ 1/2) = 1/2.
For t ≥ 1, P(X ≤ t) = P(X ≤ 1/2) + P(X = 1) = 1/2 + 1/2 = 1.

CDF for random variable X from previous slide

Let the cdf for X be F. Then

F(t) = 0 for t < 0,
F(t) = t for 0 ≤ t < 1/2,
F(t) = 1/2 for 1/2 ≤ t < 1,
F(t) = 1 for t ≥ 1.
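This mixed distribution is easy to simulate, and the empirical frequencies match F; a sketch (sample_X and the sample size are illustrative choices):

```python
import random

def sample_X(rng):
    # Flip a fair coin; on H pick a uniform point in [0, 1/2], on T return 1.
    if rng.random() < 0.5:
        return rng.uniform(0.0, 0.5)
    return 1.0

def F(t):
    """CDF derived on the previous slide."""
    if t < 0:
        return 0.0
    if t < 0.5:
        return t
    if t < 1:
        return 0.5
    return 1.0

rng = random.Random(0)
n = 100_000
empirical = sum(sample_X(rng) <= 0.25 for _ in range(n)) / n
print(empirical)   # close to F(0.25) = 0.25
```

The atom at 1 shows up as the jump of size 1/2 in F; the uniform part contributes the linear ramp on [0, 1/2].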

Expected value of a discrete random variable

For a discrete random variable X having probability mass function p(x), the expectation or expected value of X, denoted by E(X), is defined by

E(X) = Σ_x x·p(x).

We see that the expected value of X is a weighted average of the possible values that X can take on, each value being weighted by the probability that X assumes it. The expectation of random variable X is also called the mean of X, and the notation µ = E(X) is used.

Example. A single fair die is thrown. What is the expectation of the number of dots showing on the top face of the die? Let X be the number of dots on the top face. Then

E(X) = 1·(1/6) + 2·(1/6) + 3·(1/6) + 4·(1/6) + 5·(1/6) + 6·(1/6) = 21/6 = 7/2.
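The weighted-average definition translates directly into code; a minimal sketch:

```python
from fractions import Fraction

def expectation(pmf):
    """E(X) = sum of x * p(x) over the support of X."""
    return sum(x * p for x, p in pmf.items())

# The fair-die example: each face 1..6 has probability 1/6.
die = {x: Fraction(1, 6) for x in range(1, 7)}
print(expectation(die))   # 7/2
```

Using Fraction keeps the answer exact, matching the 7/2 computed by hand.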

Intuitive idea of expectation of a discrete random variable

The expected value of a random variable is the average value that the random variable takes on. If for some game, E(X) = 0, then the game is called fair. For random variable X, if half the time X = 0 and the other half of the time X = 10, then the average value of X is E(X) = 5. For random variable Y, if one-third of the time Y = 6 and two-thirds of the time Y = 15, then the average value of Y is E(Y) = 12. Let Z be the amount you win in a lottery. If you win a million dollars with probability 10^−6 and it costs you $2 for a ticket, your expected winnings are E(Z) = (10^6)(10^−6) + (−2)(1 − 10^−6) ≈ −1 dollars.

Pascal's Wager: First Use of Expectation to Make a Decision

Suppose we are unsure of God's existence, so we assign a probability of 1/2 to existence and 1/2 to nonexistence. Let X be the benefit derived from leading a pious life. X is infinite (eternal happiness) if God exists; however, we lose a finite amount (d) of time and treasure devoted to serving God if He doesn't exist. Then

E(X) = (1/2)(+∞) + (1/2)(−d) = +∞.

Thus, the expected return on piety is positive infinity. Therefore, says Pascal, every reasonable person should follow the laws of God.

Expectation of a function of a discrete random variable.

Theorem. If X is a discrete random variable that takes on one of the values x_i, i ≥ 1, with respective probabilities p(x_i), then for any real-valued function g,

E[g(X)] = Σ_i g(x_i)·p(x_i).

Corollary. For real numbers a and b, E(aX + b) = aE(X) + b.

Example. Let X be a random variable which takes the values −1, 0, 1 with probabilities 0.2, 0.5, and 0.3, respectively. Let g(x) = x². We have that g(X) is a random variable which takes on the values 0 and 1 with equal probability. Hence, E[g(X)] = 0·(0.5) + 1·(0.5) = 0.5. Note that the theorem gives the same answer without finding the distribution of g(X): Σ_i g(x_i)·p(x_i) = 1·(0.2) + 0·(0.5) + 1·(0.3) = 0.5.

Law of the Unconscious Statistician (Theorem from previous slide)

Example. Let X be the outcome for a fair die and let Y = g(X) = 7X − X². Then

E(Y) = Σ_{x=1}^{6} (7x − x²)·(1/6) = (6 + 10 + 12 + 12 + 10 + 6)/6 = 56/6 = 28/3.
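The theorem lets us compute E(Y) without first finding the pmf of Y itself; a sketch under the same fair-die setup (lotus is a hypothetical helper name):

```python
from fractions import Fraction

die = {x: Fraction(1, 6) for x in range(1, 7)}

def lotus(g, pmf):
    """E[g(X)] = sum of g(x) * p(x): no need for the pmf of g(X) itself."""
    return sum(g(x) * p for x, p in pmf.items())

E_Y = lotus(lambda x: 7 * x - x**2, die)
print(E_Y)   # 28/3
```

The sum over the six faces reproduces the hand computation 56/6 = 28/3 exactly.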

Determining Insurance Premiums

Suppose a 36-year-old man wants to buy $50,000 worth of term life insurance for a 20-year term. Let p_36 be the probability that this man survives 20 more years. For simplicity, assume the man pays premiums for 20 years. If the yearly premium is C/20, where C is the total of the premiums the man pays, how should the insurance company choose C? Let the income to the insurance company be X. We have X = C if the man survives the 20 years and X = C − 50,000 if he does not, so

E(X) = C·p_36 + (C − 50,000)(1 − p_36) = C − 50,000(1 − p_36).

For the company to make money, we need E(X) > 0, that is, C > 50,000(1 − p_36).

Variance and standard deviation of a discrete random variable

The variance of a discrete random variable X, denoted by Var(X), is defined by

Var(X) = E[(X − µ)²], where µ = E(X).

The variance is a measure of the spread of the possible values of X. The quantity √Var(X) is called the standard deviation of X.

Example. Suppose X has value k, k > 0, with probability 0.5 and value −k with probability 0.5. Then E(X) = 0 and Var(X) = E(X²) = k². Also, the standard deviation of X is k.
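The definition can be checked on the two-point example above; a minimal sketch with k = 5 (an arbitrary illustrative choice):

```python
def variance(pmf):
    """Var(X) = E[(X - mu)^2] with mu = E(X)."""
    mu = sum(x * p for x, p in pmf.items())
    return sum((x - mu) ** 2 * p for x, p in pmf.items())

k = 5
pmf = {k: 0.5, -k: 0.5}          # X = +k or -k, each with probability 1/2
print(variance(pmf))             # 25.0 = k**2
print(variance(pmf) ** 0.5)      # 5.0 -- the standard deviation is k
```

Since E(X) = 0 here, the variance reduces to E(X²) = k², as stated on the slide.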

Keno versus Bolita

Let B and K be the amounts that you win in one play of Bolita and Keno, respectively. (See Example 4.26 in the textbook.) E(B) = −0.25 and E(K) = −0.25, so in the long run your losses are the same with the two games. The variances, however, differ greatly: as computed in the textbook example, Var(B) is far larger than Var(K). Based on these variances, we conclude that the risk with Keno is far less than the risk with Bolita.

More about variance and standard deviation

Theorem. Var(X) = E(X²) − (E(X))².
Theorem. For constants a and b, Var(aX + b) = a²·Var(X).

Problem. If E(X) = 2 and E(X²) = 13, find the variance of −4X + 12.
Solution. Var(X) = E(X²) − (E(X))² = 13 − 4 = 9, so Var(−4X + 12) = (−4)²·Var(X) = 16·9 = 144.

Definition. E(X^n) is the nth moment of X.
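The problem above can be verified with a concrete distribution. One pmf with E(X) = 2 and E(X²) = 13 (an assumption chosen for illustration) puts probability 1/2 on each of −1 and 5:

```python
def mean(pmf):
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    mu = mean(pmf)
    return sum((x - mu) ** 2 * p for x, p in pmf.items())

X = {-1: 0.5, 5: 0.5}                       # E(X) = 2, E(X^2) = 13
Y = {-4 * x + 12: p for x, p in X.items()}  # pmf of -4X + 12

print(variance(X))    # 9.0
print(variance(Y))    # 144.0 = (-4)**2 * Var(X)
```

Any distribution with those two moments would give the same answer, since Var(aX + b) depends only on a and Var(X).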

Standardized Random Variables

Let X be a random variable with mean µ and standard deviation σ. The random variable X* = (X − µ)/σ is called the standardized X. It follows directly that E(X*) = 0 and Var(X*) = 1. Standardization is particularly useful if two or more random variables with different distributions must be compared. Example. By using standardization, we can compare the home run records of Babe Ruth and Barry Bonds.
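Standardization is a one-line transformation of the pmf; a minimal sketch (standardize is a hypothetical helper, and the two-point pmf is illustrative):

```python
def moments(pmf):
    """Return (mean, variance) of a discrete pmf."""
    mu = sum(x * p for x, p in pmf.items())
    var = sum((x - mu) ** 2 * p for x, p in pmf.items())
    return mu, var

def standardize(pmf):
    """Return the pmf of X* = (X - mu) / sigma."""
    mu, var = moments(pmf)
    sd = var ** 0.5
    return {(x - mu) / sd: p for x, p in pmf.items()}

X = {0: 0.5, 10: 0.5}        # mu = 5, sigma = 5
X_star = standardize(X)      # {-1.0: 0.5, 1.0: 0.5}
print(moments(X_star))       # (0.0, 1.0)
```

Whatever pmf we start from, the standardized version always has mean 0 and variance 1, which is what makes cross-distribution comparisons meaningful.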