Random Variables: Supplementary Notes. Prepared by Raymond Wong. Presented by Raymond Wong.


2 e.g.1 (Page 3) Suppose that we flip a coin 5 times. The sample space consists of all sequences of heads and tails of length 5: { HHHHH, HTHHT, ..., THTHT, ..., TTTTT }. Suppose that we are interested in the total number of heads when we flip the coin 5 times. We define a random variable X to denote this total. For example, X(HHHHH) = 5, X(HTHHT) = 3, X(TTTTT) = 0, and X(THTHT) = 2.
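The sample space and the random variable X can be checked by brute-force enumeration. A minimal Python sketch (Python is not part of these notes, just an illustration):

```python
from itertools import product

# Sample space of flipping a coin 5 times: all 2^5 = 32 sequences.
sample_space = ["".join(flips) for flips in product("HT", repeat=5)]

# The random variable X maps each outcome to its total number of heads.
def X(outcome):
    return outcome.count("H")

print(len(sample_space))                               # 32
print(X("HHHHH"), X("HTHHT"), X("TTTTT"), X("THTHT"))  # 5 3 0 2
```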

3 e.g.2 (Page 6) Consider the example of flipping a coin once. Let X be the random variable where X = 1 if the flip is successful (i.e., shows a head), and X = 0 if the flip is unsuccessful (i.e., does not show a head). X is called a Bernoulli random variable. The flip is called a Bernoulli trial.

4 e.g.3 (Page 8) Consider the example of flipping a coin 5 times. Let X_i be the random variable where X_i = 1 if the i-th flip is successful (i.e., shows a head), and X_i = 0 if the i-th flip is unsuccessful (i.e., does not show a head). Each X_i is a Bernoulli random variable, and each flip is a Bernoulli trial. Flipping the coin 5 times is a Bernoulli trials process. If we are interested in the number of heads, it is X_1 + X_2 + X_3 + X_4 + X_5 (i.e., a sum of Bernoulli random variables).

5 e.g.4 (Page 9) We have 5 Bernoulli trials with probability p of success on each trial. Let S denote success and F denote failure. What is the probability of each of the following? (a) SSSFF; (b) FFSSS; (c) SFSFS; (d) any particular ordering of three S's and two F's (e.g., FSFSS).
(a) Since the trials are independent, we have P(SSSFF) = P(S) x P(S) x P(S) x P(F) x P(F) = p x p x p x (1-p) x (1-p) = p^3 (1-p)^2.

6 e.g.4 (b) Since the trials are independent, we have P(FFSSS) = P(F) x P(F) x P(S) x P(S) x P(S) = (1-p) x (1-p) x p x p x p = p^3 (1-p)^2.

7 e.g.4 (c) Since the trials are independent, we have P(SFSFS) = P(S) x P(F) x P(S) x P(F) x P(S) = p x (1-p) x p x (1-p) x p = p^3 (1-p)^2.

8 e.g.4 (d) Since the trials are independent, any particular ordering of three S's and two F's has probability p^3 (1-p)^2: each S contributes a factor p and each F contributes a factor (1-p), regardless of position.

9 e.g.5 (Page 10) We have 5 Bernoulli trials with probability p of success on each trial. Let S denote success and F denote failure. What is the probability that the 5 trials contain exactly 3 successes? Is it equal to p^3 (1-p)^2? No. Each particular ordering of three S's and two F's has probability p^3 (1-p)^2, and the total number of such orderings is C(5, 3). Thus
P(5 trials contain exactly 3 successes) = P(SSSFF) + P(SSFSF) + ... + P(FFSSS) = p^3 (1-p)^2 + p^3 (1-p)^2 + ... + p^3 (1-p)^2 = C(5, 3) p^3 (1-p)^2.
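The counting argument can be verified numerically: summing the probability of every ordering with exactly 3 successes matches C(5,3) p^3 (1-p)^2. A quick Python check (p = 0.3 is an arbitrary value chosen only for illustration):

```python
from itertools import product
from math import comb

p = 0.3  # illustrative success probability (any 0 < p < 1 works)

# Probability of one particular ordering of S's and F's.
def seq_prob(seq):
    prob = 1.0
    for c in seq:
        prob *= p if c == "S" else (1 - p)
    return prob

# Sum the probabilities of every ordering with exactly 3 successes.
total = sum(seq_prob(seq) for seq in map("".join, product("SF", repeat=5))
            if seq.count("S") == 3)

formula = comb(5, 3) * p**3 * (1 - p)**2  # C(5,3) p^3 (1-p)^2
assert abs(total - formula) < 1e-12
```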

10 e.g.6 (Page 12) The sample space for the binomial random variable X with n trials and success probability p is { X = 0, X = 1, X = 2, ..., X = n }, with
P(X = 0) = C(n, 0) p^0 (1-p)^(n-0)
P(X = 1) = C(n, 1) p^1 (1-p)^(n-1)
P(X = 2) = C(n, 2) p^2 (1-p)^(n-2)
...
P(X = n) = C(n, n) p^n (1-p)^(n-n)

11 e.g.7 (Page 12) The binomial theorem is
(x + y)^n = Σ_{k=0}^{n} C(n, k) x^k y^(n-k)
If x = p and y = 1-p, we have
Σ_{k=0}^{n} C(n, k) p^k (1-p)^(n-k) = (p + (1-p))^n = 1,
so the binomial probabilities sum to 1.
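This identity is easy to confirm numerically for a concrete n and p (the values below are illustrative, not from the notes):

```python
from math import comb

n, p = 5, 0.3  # illustrative values
# Binomial theorem with x = p, y = 1 - p: the probabilities sum to 1.
total = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))
assert abs(total - 1.0) < 1e-12
```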

12 e.g.8 (Page 13) There are 10 questions in a test, and a student takes this test. Suppose that a student who knows 80% of the course material has probability 0.8 of success on any question, independent of how he does on the other questions. So P(he answers a question correctly) = 0.8 and P(he answers a question incorrectly) = 0.2.
(a) What is the probability that he answers exactly 8 questions correctly?
(b) What is the probability that he answers exactly 9 questions correctly?
(c) What is the probability that he answers exactly 10 questions correctly?

13 e.g.8 This example is a Bernoulli trials process: a trial is answering a question, a success is answering it correctly, and a failure is answering it incorrectly. Thus we can use the formula for the Bernoulli trials process (the binomial random variable X). Let X be the total number of questions answered correctly. Then
P(X = k) = C(10, k) x 0.8^k x 0.2^(10-k) if 0 <= k <= 10, and 0 otherwise.
(a) P(X = 8) = C(10, 8) x 0.8^8 x 0.2^2 = 45 x 0.8^8 x 0.04 ≈ 0.302
(b) P(X = 9) = C(10, 9) x 0.8^9 x 0.2^1 = 10 x 0.8^9 x 0.2 ≈ 0.268
(c) P(X = 10) = C(10, 10) x 0.8^10 x 0.2^0 = 0.8^10 ≈ 0.107
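The three answers can be reproduced with a small helper for the binomial formula:

```python
from math import comb

def binom_pmf(k, n, p):
    # P(X = k) = C(n, k) p^k (1-p)^(n-k)
    return comb(n, k) * p**k * (1 - p)**(n - k)

for k in (8, 9, 10):
    print(k, round(binom_pmf(k, 10, 0.8), 3))
# 8 0.302
# 9 0.268
# 10 0.107
```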

14 e.g.9 (Page 15) Suppose that we flip a fair coin TWICE. The sample space is { TT, HT, TH, HH }: TT has 0 heads, HT and TH each have 1 head, and HH has 2 heads.

15 e.g.10 (Page 15) Suppose that we flip a fair coin THREE times. The sample space is { TTT, TTH, THT, HTT, THH, HTH, HHT, HHH }: TTT has 0 heads; TTH, THT, HTT each have 1 head; THH, HTH, HHT each have 2 heads; HHH has 3 heads.

16 e.g.11 (Page 16) Illustration 1 for Page 16 (diagram): Step 1: an amount $??? changes hands between me and you. Step 2: flip 3 coins. Step 3: the outcome is HHT, and you receive $2.

17 e.g.12 (Page 16) Illustration 2 for Page 16 (diagram): Step 1: an amount $??? changes hands between me and you. Step 2: flip 3 coins. Step 3: the outcome is TTT, and you receive $0.

18 e.g.13 (Page 17) Let X be a random variable taking each of the values 0, 1, 2, 3 with probability 1/4. What is E(X)?
E(X) = Σ_i x_i P(X = x_i) = 0 x 1/4 + 1 x 1/4 + 2 x 1/4 + 3 x 1/4 = 3/2

19 e.g.14 (Page 17) Suppose that we flip a fair coin THREE times; each of the 8 outcomes has probability 1/8. Let X be the random variable denoting the number of tails: TTT has 3 tails; TTH, THT, HTT each have 2 tails; THH, HTH, HHT each have 1 tail; HHH has 0 tails. What is E(X)?
E(X) = 3 x 1/8 + 2 x 1/8 + 2 x 1/8 + 2 x 1/8 + 1 x 1/8 + 1 x 1/8 + 1 x 1/8 + 0 x 1/8 = 1.5

20 e.g.15 (Page 17) Suppose that we flip a biased coin THREE times, where P(tail) = 2/3 and P(head) = 1/3. The outcome probabilities are: TTT, 8/27; TTH, THT, HTT, each 4/27; THH, HTH, HHT, each 2/27; HHH, 1/27. Let X be the random variable denoting the number of tails. What is E(X)?
E(X) = 3 x 8/27 + 2 x 4/27 + 2 x 4/27 + 2 x 4/27 + 1 x 2/27 + 1 x 2/27 + 1 x 2/27 + 0 x 1/27 = 54/27 = 2

21 e.g.16 (Page 17) Suppose that we flip a biased coin THREE times, where P(tail) = 2/3 and P(head) = 1/3. Let X be the random variable denoting the number of tails. The sample space where we consider the random variable X is { X = 0, X = 1, X = 2, X = 3 }, with
P(X = 0) = C(3, 0) (2/3)^0 (1/3)^3 = 1/27
P(X = 1) = C(3, 1) (2/3)^1 (1/3)^2 = 6/27
P(X = 2) = C(3, 2) (2/3)^2 (1/3)^1 = 12/27
P(X = 3) = C(3, 3) (2/3)^3 (1/3)^0 = 8/27
What is E(X)?
E(X) = 0 x 1/27 + 1 x 6/27 + 2 x 12/27 + 3 x 8/27 = 54/27 = 2
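Both routes to E(X) = 2 (outcome-by-outcome, and via the binomial distribution of X) can be checked by enumerating the 8 flip sequences and weighting each by its probability:

```python
from itertools import product

p_tail = 2 / 3
# Enumerate the 8 outcomes of 3 flips; X counts tails; weight by probability.
E = 0.0
for flips in product("TH", repeat=3):
    prob = 1.0
    for c in flips:
        prob *= p_tail if c == "T" else 1 - p_tail
    E += flips.count("T") * prob
assert abs(E - 2.0) < 1e-12  # agrees with E(X) = 54/27 = 2
```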

22 e.g.17 (Page 18) Suppose that I throw one 6-sided die. The sample space is the six faces, showing 1, 2, 3, 4, 5, or 6 spots, each with probability 1/6. Let X be the number of spots shown. What is E(X)?
E(X) = 1 x 1/6 + 2 x 1/6 + 3 x 1/6 + 4 x 1/6 + 5 x 1/6 + 6 x 1/6 = 21/6 = 7/2

23 e.g.18 (Page 18) Suppose that I throw two fair dice. Let Y be the random variable denoting the total number of spots shown. There are 36 equally likely outcomes (Die 1, Die 2), with sums i ranging from 2 to 12. The distribution of Y is:
i:      2     3     4     5     6     7     8     9     10    11    12
P(Y=i): 1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

24 e.g.19 (Page 18) Suppose that I throw two fair dice, and let Y be the random variable denoting the total number of spots shown, with the distribution from e.g.18. Then
E(Y) = 2 x 1/36 + 3 x 2/36 + 4 x 3/36 + 5 x 4/36 + 6 x 5/36 + 7 x 6/36 + 8 x 5/36 + 9 x 4/36 + 10 x 3/36 + 11 x 2/36 + 12 x 1/36 = 252/36 = 7
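E(Y) = 7 can be confirmed by summing over all 36 equally likely rolls directly, using exact rational arithmetic:

```python
from fractions import Fraction
from itertools import product

# 36 equally likely rolls of two fair dice; Y is the sum of spots shown.
E = sum(Fraction(a + b, 36) for a, b in product(range(1, 7), repeat=2))
assert E == 7
```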

25 e.g.20 (Page 20) Suppose that we flip a fair coin THREE times; each outcome s has P(s) = 1/8. Let X be the random variable denoting the number of tails. Computing E(X) outcome by outcome as Σ_s X(s) P(s):
E(X) = 0 x 1/8 + 1 x 1/8 + 1 x 1/8 + 1 x 1/8 + 2 x 1/8 + 2 x 1/8 + 2 x 1/8 + 3 x 1/8 = 1.5

26 e.g.21 (Page 25)
Lemma 5.9: If a random variable X is defined on a (finite) sample space S, then its expected value is given by E(X) = Σ_s X(s) P(s).
Theorem 5.10: Suppose X and Y are random variables on the (finite) sample space S. Then E(X + Y) = E(X) + E(Y).
Why is this correct? Let Z = X + Y; that is, given an outcome s in S, Z(s) = X(s) + Y(s). According to Lemma 5.9, we have
E(X + Y) = E(Z) = Σ_s Z(s) P(s) = Σ_s [X(s) + Y(s)] P(s) = Σ_s [X(s)P(s) + Y(s)P(s)] = Σ_s X(s)P(s) + Σ_s Y(s)P(s) = E(X) + E(Y)
IMPORTANT: X and Y can be independent, or X and Y can be dependent; the theorem holds either way.
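The IMPORTANT remark can be illustrated with two clearly dependent random variables; the choice below (first die, and the larger of two dice) is my own example, not from the notes:

```python
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=2))  # two fair dice
P = Fraction(1, 36)                           # each roll equally likely

# X = first die, Y = larger of the two dice: X and Y are clearly dependent.
EX  = sum(a * P for a, b in rolls)
EY  = sum(max(a, b) * P for a, b in rolls)
EXY = sum((a + max(a, b)) * P for a, b in rolls)
assert EXY == EX + EY  # E(X + Y) = E(X) + E(Y) even without independence
```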

27 e.g.22 (Page 26) Suppose that we flip a fair coin. We have two random variables X and Y: X = 1 if head, 0 if tail; Y = 0 if head, 1 if tail.
(a) What is E(X)?
(b) What is E(Y)?
(c) What is E(X + Y) (without using Theorem 5.10)?
(d) What is E(X + Y) (by using Theorem 5.10)?

28 e.g.22
(a) E(X) = 1 x 1/2 + 0 x 1/2 = 1/2
(b) E(Y) = 0 x 1/2 + 1 x 1/2 = 1/2
(c) Consider the two cases. Case 1 (head): X = 1 and Y = 0, so X + Y = 1. Case 2 (tail): X = 0 and Y = 1, so X + Y = 1. Thus E(X + Y) = 1 x 1/2 + 1 x 1/2 = 1.
(d) By using the theorem, we have E(X + Y) = E(X) + E(Y) = 1/2 + 1/2 = 1.

29 e.g.23 (Page 26) Suppose that we flip a fair coin, with the same two random variables as before: X = 1 if head, 0 if tail; Y = 0 if head, 1 if tail.
(a) What is E(X)?
(b) What is E(Y)?
(c) What is E(XY)?
(d) Is E(XY) = E(X)E(Y)?

30 e.g.23
(a) E(X) = 1/2
(b) E(Y) = 1/2
(c) Consider the two cases. Case 1 (head): X = 1 and Y = 0, so XY = 0. Case 2 (tail): X = 0 and Y = 1, so XY = 0. Thus E(XY) = 0 x 1/2 + 0 x 1/2 = 0.
(d) Consider E(X)E(Y) = 1/2 x 1/2 = 1/4. We know that E(XY) = 0 (from part (c)). Thus E(XY) ≠ E(X)E(Y).
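The same two-case computation, done with exact fractions:

```python
from fractions import Fraction

half = Fraction(1, 2)
# One fair flip: (X, Y) is (1, 0) on heads and (0, 1) on tails.
outcomes = [(1, 0), (0, 1)]
EX  = sum(x * half for x, y in outcomes)      # 1/2
EY  = sum(y * half for x, y in outcomes)      # 1/2
EXY = sum(x * y * half for x, y in outcomes)  # 0: XY = 0 in both cases
assert EXY == 0
assert EX * EY == Fraction(1, 4)
assert EXY != EX * EY  # expectation of a product need not factor
```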

31 e.g.24 (Page 26) In all cases (i.e., in general), E(X + Y) = E(X) + E(Y). In some cases E(XY) ≠ E(X)E(Y), while in other cases E(XY) = E(X)E(Y) (for example, when X and Y are independent).

32 e.g.24 (Page 27) Illustration of Theorem 5.11. E.g., E(2X) = 2E(X). The reason is E(2X) = E(X + X) = E(X) + E(X) = 2E(X).

33 e.g.25 (Page 36) Consider the derangement problem. Suppose that there are 5 (or n) students. They put their backpacks along the wall. Someone mixes up the backpacks, so the students get back "random" backpacks.

34 e.g.25 Let X be the total number of students who get their own backpacks back. Let X_i be the indicator random variable for the event E_i that student i gets his backpack back.
(a) Are E_1 and E_2 independent when n = 2?
(b) What is E(X) when n = 5?

35 e.g.25
(a) Suppose that student 1 is "Raymond" and student 2 is "Peter". E_1 is the event that Raymond gets his backpack back, and E_2 is the event that Peter gets his backpack back. With n = 2 there are only two cases: Case 1, both get their own backpacks; Case 2, each gets the other's backpack.
P(E_1) = P(Raymond gets his backpack back) = 1/2
P(E_2) = P(Peter gets his backpack back) = 1/2
P(E_1 ∩ E_2) = P(Raymond and Peter both get their backpacks back) = 1/2
Note that P(E_1) x P(E_2) = 1/2 x 1/2 = 1/4. Thus P(E_1) x P(E_2) ≠ P(E_1 ∩ E_2), so E_1 and E_2 are not independent.

36 e.g.25
(b) Note that X = X_1 + X_2 + X_3 + X_4 + X_5. By linearity of expectation,
E(X) = E(X_1 + X_2 + X_3 + X_4 + X_5) = E(X_1) + E(X_2) + E(X_3) + E(X_4) + E(X_5)
The events E_i (and the corresponding random variables X_i) are not independent, but we can still use linearity of expectation. The next question is: what is E(X_i)? Note that X_i = 1 if student i gets his backpack back, and 0 otherwise, so
E(X_i) = 1 x P(student i gets his backpack back) + 0 x P(student i does not) = P(student i gets his backpack back)

37 e.g.25 (b, recap) So E(X) = E(X_1) + E(X_2) + E(X_3) + E(X_4) + E(X_5), where E(X_i) = P(student i gets his backpack back).

38 e.g.25
(b, cont.) What is E(X_i) = P(student i gets his backpack back)? There are 5! ways to hand the 5 backpacks back in total, and (5-1)! of them give student i his own backpack (the remaining 4 backpacks can be distributed in any of 4! ways). Thus
E(X_i) = (5-1)!/5! = 4!/5! = 1/5

39 e.g.25
(b, cont.) Thus E(X) = E(X_1) + E(X_2) + E(X_3) + E(X_4) + E(X_5) = 1/5 + 1/5 + 1/5 + 1/5 + 1/5 = 1.
Additional question: if n can be any number, what is E(X)? By the same argument, E(X_i) = (n-1)!/n! = 1/n for each of the n students, so E(X) = n x 1/n = 1. Note that this is independent of n. E.g., if n = 1000, we still expect only one student to get his backpack back.
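For n = 5 the answer E(X) = 1 can be verified exhaustively: average the number of fixed points over all 5! = 120 ways of handing the backpacks back.

```python
from fractions import Fraction
from itertools import permutations

n = 5
perms = list(permutations(range(n)))  # all 5! = 120 ways to return backpacks
# X(perm) = number of fixed points = students who get their own backpack.
EX = sum(Fraction(sum(perm[i] == i for i in range(n)), len(perms))
         for perm in perms)
assert EX == 1  # the expected number of fixed points is exactly 1
```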

40 e.g.26 (Page 40) Suppose that we flip a coin. The sample space of flipping a coin is { H, T }, with P(H) = 1/2 and P(T) = 1/2. Suppose that I flip the coin repeatedly and we want to see a head. Do you think that we "expect" to see a head within TWO flips?

41 e.g.27 (Page 40) Suppose that we throw two dice. The sample space consists of the 36 equally likely outcomes (Die 1, Die 2), of which 6 have sum 7, so P(sum = 7) = 6/36 = 1/6. Suppose that I throw the two dice repeatedly and we want to see the sum 7. Do you think that we "expect" to see "sum = 7" within SIX throws?

42 e.g.28 (Page 43) Suppose that the trial process is "FFFS", where F corresponds to a failure and S corresponds to a success. Let X be a random variable denoting the trial number on which the first success occurs, and let p be the probability of success.
(a) What is X(FFFS)? X(FFFS) = 4.
(b) What is P(FFFS)? P(FFFS) = (1-p)^3 p.
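The pattern generalizes: P(first success on trial k) = (1-p)^(k-1) p. These probabilities sum to 1, and their mean, the expected waiting time, is 1/p, which is why e.g.26 and e.g.27 suggest "within 1/p trials". A numerical check, truncating the infinite sum at a large N (p = 0.25 is illustrative):

```python
p = 0.25  # illustrative success probability
N = 2000  # truncate the infinite sum; (1-p)^N is negligible here

# P(first success on trial k) = (1-p)^(k-1) * p, as in P(FFFS) = (1-p)^3 p
probs = [(1 - p) ** (k - 1) * p for k in range(1, N + 1)]
E = sum(k * pr for k, pr in zip(range(1, N + 1), probs))
assert abs(sum(probs) - 1) < 1e-9  # the probabilities sum to 1
assert abs(E - 1 / p) < 1e-6       # expected waiting time is 1/p = 4 trials
```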

43 e.g.29 (Page 44) We know the following facts.
(1) Theorem 4.6: For any real number x ≠ 1,
Σ_{i=1}^{n} i x^i = [n x^(n+2) - (n+1) x^(n+1) + x] / (1-x)^2
(2) For any real number -1 < x < 1,
Σ_{i=1}^{∞} i x^i = x / (1-x)^2
You don't need to memorize (2): if we have (1), we can derive (2), because n x^n goes to 0 when n is very large.
Why is lim_{n→∞} n x^n = 0 for -1 < x < 1? Consider
lim_{n→∞} n x^n = lim_{n→∞} n / x^(-n) = lim_{n→∞} 1 / [x^(-n) (ln x)(-1)]   (by L'Hopital's Rule)
= lim_{n→∞} [-1/(ln x)] x^n = 0,
because lim_{n→∞} x^n = 0. (For 0 < x < 1 this is direct; for negative x, apply the same argument to |x|.)

44 e.g.29 From (1),
Σ_{i=1}^{n} i x^i = [n x^(n+2) - (n+1) x^(n+1) + x] / (1-x)^2
Consider n x^(n+2) = n x^n · x^2 → 0 · x^2 = 0 when n is very large (and -1 < x < 1). Similarly, (n+1) x^(n+1) → 0 when n is very large.
Thus, from Theorem 4.6, for large n we have
Σ_{i=1}^{∞} i x^i = x / (1-x)^2
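Both the exact formula (1) and its limiting form (2) are easy to sanity-check numerically; x = 0.5 and n = 60 below are illustrative values:

```python
x, n = 0.5, 60  # illustrative values with -1 < x < 1

partial = sum(i * x**i for i in range(1, n + 1))
closed  = (n * x**(n + 2) - (n + 1) * x**(n + 1) + x) / (1 - x) ** 2  # (1)
limit   = x / (1 - x) ** 2                                            # (2)

assert abs(partial - closed) < 1e-12  # Theorem 4.6 holds
assert abs(partial - limit) < 1e-9    # the n-dependent terms have vanished
```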