Tel Hai Academic College Department of Computer Science Prof. Reuven Aviv Probability and Random Variables


Random Experiments and Events
Do a random experiment again and again; each run produces an outcome. On different runs of the same experiment the outcomes might be different.
Event: an outcome, or a subset of outcomes.
Example: the outcome is N, the number of packets in a router.
- N = 20 is an event.
- The set A = {N any of 20, 21, 22, ..., 30}: an event?
- The set E = {N any even number}: an event?
- The set O = {N any odd number}: an event?

Probability
Define the probability of event A:
Pr[A] = NumOutcomes(A) / TotalNumExperiments, when TotalNumExperiments → ∞
What is Pr[A or B] (Pr[A U B])?
Pr[A or B] = Pr[A] + Pr[B] - Pr[A and B]
What if A and B cannot happen in the same experiment (e.g. the events E and O above)? Then Pr[A and B] = 0, so
Pr[A or B] ≡ Pr[A + B] = Pr[A] + Pr[B]
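As a quick numeric check (not part of the original slides), the union rule can be verified by simulating a fair die; the events A = "even" and B = "greater than 3" are hypothetical examples chosen for illustration:

```python
import random

random.seed(1)

# Monte Carlo check of Pr[A or B] = Pr[A] + Pr[B] - Pr[A and B]
# for a fair die, with A = "roll is even" and B = "roll > 3".
N = 200_000
count_a = count_b = count_ab = count_union = 0
for _ in range(N):
    roll = random.randint(1, 6)
    a = roll % 2 == 0
    b = roll > 3
    count_a += a
    count_b += b
    count_ab += a and b
    count_union += a or b

lhs = count_union / N                       # Pr[A or B], estimated directly
rhs = (count_a + count_b - count_ab) / N    # Pr[A] + Pr[B] - Pr[A and B]
```

By inclusion-exclusion on the raw counts the two sides agree exactly; both estimate Pr[A U B] = 2/3 for this die.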

Q0: The Birthday Paradox
n people are in a room. What is the probability that 2 of them have the same birthday (day and month)?
Experiment: all n persons "choose" birthdays at random.
- Outcome: a series of n numbers from the range {1..365}.
- Number of all possible outcomes: 365^n.
Event E (of interest): the set of all outcomes (series) in which 2 of the numbers are the same.
The other event O: the set of all outcomes with distinct numbers.
All outcomes are in either E or O.

Q0: The Birthday Paradox (cont'd)
Assume all birthdays are equally probable.
Pr[O] = (364/365)(363/365)···([365-(n-1)]/365)
Let x = 1/365; then Pr[O] = (1-x)(1-2x)···(1-(n-1)x), and Pr[E] = 1 - Pr[O].
n = 22 → Pr[E] ≈ 0.476; n = 23 → Pr[E] ≈ 0.507.
Approximation: use e^(-x) ≈ 1-x for small x:
Pr[O] ≈ e^(-n²x/2), so Pr[E] ≈ 1 - e^(-n²x/2)
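The exact product for Pr[E] and the exponential approximation are easy to compare numerically; a minimal Python sketch (the function names are mine, for illustration):

```python
from math import exp

def birthday_prob(n, days=365):
    """Exact probability that at least two of n people share a birthday."""
    p_distinct = 1.0
    for k in range(n):
        p_distinct *= (days - k) / days
    return 1.0 - p_distinct

def birthday_approx(n, days=365):
    """The slide's approximation: Pr[E] ≈ 1 - e^(-n² / (2·days))."""
    return 1.0 - exp(-n * n / (2 * days))
```

The exact value crosses 0.5 between n = 22 and n = 23, matching the slide.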

Q1: Birthday Paradox (cont'd)
n people are in a room. What is the probability that one of them has the same birthday as mine? For what value of n will it be 0.5?
Event E: the set of all outcomes (series) in which one of the numbers equals B (my birthday).
Event O: the set of all outcomes in which all numbers differ from B.
Pr[O] = (364/365)^n, so Pr[E] = 1 - (364/365)^n
(364/365)^n = 0.5 → n·log₂(364/365) = -1
n = -1/log₂(364/365) ≈ 253
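A short sketch checking that n ≈ 253 is the answer, both by scanning n directly and via the slide's closed form:

```python
from math import log

# Smallest n with 1 - (364/365)^n >= 0.5: someone shares *my* birthday.
n_exact = 1
while 1 - (364 / 365) ** n_exact < 0.5:
    n_exact += 1

# Closed form from the slide: n = -1 / log2(364/365), to be rounded up.
n_formula = -1 / (log(364 / 365) / log(2))
```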

Random Variables (1)
R.V.: a variable X that on each experiment gets a value x from a set, with a certain probability Pr[X = x] ≡ Pr[x].
Example: LAN states, the number of frames transmitted in a unit of time.
Define X: the number of transmissions in a unit of time. What are the possible values of X?
- X = 1 if one station has a frame to transmit; Pr[1] = a.
- X = 0 if no station has a frame to transmit.
- X = k, the number of transmitting stations, out of N:
  Pr[k] = u_k = [N!/(k!(N-k)!)]·a^k·(1-a)^(N-k)
Pr[X = 0] = ?

Random Variables (2)
Example: X is the collision state of the LAN channel during a time unit. What could the possible states be?
- X = idle (0) if no frame is transmitted. Pr[idle] = ?
- X = transmitting (1) if exactly 1 frame is transmitted. Pr[tran] = ?
- X = collided (2) if more than 1 frame is transmitted. Pr[col] = ?
Here X is a discrete random variable.

Poisson Process
N(t) is an R.V.: the number of arrivals during the time interval [0..t].
Probability of n arrivals during time 0..t:
P_n(t) ≡ Pr[N(t) = n] = (λt)^n e^(-λt) / n!
This is a Poisson process with parameter λ.
Average number of arrivals: E[N(t)] = Σ_n n·Pr[N(t) = n] = λt. The variance is also λt.
λ is the average rate of arrivals (packets/time unit).
P_n(1): the probability that n packets arrived during the 1st hour.
The origin t = 0 can be chosen arbitrarily!

Probability Functions
Probability Mass Function (PMF): Pr[X = x]
Cumulative Distribution Function (CDF):
- F_X(x) ≡ Pr[X ≤ x]
- F_X(-∞) = 0, F_X(+∞) = 1
- Pr[x_1 < X ≤ x_2] = F_X(x_2) - F_X(x_1)
Tail Function (Complementary Cumulative Distribution Function): T(x) ≡ Pr[X > x] = 1 - F(x)
Poisson: F(x) = e^(-λt) Σ_k (λt)^k / k!  (sum over k from 0 to x)

Q3
Event Q3: the fourth packet arrived during the first two hours. What is Pr[Q3]?
Event Q3 occurs if and only if during [0..2] 4, or 5, or 6, or 7, or any number k ≥ 4 of packets arrived: it is the probability that at least 4 packets arrived during [0..2].
Pr[Q3] = P_4(2) + P_5(2) + P_6(2) + ... ≡ T_4(2)
Pr[Q3] = T_4(2) ≡ 1 - P_0(2) - P_1(2) - P_2(2) - P_3(2) = 1 - [1 + 2λ + 2λ² + (4/3)λ³]·e^(-2λ)
Note: λ has to be given in units of packets/hour.
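With the slides' tail convention T_n(t) = Pr[at least n arrivals in [0..t]], the closed form for T_4(2) can be verified numerically (λ = 3 packets/hour is an assumed value):

```python
from math import exp, factorial

def poisson_pmf(n, rate, t):
    mu = rate * t
    return mu ** n * exp(-mu) / factorial(n)

def poisson_tail(n, rate, t):
    """Slides' tail function: T_n(t) = Pr[at least n arrivals in [0, t]]."""
    return 1.0 - sum(poisson_pmf(k, rate, t) for k in range(n))

rate = 3.0  # assumed packets/hour
t4_2 = poisson_tail(4, rate, 2.0)
# Closed form from the slide: 1 - [1 + 2λ + 2λ² + (4/3)λ³] e^(-2λ)
closed = 1.0 - (1 + 2 * rate + 2 * rate ** 2 + (4 / 3) * rate ** 3) * exp(-2 * rate)
```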

Q4
Event Q4: the n-th packet arrived during t hours. What is Pr[Q4]?
Pr[Q4] is the probability that at least n packets arrived during [0..t]: the Tail (complementary cumulative) function T_n(t).
Pr[Q4] = T_n(t) ≡ P_n(t) + P_{n+1}(t) + P_{n+2}(t) + ...
= 1 - P_0(t) - P_1(t) - ... - P_{n-1}(t) = 1 - Σ_k P_k(t), k = 0..n-1

Q5
Event Q5: during the 2nd hour 4 packets arrived. What is Pr[Q5]?
A: Since the origin t = 0 is arbitrary, we can choose it to be at the end of the first hour. Pr[Q5] is then equal to the probability that 4 packets arrived during the first hour:
Pr[Q5] = P_4(1) = λ⁴ e^(-λ) / 4!
The probability depends on the length of the time interval (here 1 hour), not on its location.

Q6
Event Q6: the 4th packet arrived during the 2nd hour. What is Pr[Q6]?
Event Q6 occurs if and only if:
- 0 arrived in the 1st hour AND 4 or more arrived in the 2nd hour, OR
- 1 arrived in the 1st hour AND 3 or more arrived in the 2nd hour, OR
- 2 arrived in the 1st hour AND 2 or more arrived in the 2nd hour, OR
- 3 arrived in the 1st hour AND 1 or more arrived in the 2nd hour.
Pr[Q6] = P_0(1)·T_4(1) + P_1(1)·T_3(1) + P_2(1)·T_2(1) + P_3(1)·T_1(1)

Q6 (alternative method)
Consider the probability that the 4th packet arrived during two hours, which is T_4(2). This is equal to:
Pr[4th packet arrived during 1st hour] (which is T_4(1))
+ Pr[4th packet arrived during 2nd hour] (what we want).
T_4(2) = T_4(1) + Pr[Q6]  →  Pr[Q6] = T_4(2) - T_4(1)
Pr[Q6], the probability that the 4th packet arrived during the 2nd hour, is the probability that the 4th packet arrived during two hours minus the probability that it arrived during one hour.
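The two methods for Pr[Q6] should give the same number; a short sketch confirming this for an assumed rate (λ = 3 packets/hour):

```python
from math import exp, factorial

def poisson_pmf(n, rate, t):
    mu = rate * t
    return mu ** n * exp(-mu) / factorial(n)

def poisson_tail(n, rate, t):
    # T_n(t) = Pr[at least n arrivals in [0, t]], the slides' convention
    return 1.0 - sum(poisson_pmf(k, rate, t) for k in range(n))

rate = 3.0  # assumed packets/hour

# Direct sum: Pr[Q6] = P_0(1)T_4(1) + P_1(1)T_3(1) + P_2(1)T_2(1) + P_3(1)T_1(1)
direct = sum(poisson_pmf(k, rate, 1.0) * poisson_tail(4 - k, rate, 1.0)
             for k in range(4))

# Alternative method: Pr[Q6] = T_4(2) - T_4(1)
alt = poisson_tail(4, rate, 2.0) - poisson_tail(4, rate, 1.0)
```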

Q7
Event Q7: the time until the 1st packet arrives is larger than 1 hour. What is Pr[Q7]?
Define T_k, the arrival time of packet k, k = 1, 2, 3, ...
T_k is a continuous random variable; it can take any non-negative value.
We are looking for Pr[Q7] ≡ Pr[T_1 > 1].
Event Q7 occurs if and only if zero packets arrived during the 1st hour:
Pr[Q7] ≡ Pr[T_1 > 1] = P_0(1) = e^(-λ)

Review: Conditional Probability
Consider many experiments, observing two R.V.s S_1, S_2, e.g. the number of packets in a buffer at time steps 1 and 2.
Count the number of events with S_1 = n_1, then count the fraction of these with S_2 = n_2.
Define: Pr[S_2 = n_2 | S_1 = n_1] ≡ NumEvents[S_1 = n_1 & S_2 = n_2] / NumEvents[S_1 = n_1], when TotalNumExperiments → ∞.
Pr[S_2 = n_2 | S_1 = n_1] is the probability of S_2 equal to n_2, conditioned on S_1 equal to n_1.

Conditional Probability (2)
Pr[S_2 = n_2 | S_1 = n_1] ≡ NumEvents[S_1 = n_1 & S_2 = n_2] / NumEvents[S_1 = n_1]
Divide the numerator and denominator on the RHS by TotalNumEvents (no change to the LHS). The result:
Pr[S_2 = n_2 | S_1 = n_1] = Pr[S_2 = n_2, S_1 = n_1] / Pr[S_1 = n_1]
→ Pr[S_2 = n_2, S_1 = n_1] = Pr[S_2 = n_2 | S_1 = n_1]·Pr[S_1 = n_1]
→ Law of Total Probability: Pr[S_2 = n_2] = Σ_{n_1} Pr[S_2 = n_2 | S_1 = n_1]·Pr[S_1 = n_1]
How did we derive this? (Sum the joint probability Pr[S_2 = n_2, S_1 = n_1] over all values of n_1.)

Q8
The time interval [0..t+s] consists of 2 non-overlapping intervals, [0..t] and [t..t+s].
Event E1: during the full interval [0..t+s] there were n Poisson arrivals.
Event E2: during the sub-interval [0..t] there were k arrivals (k < n).
What is the conditional probability Pr[E2|E1]?
Pr[E2|E1] ≡ Pr[E2 and E1] / Pr[E1]
= Pr[(N(t) = k) and (N(t+s) = n)] / Pr[N(t+s) = n]
= Pr[(N(t) = k) and (N(s) = n-k)] / Pr[N(t+s) = n]
Note: arrivals during 0..t are independent of arrivals during t..t+s, so the numerator is a product of probabilities.

Q8 (cont'd)
Numerator: Pr[(N(t) = k) and (N(s) = n-k)]
= Pr[N(t) = k]·Pr[N(s) = n-k] = P_k(t)·P_{n-k}(s)
= (λt)^k e^(-λt) (λs)^(n-k) e^(-λs) / (k!(n-k)!)
Denominator: Pr[N(t+s) = n] = (λ(t+s))^n e^(-λ(t+s)) / n!
Pr[E2|E1] = [n!/(k!(n-k)!)]·[t/(t+s)]^k·[s/(t+s)]^(n-k)
→ Pr[E2|E1] is independent of λ: it is a Binomial PMF with success probability t/(t+s).
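A numeric spot-check that the conditional probability reduces to a Binomial PMF and does not depend on λ; all the parameter values below (t = 1, s = 2, n = 5, k = 2, and the two rates) are assumed for illustration:

```python
from math import comb, exp, factorial

def poisson_pmf(n, mu):
    return mu ** n * exp(-mu) / factorial(n)

t, s, n, k = 1.0, 2.0, 5, 2

# Binomial form from the slide: C(n, k) [t/(t+s)]^k [s/(t+s)]^(n-k)
binom = comb(n, k) * (t / (t + s)) ** k * (s / (t + s)) ** (n - k)

# The conditional probability computed directly, for two different rates λ;
# the λ-dependence should cancel out.
conds = [
    poisson_pmf(k, lam * t) * poisson_pmf(n - k, lam * s) / poisson_pmf(n, lam * (t + s))
    for lam in (2.0, 7.0)
]
```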

Q9
A jar has 5 white balls and 8 red balls. 2 balls are taken out, one after the other (the balls are not returned).
X_k denotes the color of the k-th ball, k = 1, 2: X_k = 0 is a red ball, X_k = 1 is a white ball.
Calculate the conditional probabilities:
- Given that the 2nd ball was white, the 1st ball was red: Pr[X_1 = 0 | X_2 = 1]
- Given that the 2nd ball was red, the 1st ball was red: Pr[X_1 = 0 | X_2 = 0]
We need to calculate the joint and total probabilities.

Q9 (cont'd)
Joint probabilities (their sum is 1):
- Red, Red: Pr[X_1 = 0, X_2 = 0] = (8/13)(7/12) = 14/39
- Red, White: Pr[X_1 = 0, X_2 = 1] = (8/13)(5/12) = 10/39
- White, Red: Pr[X_1 = 1, X_2 = 0] = (5/13)(8/12) = 10/39
- White, White: Pr[X_1 = 1, X_2 = 1] = (5/13)(4/12) = 5/39
Total probabilities for the first ball (their sum is 1):
- 1st Red: Pr[X_1 = 0] = Pr[X_1 = 0, X_2 = 0] + Pr[X_1 = 0, X_2 = 1] = 24/39
- 1st White: Pr[X_1 = 1] = Pr[X_1 = 1, X_2 = 0] + Pr[X_1 = 1, X_2 = 1] = 15/39

Q9 (cont'd)
Total probabilities for the second ball (their sum is 1):
- 2nd Red: Pr[X_2 = 0] = Pr[X_1 = 0, X_2 = 0] + Pr[X_1 = 1, X_2 = 0] = 24/39
- 2nd White: Pr[X_2 = 1] = Pr[X_1 = 0, X_2 = 1] + Pr[X_1 = 1, X_2 = 1] = 15/39
Pr[X_1 = 0 | X_2 = 1] = Pr[X_1 = 0, X_2 = 1] / Pr[X_2 = 1] = (10/39)/(15/39) = 2/3
Pr[X_1 = 0 | X_2 = 0] = Pr[X_1 = 0, X_2 = 0] / Pr[X_2 = 0] = (14/39)/(24/39) = 7/12
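The joint, total, and conditional probabilities can be reproduced exactly with Python's fractions; the enumeration below is my own illustration of the jar, not from the slides:

```python
from fractions import Fraction

# 8 red balls (coded 0) and 5 white balls (coded 1); draw two without replacement.
balls = [0] * 8 + [1] * 5

# Enumerate all ordered pairs of distinct balls; each has probability 1/(13*12).
joint = {}
for i, first in enumerate(balls):
    for j, second in enumerate(balls):
        if i != j:
            key = (first, second)
            joint[key] = joint.get(key, Fraction(0)) + Fraction(1, 13 * 12)

p_x2_white = joint[(0, 1)] + joint[(1, 1)]
p_red_given_white = joint[(0, 1)] / p_x2_white    # Pr[X_1 = 0 | X_2 = 1]
p_x2_red = joint[(0, 0)] + joint[(1, 0)]
p_red_given_red = joint[(0, 0)] / p_x2_red        # Pr[X_1 = 0 | X_2 = 0]
```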

Q10
Engine:
- If the engine is faulty, the probability of a flash is 0.99.
- If the engine is OK, the probability of a flash is 0.01.
- The engine is faulty only one day out of 100.
Given a flash, what is the probability that the engine is faulty?
E, the engine state: E = 1 OK, E = 0 faulty. F, the flash state: F = 1 flash, F = 0 no flash.
Pr[F = 1 | E = 0] = 0.99; Pr[F = 1 | E = 1] = 0.01
Pr[E = 0] = 0.01; Pr[E = 1] = 0.99
What is Pr[E = 0 | F = 1]?

Q10 (cont'd)
By the definition of conditional probability:
Pr[E = 0, F = 1] = Pr[E = 0 | F = 1]·Pr[F = 1]
Pr[E = 0, F = 1] = Pr[F = 1 | E = 0]·Pr[E = 0]
→ Pr[E = 0 | F = 1]·Pr[F = 1] = Pr[F = 1 | E = 0]·Pr[E = 0]
Bayes' Formula: Pr[E = 0 | F = 1] = Pr[F = 1 | E = 0]·Pr[E = 0] / Pr[F = 1]
Numerator = 0.99·0.01. Denominator: what is Pr[F = 1]?

Q10 (cont'd)
The probability of a flash consists of the probability of a flash when the engine is OK (E = 1) plus the probability of a flash when the engine is faulty (E = 0).
The Complete Probability Rule:
Pr[F = 1] = Pr[F = 1 | E = 0]·Pr[E = 0] + Pr[F = 1 | E = 1]·Pr[E = 1]
= 0.99·0.01 + 0.01·0.99 = 2·0.99·0.01
Pr[E = 0 | F = 1] = 0.99·0.01 / (2·0.99·0.01) = 0.5
If there is a flash, there is only a 50% chance that the engine is faulty!
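A two-line Bayes computation reproducing the result (the variable names are mine):

```python
# Engine example: compute Pr[E = 0 | F = 1] with Bayes' formula.
p_fault = 0.01               # Pr[E = 0]: engine faulty one day out of 100
p_flash_given_fault = 0.99   # Pr[F = 1 | E = 0]
p_flash_given_ok = 0.01      # Pr[F = 1 | E = 1]

# Complete Probability Rule for the denominator Pr[F = 1]
p_flash = p_flash_given_fault * p_fault + p_flash_given_ok * (1 - p_fault)
p_fault_given_flash = p_flash_given_fault * p_fault / p_flash
```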

Q11
1% of the population has a sickness.
If a person is sick, the probability that the test is positive is 0.99.
If a person is not sick, the probability that the test is positive is 0.01.
Given that the test is positive, what is the probability that the person is sick?
Bayes: Pr[sick | Test+] = Pr[Test+ | sick]·Pr[sick] / Pr[Test+]
Complete Probability formula:
Pr[Test+] = Pr[Test+ | sick]·Pr[sick] + Pr[Test+ | not sick]·Pr[not sick]
Pr[sick] = 0.01; Pr[not sick] = 0.99; Pr[Test+ | sick] = 0.99; Pr[Test+ | not sick] = 0.01
Pr[sick | Test+] = 0.99·0.01 / (0.99·0.01 + 0.01·0.99) = 0.5

Q12
Three payphones: 1st, 2nd, 3rd. One phone is Bad (never works), another is Reliable (always works), and the third is Unstable (works with probability 0.5). We do not know which is which.
3 phone tests: the 1st didn't work, the 2nd worked, the 2nd worked again.
What is the probability that the 2nd is the Reliable phone?
Event A: the 1st didn't work, the 2nd worked, the 2nd worked again.
Use Bayes' Theorem to find Pr[2nd Reliable | A].

Q12 (cont'd)
Bayes' Theorem:
Pr[2nd Reliable | A] = Pr[A | 2nd Reliable]·Pr[2nd Reliable] / Pr[A]
Complete Probability Rule:
Pr[A] = Pr[A | 1st Reliable]·Pr[1st Reliable]
+ Pr[A | 2nd Reliable]·Pr[2nd Reliable]
+ Pr[A | 3rd Reliable]·Pr[3rd Reliable]
Pr[1st Reliable] = Pr[2nd Reliable] = Pr[3rd Reliable] = 1/3

Q12 (cont'd)
If the 2nd is the Reliable phone, it always works, so A happens when the 1st doesn't work:
- Either the 1st is Bad (probability 1/2), and then it definitely wouldn't work,
- OR the 1st is Unstable (probability 1/2), and then it fails with probability 0.5.
Pr[A | 2nd Reliable] = Pr[1st didn't work | 1st Bad]·Pr[1st Bad | 2nd Reliable] + Pr[1st didn't work | 1st Unstable]·Pr[1st Unstable | 2nd Reliable]
= 1·0.5 + 0.5·0.5 = 0.75

Q12 (cont'd)
If the 1st is Reliable, A can never occur (the 1st always works): Pr[A | 1st Reliable] = 0.
Calculate Pr[A | 3rd Reliable] by using the chain rule over all states of the 2nd:
Pr[A | 3rd Reliable] = Pr[A | 2nd Unstable]·Pr[2nd Unstable | 3rd Reliable] + Pr[A | 2nd Bad]·Pr[2nd Bad | 3rd Reliable]
If the 3rd is Reliable, the 2nd can be Unstable or Bad, each with probability 1/2.
If the 2nd is Unstable, event A occurs with probability 1/4 (the 2nd works twice: 0.5·0.5; the 1st must be Bad, so it certainly didn't work).
If the 2nd is Bad, event A does not occur.
Pr[A | 3rd Reliable] = 0.25·0.5 + 0·0.5 = 1/8

Q12 (cont'd)
To sum up:
Pr[A] = (1/3)(0 + 3/4 + 1/8)
Pr[2nd Reliable] = 1/3
Pr[A | 2nd Reliable] = 3/4
Use Bayes' Theorem:
Pr[2nd Reliable | A] = Pr[A | 2nd Reliable]·Pr[2nd Reliable] / Pr[A]
= (3/4)(1/3) / [(1/3)(0 + 3/4 + 1/8)] = (3/4)/(7/8) = 6/7
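The whole payphone calculation can be reproduced exactly by enumerating the 6 equally likely assignments of types to phones (my own illustration, not from the slides):

```python
from fractions import Fraction
from itertools import permutations

# Probability that a phone works, by type (matching the slide's setup).
p_work = {"Bad": Fraction(0), "Reliable": Fraction(1), "Unstable": Fraction(1, 2)}

pr_A = Fraction(0)               # total probability of event A
pr_A_and_2nd_rel = Fraction(0)   # Pr[A and "2nd is Reliable"]
for order in permutations(["Bad", "Reliable", "Unstable"]):
    prior = Fraction(1, 6)       # each assignment of types is equally likely
    # Event A: 1st didn't work, 2nd worked, 2nd worked again
    p_A = (1 - p_work[order[0]]) * p_work[order[1]] ** 2
    pr_A += prior * p_A
    if order[1] == "Reliable":
        pr_A_and_2nd_rel += prior * p_A

posterior = pr_A_and_2nd_rel / pr_A   # Pr[2nd Reliable | A]
```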

Bernoulli Random Variable
Bernoulli experiment: the outcomes are Success or Failure, described by a Bernoulli variable X. What values?
X gets one of two values: 1 (Success), 0 (Failure).
Example: a packet arrives / doesn't arrive at a router.
Bernoulli PMF: Pr[X = 1] ≡ a, Pr[X = 0] ≡ b = 1 - a
Bernoulli CDF: F_X(x) ≡ Pr[X ≤ x] = 0 for x < 0, b for 0 ≤ x < 1, 1 for x ≥ 1
Mean and variance:
μ = 1·a + 0·b = a
σ² = (1 - μ)²·a + (0 - μ)²·b = b²a + a²b = ab

A Series of Bernoulli Random Variables
Packets arriving at a router: a series of Bernoulli experiments, described by a series of Bernoulli variables:
X_j = 1 (arrival), X_j = 0 (no arrival) at time step j.
Assume all the X_j have the same distribution: Pr[X_j = 1] ≡ a; Pr[X_j = 0] ≡ b = 1 - a.
Other examples: rain or no rain on a certain day; a packet with/without an error in a stream of arriving packets.

First Success Random Variable: the Geometric Distribution
Packets arrive at a router. Random variable T: the number of time units up to (and including) the first arrival.
What is the event T = n? n - 1 failures (no arrivals), then a success (an arrival):
X_j = 0 for j = 1, 2, ..., n-1, and X_n = 1
→ Pr[T = n] = a·b^(n-1) = a·(1 - a)^(n-1), n ≠ 0 (why?)
μ = Σ_n n·Pr[T = n] = Σ_n n·a·(1 - a)^(n-1) = 1/a (what is the meaning of μ?)
σ² = Σ_n (n - μ)²·Pr[T = n] = Σ_n (n - μ)²·a·(1 - a)^(n-1) = (1 - a)/a²
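A numeric check of the geometric mean and variance formulas, truncating the infinite sums (a = 0.2 is an assumed value):

```python
# Geometric distribution Pr[T = n] = a * (1 - a)^(n - 1), n = 1, 2, ...
a = 0.2
b = 1 - a

ns = range(1, 2001)                        # truncation: the tail is negligible
pmf = [a * b ** (n - 1) for n in ns]
total = sum(pmf)
mean = sum(n * p for n, p in zip(ns, pmf))
var = sum((n - mean) ** 2 * p for n, p in zip(ns, pmf))
```

The sums converge to 1, μ = 1/a = 5 (the average wait for the first success), and σ² = (1 - a)/a² = 20.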

Counter Random Variable: the Binomial PMF
Same N Bernoulli variables: X_j = 1 (success), X_j = 0 (failure); Pr[X_j = 1] ≡ a, Pr[X_j = 0] ≡ b = 1 - a.
Bernoulli Success Counter K: counts the number of successes, e.g. the number of packets arriving during N time units.
Pr[K = k] = [N!/(k!(N-k)!)]·a^k·b^(N-k), 0 ≤ k ≤ N
μ = Na, σ² = Nab (what is the meaning of μ?)
Poisson Approximation: N → ∞ with finite μ = Na:
Pr[K = k] ≈ [(Na)^k/k!]·e^(-Na) = (μ^k/k!)·e^(-μ)
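A sketch comparing the Binomial PMF with its Poisson approximation for large N and small a (N = 500 and a = 0.01 are assumed values):

```python
from math import comb, exp, factorial

N, a = 500, 0.01     # many trials, small success probability
mu = N * a           # μ = Na = 5

# Binomial and Poisson probabilities for k = 0..20 (the rest is negligible).
binom = [comb(N, k) * a ** k * (1 - a) ** (N - k) for k in range(21)]
poisson = [mu ** k * exp(-mu) / factorial(k) for k in range(21)]
max_diff = max(abs(b - p) for b, p in zip(binom, poisson))
```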

Usage Example
A file is downloaded from a remote site in 500 packets. Assume 1 percent of the packets received through the channel are in error. What is the probability that exactly 5 received packets are in error? Which PMF should be used?
These are 500 Bernoulli experiments with success probability a = 0.99 (a packet received correctly). We need the probability of 495 successes and 5 failures:
Pr[K = 495] = [500!/(495!·5!)]·(0.99)^495·(0.01)^5 ≈ 0.176
How many packets, on average, arrive with an error? N·(1 - a) = 500·0.01 = 5.
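The numbers in this example can be checked directly, and compared with the Poisson approximation using μ = 5:

```python
from math import comb, exp, factorial

# Probability that exactly 5 of 500 packets are in error (error prob 0.01).
N, p_err, k_err = 500, 0.01, 5
exact = comb(N, k_err) * p_err ** k_err * (1 - p_err) ** (N - k_err)

# Poisson approximation with μ = N·p_err = 5
mu = N * p_err
approx = mu ** k_err * exp(-mu) / factorial(k_err)
```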