4. Overview of Probability
Network Performance and Quality of Service
Motivation
Provide a brief review of topics that will help us:
Statistically characterize network traffic flow
Model and estimate performance parameters
Set the stage for the discussion of traffic management and routing later in the course
This is NOT a condensed class in probability theory.
Definitions of Probability Theory
Probability is concerned with the assignment of numbers to events. The probability Pr[A] of an event A is a number between 0 and 1 that corresponds to the likelihood that the event A will occur.
There are a number of definitions of probability; we will discuss only three:
1. Classical Definition
2. Relative Frequency Definition
3. Axiomatic Definition
Classical Definition
If a random experiment (a process with an uncertain outcome) can result in N mutually exclusive and equally likely outcomes, and if N_A is the number of outcomes in which event A occurs, then the probability of A is
Pr[A] = N_A / N
Classical Definition
Example: If we roll a die, there are 6 equally likely outcomes, i.e. N = 6. Three of the outcomes correspond to the event [even], so in this case Pr[even] = 3/6 = 0.5.
Example: If we roll two dice, there are 36 equally likely outcomes (6 x 6). The probability that the sum is 7 is 6/36 = 1/6.
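The two-dice count can be checked by brute-force enumeration. A minimal Python sketch; the variable names are ours, not part of the slides:

```python
from itertools import product

# All 36 equally likely outcomes of rolling two dice
outcomes = list(product(range(1, 7), repeat=2))

# Classical definition: favourable outcomes / total outcomes
favourable = [o for o in outcomes if sum(o) == 7]
print(len(favourable), "/", len(outcomes), "=", len(favourable) / len(outcomes))  # 6 / 36 = 1/6
```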
Classical Definition
What if N is not finite? In that case, the Classical definition is not applicable.
What if the outcomes are not equally likely? Again, the Classical definition of probability is not applicable.
In such cases, how might we define the probability of an event A?
Relative Frequency Definition
If a random experiment is repeated a large number of times, say n times, under identical conditions, and if an event A is observed to occur n_A times, then the probability of A is
Pr[A] = lim (n → ∞) n_A / n
The foundation of this approach is that there is some true Pr[A]. We cannot deduce it, as in Classical probability, but we can estimate it.
Relative Frequency Definition
Example: One tosses a coin, which might or might not be fair, 100 times and observes heads on 52 of the tosses. One's estimate of the probability of a head is
Pr[head] ≈ 52/100 = 0.52
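The relative frequency estimate can be illustrated with a small simulation; a sketch, assuming a fair coin modelled with Python's random module:

```python
import random

random.seed(1)                     # fixed seed so the run is repeatable
n = 100_000                        # number of coin tosses
n_heads = sum(random.random() < 0.5 for _ in range(n))
print("Pr[head] ~=", n_heads / n)  # relative frequency; approaches 0.5 as n grows
```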
Axiomatic Definition
The axiomatic approach builds up probability theory from a number of assumptions (axioms). From these axioms, laws of probability are derived that can be used for calculations.
Common axioms:
1. 0 ≤ Pr[A] ≤ 1 for each event A
2. Pr[S] = 1, where S is the entire sample space (the certain event)
3. Pr[A ∪ B] = Pr[A] + Pr[B] if A and B are mutually exclusive
Axiomatic Definition
Important laws:
1. Pr[not A] = 1 - Pr[A]
2. Pr[A ∩ B] = 0 if A and B are mutually exclusive
3. Pr[A ∪ B] = Pr[A] + Pr[B] - Pr[A ∩ B]
4. Pr[A ∪ B ∪ C] = Pr[A] + Pr[B] + Pr[C] - Pr[A ∩ B] - Pr[A ∩ C] - Pr[B ∩ C] + Pr[A ∩ B ∩ C]
Axiomatic Definition
Example: If we roll a die and assume that each of the 6 outcomes is equally likely, the probability of each will be ⅙.
Pr[even] = Pr[2] + Pr[4] + Pr[6] = ½
Pr[less than 3] = Pr[1] + Pr[2] = ⅓
Pr[{even} ∪ {less than 3}] = Pr[even] + Pr[less than 3] - Pr[{even} ∩ {less than 3}] = Pr[even] + Pr[less than 3] - Pr[2] = ½ + ⅓ - ⅙ = ⅔
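The inclusion-exclusion law above can be verified on this die example with a few lines of Python; the set names A and B are just illustrative:

```python
# Verify Pr[A ∪ B] = Pr[A] + Pr[B] - Pr[A ∩ B] on a single die roll
die = set(range(1, 7))
A = {x for x in die if x % 2 == 0}        # event "even"
B = {x for x in die if x < 3}             # event "less than 3"

def p(event):                             # equally likely outcomes
    return len(event) / len(die)

print(p(A | B), "==", p(A) + p(B) - p(A & B))   # 0.666... == 0.666...
```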
Conditional Probability
The conditional probability of an event A, given that event B has occurred, is:
Pr[A|B] = Pr[A ∩ B] / Pr[B], where Pr[A ∩ B] denotes Pr[A and B]
A and B are independent events if Pr[A ∩ B] = Pr[A]·Pr[B]
Equivalently, Pr[A ∩ B] = Pr[B]·Pr[A|B]
Conditional Probability
Example: What is the probability of getting a sum of 8 on the roll of two dice if we know that the face of at least one die is an even number?
Let A = [sum of 8] and B = [at least 1 die even]. Then
Pr[A|B] = Pr[A ∩ B] / Pr[B] = (1/12) / (3/4) = 1/9
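The same answer can be obtained by enumerating the 36 outcomes; a quick sketch, with event names chosen to match the example:

```python
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))
B = [o for o in outcomes if any(d % 2 == 0 for d in o)]   # at least one die even (27 outcomes)
A_and_B = [o for o in B if sum(o) == 8]                   # sum of 8 within B (3 outcomes)
print(len(A_and_B) / len(B))                              # 3/27 = 1/9, matching (1/12)/(3/4)
```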
Total Probability
Given a set of mutually exclusive events E_1, E_2, …, E_n covering all possible outcomes, and given an arbitrary event A, then:
Pr[A] = Σ (i = 1 to n) Pr[A|E_i]·Pr[E_i]
Bayes’s Theorem
“Posterior odds”: the probability that an event really occurred, given evidence in favor of it:
Pr[E_i|A] = Pr[A|E_i]·Pr[E_i] / Pr[A] = Pr[A|E_i]·Pr[E_i] / Σ (j = 1 to n) Pr[A|E_j]·Pr[E_j]
Bayes’s Theorem Example
A hit-and-run accident involves a taxi.
85% of taxis are yellow, 15% are black.
An eyewitness reported that the taxi involved in the accident was black.
Data show that eyewitnesses are correct about car color 80% of the time.
What is the probability that the cab was black? Let WB = [witness reported black]. Then
Pr[Black|WB] = Pr[WB|Black]·Pr[Black] / (Pr[WB|Black]·Pr[Black] + Pr[WB|Yellow]·Pr[Yellow]) = (0.8)(0.15) / ((0.8)(0.15) + (0.2)(0.85)) ≈ 0.41
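A short numeric check of this calculation; the variable names are our own shorthand for the events in the example:

```python
# Taxi example: WB = "witness reported black"
p_black, p_yellow = 0.15, 0.85
p_wb_given_black  = 0.8      # witness correct when the cab is black
p_wb_given_yellow = 0.2      # witness wrong when the cab is yellow

p_wb = p_wb_given_black * p_black + p_wb_given_yellow * p_yellow   # total probability
print(p_wb_given_black * p_black / p_wb)                           # ~0.414
```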
Bayes’s Theorem Example
A sender S transmits bits to a receiver R over a network that injects errors (flips bits).
Assume Pr[S1] = p = 0.5 and Pr[S0] = 1 - p = 0.5; assume also Pr[R1] = Pr[R0] = 0.5.
Given error injection such that Pr[R0|S1] = p_a and Pr[R1|S0] = p_b, then:
Pr[S1|R0] = Pr[R0|S1]·Pr[S1] / (Pr[R0|S1]·Pr[S1] + Pr[R0|S0]·Pr[S0]) = p_a·p / (p_a·p + (1 - p_b)(1 - p))
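A sketch of the same formula in Python; the numeric values chosen for p_a and p_b are illustrative, not from the slides:

```python
# Pr[S1 | R0] for the noisy-channel example; p_a and p_b are illustrative values
p   = 0.5     # Pr[S1]; Pr[S0] = 1 - p
p_a = 0.1     # Pr[R0 | S1]: a transmitted 1 is received as 0
p_b = 0.1     # Pr[R1 | S0]: a transmitted 0 is received as 1

p_s1_given_r0 = (p_a * p) / (p_a * p + (1 - p_b) * (1 - p))
print(p_s1_given_r0)   # 0.1 for these symmetric values
```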
Random Variables
A random variable is a variable whose possible values are numerical outcomes of a random phenomenon.
As opposed to other variables, a random variable conceptually does not have a single, fixed value; rather, it can take on a set of possible different values (each with an associated probability).
There are two types of random variables: discrete and continuous.
Random Variables
Examples:
1. Select a soccer player; X = the number of goals the player has scored during the season. The values of X are 0, 1, 2, 3, …
2. Survey a group of 10 soccer players; Y = the average number of goals scored by the players during the season. The values of Y are 0, 0.1, 0.2, …, 1.0, 1.1, …
Random Variables
A discrete random variable can take on only specific, isolated numerical values, e.g. the number of packets dropped during transmission.
A continuous random variable is one that can take on an infinite number of possible values, e.g. the delay experienced by packets during transmission.
Random Variables
Discrete random variables are described by a probability function: P_X(k) = Pr[X = k]
Continuous random variables can be described by either a distribution function or a density function.
Random variable characteristics:
Mean value: E[X]
Second moment: E[X^2]
Variance: Var[X] = E[X^2] - (E[X])^2
Standard deviation: σ_X = (Var[X])^(1/2)
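As a concrete illustration of these characteristics, a short Python sketch computing them for a fair die (our own example, not from the slides):

```python
# Mean, second moment, variance, and standard deviation of a fair die
values = range(1, 7)
P = {k: 1 / 6 for k in values}           # probability function P_X(k)

EX  = sum(k * P[k] for k in values)      # E[X]    = 3.5
EX2 = sum(k ** 2 * P[k] for k in values) # E[X^2]  ~ 15.17
var = EX2 - EX ** 2                      # Var[X]  ~ 2.92
std = var ** 0.5                         # sigma_X ~ 1.71
print(EX, EX2, var, std)
```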
Cumulative Distribution Function
The Cumulative Distribution Function (CDF) of a random variable maps a given value a to the probability of the variable taking a value less than or equal to a:
F_X(a) = Pr[X ≤ a]
Probability Density Function
The derivative of the CDF F(x), f(x) = dF(x)/dx, is called the probability density function (pdf) of X.
Given a pdf f(x), the probability of X being in the interval (x_1, x_2] can also be computed by integration:
Pr[x_1 < X ≤ x_2] = ∫ (x_1 to x_2) f(x) dx = F(x_2) - F(x_1)
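A small numeric sanity check of this CDF/pdf relation, using the exponential distribution introduced a few slides below as a concrete f and F (λ = 1 and the interval endpoints are arbitrary choices):

```python
import math

# Check Pr[x1 < X <= x2] = F(x2) - F(x1) = integral of f over (x1, x2]
lam = 1.0
F = lambda x: 1 - math.exp(-lam * x)     # CDF of the exponential distribution
f = lambda x: lam * math.exp(-lam * x)   # pdf = dF/dx

x1, x2, n = 0.5, 2.0, 100_000
dx = (x2 - x1) / n
integral = sum(f(x1 + (i + 0.5) * dx) for i in range(n)) * dx   # midpoint rule
print(F(x2) - F(x1), "~=", integral)     # both ~0.471
```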
Mean and Variance
Mean or expected value: E[X] = Σ_k k·P_X(k) (discrete) or E[X] = ∫ x·f(x) dx (continuous)
Variance: Var[X] = E[(X - E[X])^2] = E[X^2] - (E[X])^2
Probability Distributions
Exponential Distribution: F(x) = Pr[X ≤ x] = 1 - e^(-λx), x ≥ 0
Exponential Density: f(x) = dF(x)/dx = λe^(-λx)
E[X] = σ_X = 1/λ
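The claim E[X] = 1/λ can be checked empirically; a sketch using Python's random.expovariate (λ = 2 is an arbitrary choice):

```python
import random

# Empirical check that E[X] = 1/lambda for the exponential distribution
random.seed(7)
lam = 2.0
samples = [random.expovariate(lam) for _ in range(100_000)]
print(sum(samples) / len(samples), "~=", 1 / lam)   # sample mean ~ 0.5
```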
Probability Distributions
Poisson Distribution: Pr[X = k] = (λ^k / k!)·e^(-λ), k = 0, 1, 2, …; E[X] = Var[X] = λ
Normal Density: f(x) = (1 / (σ·√(2π)))·e^(-(x - μ)^2 / (2σ^2)); E[X] = μ, Var[X] = σ^2
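For a quick sanity check, the Poisson pmf and normal density above can be evaluated directly; the values of λ, μ and σ below are arbitrary illustrative choices:

```python
import math

# Evaluate the Poisson pmf and the normal density from the formulas above
lam, mu, sigma = 3.0, 0.0, 1.0            # arbitrary illustrative parameters

poisson_pmf = lambda k: (lam ** k / math.factorial(k)) * math.exp(-lam)
normal_pdf  = lambda x: math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

print(sum(poisson_pmf(k) for k in range(50)))       # pmf sums to ~1
print(sum(k * poisson_pmf(k) for k in range(50)))   # E[X] ~ lambda = 3
print(normal_pdf(0.0))                              # peak of the standard normal, ~0.3989
```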
Probability Distributions – Relevance to Networks
Service times of queues (t_trans) in packet-switching routers can be effectively modeled as exponential.
The arrival pattern of packets at a router is often Poisson in nature and, correspondingly, the inter-arrival interval is exponential (why?).
Central Limit Theorem: the distribution of the sum of a very large number of independent RVs is approximately normal, regardless of the individual distributions.
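One way to see the Poisson/exponential connection is to generate arrivals with exponential gaps and count how many land in each unit interval; a sketch with an assumed arrival rate of 5 packets per second:

```python
import random

# Exponential inter-arrival times generate a Poisson arrival process:
# count arrivals per 1-second window and compare mean and variance with lam.
random.seed(3)
lam, horizon = 5.0, 10_000          # arrival rate (packets/s), simulated seconds

t, counts = 0.0, [0] * horizon
while True:
    t += random.expovariate(lam)    # exponential gap to the next arrival
    if t >= horizon:
        break
    counts[int(t)] += 1

mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
print(mean, var)                    # both ~ lam = 5, as expected for a Poisson count
```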