TDC 369 / TDC 432, April 2, 2003, Greg Brewster

Topics: Math Review; Probability – Distributions, Random Variables, Expected Values

Math Review: simple integrals and differentials, sums, permutations, combinations, probability

Math Review: Sums

Math Review: Permutations. Given N objects, there are N! = N(N-1)···1 different ways to arrange them. Example: given 3 balls, colored Red, White and Blue, there are 3! = 6 ways to order them: RWB, RBW, BWR, BRW, WBR, WRB.

Math Review: Combinations. The number of ways to select K unique objects from a set of N objects without replacement is C(N,K) = N! / (K! (N-K)!). Example: given 3 balls, R, B and W, there are C(3,2) = 3! / (2! 1!) = 3 ways to uniquely choose 2 balls: RB, RW, BW.
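Not part of the original slides: a small Python sketch of the combination count, checking the formula against both the standard library and direct enumeration of the 3-ball example.

```python
# Sketch: C(N, K) = N! / (K! (N-K)!), checked against enumeration.
from math import comb, factorial
from itertools import combinations

def n_choose_k(n, k):
    # Direct formula from the slide
    return factorial(n) // (factorial(k) * factorial(n - k))

# The 3-ball example: choose 2 of {R, B, W}
pairs = list(combinations(["R", "B", "W"], 2))
print(pairs)             # [('R', 'B'), ('R', 'W'), ('B', 'W')]
print(n_choose_k(3, 2))  # 3, and math.comb agrees
```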

Probability. Probability theory is concerned with the likelihood of observable outcomes ("events") of some experiment. Let Ω be the set of all outcomes and let E ⊆ Ω be some event in Ω; then the probability of E occurring, Pr[E], is the fraction of times E will occur if the experiment is repeated infinitely often.

Probability Example: –Experiment = tossing a 6-sided die –Observable outcomes = {1, 2, 3, 4, 5, 6} –For a fair die, Pr{die = 1} = Pr{die = 2} = Pr{die = 3} = Pr{die = 4} = Pr{die = 5} = Pr{die = 6} = 1/6

Probability Pie [figure omitted from transcript]

Valid Probability Measure. A probability measure, Pr, on an event space {E_i} must satisfy the following: –For all E_i, 0 <= Pr[E_i] <= 1 –Each pair of events, E_i and E_k (i ≠ k), is mutually exclusive, that is, Pr[E_i and E_k] = 0 –All event probabilities sum to 1, that is, Σ_i Pr[E_i] = 1

Probability Mass Function [figure: Pr(Die = x)]

Mass Function = Histogram. If you start with some repeatable events, then the probability mass function is like a histogram of outcomes for those events. The difference is that a histogram shows how many times each event happened (out of some total number of attempts), while a mass function shows the fraction of the time each event happens (number of times / total attempts).
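Not part of the original slides: a quick Python sketch of this histogram-to-mass-function relationship, using the deck's 1200-roll die example (the specific seed is arbitrary).

```python
# Sketch: histogram of die rolls vs. the empirical mass function (counts / total).
import random
from collections import Counter

random.seed(1)
rolls = [random.randint(1, 6) for _ in range(1200)]

histogram = Counter(rolls)                                   # counts per face
mass = {x: histogram[x] / len(rolls) for x in range(1, 7)}   # fractions of total

print(sum(histogram.values()))                # 1200 attempts in total
print(abs(sum(mass.values()) - 1) < 1e-9)     # the fractions sum to 1 -> True
```

Each `mass[x]` hovers near the true Pr(Die = x) = 1/6, and gets closer as the number of attempts grows.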

Dice Roll Histogram [figure: number of times Die = x, 1200 attempts]

Probability Distribution Function (Cumulative Distribution Function) [figure: Pr(Die <= x)]

Combining Events. Probability of an event not happening: –Pr[not E] = 1 - Pr[E] Probability of both E and F happening: –Pr[E and F] = Pr[E] · Pr[F], IF events E and F are independent Probability of either E or F happening: –Pr[E or F] = Pr[E] + Pr[F] - Pr[E and F]

Conditional Probabilities. The conditional probability that E occurs, given that F occurs, written Pr[E | F], is defined as Pr[E | F] = Pr[E and F] / Pr[F].

Conditional Probabilities Example: the conditional probability that the value of a die is 6, given that the value is greater than 3, is Pr[die=6 | die>3] = Pr[die=6 and die>3] / Pr[die>3] = (1/6) / (1/2) = 1/3.
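Not part of the original slides: this conditional-probability example can be verified exactly by enumerating the six equally likely faces, using exact rational arithmetic.

```python
# Sketch: Pr[die=6 | die>3] = Pr[die=6 and die>3] / Pr[die>3], by enumeration.
from fractions import Fraction

outcomes = range(1, 7)  # six equally likely faces
pr_gt3 = Fraction(sum(1 for d in outcomes if d > 3), 6)                  # 1/2
pr_6_and_gt3 = Fraction(sum(1 for d in outcomes if d == 6 and d > 3), 6) # 1/6

print(pr_6_and_gt3 / pr_gt3)  # 1/3
```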

Probability Pie [figure omitted from transcript]

Conditional Probability Pie [figure omitted from transcript]

Independence Two events E and F are independent if the probability of E conditioned on F is equal to the unconditional probability of E. That is, Pr[E | F] = Pr[E]. In other words, the occurrence of F has no effect on the occurrence of E.

Random Variables A random variable, R, represents the outcome of some random event. Example: R = the roll of a die. The probability distribution of a random variable, Pr[R], is a probability measure mapping each possible value of R into its associated probability.

Sum of Two Dice Example: –R = the sum of the values of 2 dice. Probability distribution, due to independence of the two dice: –Pr[R = k] = Σ_{i+j=k} Pr[die1 = i] · Pr[die2 = j] = (number of pairs (i, j) with i + j = k) / 36
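Not part of the original slides: the full distribution of R can be built in a few lines by enumerating all 36 equally likely ordered pairs.

```python
# Sketch: distribution of R = sum of two fair dice, from the 36 ordered pairs.
from fractions import Fraction
from itertools import product
from collections import Counter

counts = Counter(i + j for i, j in product(range(1, 7), repeat=2))
dist = {r: Fraction(c, 36) for r, c in sorted(counts.items())}

print(dist[7])               # 1/6 -- six of the 36 pairs sum to 7
print(sum(dist.values()))    # 1
```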

Sum of Two Dice [table: the 36 equally likely ordered outcomes (i, j) and their sums]

Probability Mass Function: R = Sum of 2 dice [figure: Pr(R = x)]

Continuous Random Variables. So far, we have only considered discrete random variables, which can take on a countable number of distinct values. Continuous random variables can take on any real value over some (possibly infinite) range. –Example: R = inter-packet arrival time at a router.

Continuous Density Functions There is no probability mass function for a continuous random variable, since, typically, Pr[R = x] = 0 for any fixed value of x because there are infinitely many possible values for R. Instead, we can generate density functions by starting with histograms split into small intervals and smoothing them (letting interval size go to zero).

Example: Bus Waiting Time. I arrive at a bus stop at a random time. I know that buses arrive exactly once every 10 minutes. How long do I have to wait? Answer: My waiting time is uniformly distributed between 0 and 10 minutes. That is, I am equally likely to wait for any time between 0 and 10 minutes.

Bus Wait Histogram [figure: waiting times from 2000 attempts, using 2-minute 'buckets']

Bus Wait Histogram [figure: waiting times from 2000 attempts, using 1-minute 'buckets']

Bus Waiting Time [figure: uniform density function]

Value for the Density Function. The histograms show the shape that the density function should have, but what are the values for the density function? Answer: the density function must be set so that it integrates to 1. Here, a constant height of 1/10 over the interval (0, 10) integrates to 1, so f(x) = 1/10 for 0 <= x <= 10 and 0 elsewhere.

Continuous Density Functions. To determine the probability that the random value lies in any interval (a, b), we integrate the density function on that interval: Pr[a <= R <= b] = ∫_a^b f(x) dx. So, the probability that you wait between 3 and 5 minutes for the bus is ∫_3^5 (1/10) dx = 2/10 = 20%.
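Not part of the original slides: this 20% figure can be checked numerically with a simple midpoint Riemann sum over the uniform(0, 10) density (closed-form integration is trivial here; the numerical version generalizes to densities without one).

```python
# Sketch: Pr[3 <= R <= 5] for the uniform(0, 10) bus wait, by numerical integration.
def f(x):
    return 0.1 if 0 <= x <= 10 else 0.0  # uniform density on (0, 10)

def integrate(f, a, b, n=100_000):
    # Midpoint Riemann sum approximation of the integral of f over [a, b]
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

p = integrate(f, 3, 5)
print(round(p, 6))  # 0.2
```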

Cumulative Distribution Function. For every probability density function, f_R(x), there is a corresponding cumulative distribution function, F_R(x), which gives the probability that the random value is less than or equal to a fixed value, x: F_R(x) = Pr[R <= x] = ∫_{-∞}^{x} f_R(t) dt.

Example: Bus Waiting Time. For the bus waiting time described earlier, the cumulative distribution function is F_R(x) = 0 for x < 0; F_R(x) = x/10 for 0 <= x <= 10; F_R(x) = 1 for x > 10.

Bus Waiting Time Cumulative Distribution Function [figure: Pr(R <= x)]

Cumulative Distribution Functions. The probability that the random value lies in any interval (a, b) can also easily be calculated using the cumulative distribution function: Pr[a < R <= b] = F_R(b) - F_R(a). So, the probability that you wait between 3 and 5 minutes for the bus is F_R(5) - F_R(3) = 5/10 - 3/10 = 20%.

Expectation The expected value of a random variable, E[R], is the mean value of that random variable. This may also be called the average value of the random variable.

Calculating E[R]. Discrete R.V.: E[R] = Σ_x x · Pr[R = x]. Continuous R.V.: E[R] = ∫ x · f_R(x) dx.

E[R] examples. Expected sum of 2 dice: E[R] = Σ_{k=2}^{12} k · Pr[R = k] = 7. Expected bus waiting time: E[R] = ∫_0^10 x · (1/10) dx = 5 minutes.
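Not part of the original slides: both expected values can be computed directly, the discrete one by exact enumeration and the continuous one from the closed-form integral.

```python
# Sketch: E[R] for the two examples -- sum of two dice (= 7), uniform(0,10) wait (= 5).
from fractions import Fraction
from itertools import product

# Discrete: E[R] = sum over all 36 equally likely pairs of (i + j) / 36
e_dice = Fraction(sum(i + j for i, j in product(range(1, 7), repeat=2)), 36)
print(e_dice)   # 7

# Continuous: E[R] = integral_0^10 x * (1/10) dx = (1/10) * (10^2 / 2) = 5
e_wait = (1 / 10) * (10**2 / 2)
print(e_wait)   # 5.0
```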

Moments. The n-th moment of R is defined to be the expected value of R^n. –Discrete: E[R^n] = Σ_x x^n · Pr[R = x] –Continuous: E[R^n] = ∫ x^n · f_R(x) dx

Standard Deviation. The standard deviation of R, σ(R), can be defined using the 2nd moment of R: σ(R) = sqrt(E[R^2] - (E[R])^2).

Coefficient of Variation. The coefficient of variation, CV(R) = σ(R) / E[R], is a common measure of the variability of R which is normalized by the mean value of R.
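Not part of the original slides: a sketch computing σ(R) and CV(R) for a fair die from its first two moments, following the definitions above.

```python
# Sketch: sigma(R) and CV(R) for a fair die, from E[R] and E[R^2].
from math import sqrt

faces = range(1, 7)
m1 = sum(x / 6 for x in faces)        # E[R]   = 3.5
m2 = sum(x * x / 6 for x in faces)    # E[R^2] = 91/6

sigma = sqrt(m2 - m1 ** 2)            # sqrt(E[R^2] - (E[R])^2)
cv = sigma / m1

print(round(m1, 4), round(sigma, 4))  # 3.5 1.7078
```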

Coefficient of Variation The coefficient of variation for the exponential random variable is always equal to 1. Random variables with CV greater than 1 are sometimes called hyperexponential variables. Random variables with CV less than 1 are sometimes called hypoexponential variables.

Common Discrete R.V.s: Bernoulli random variable. A Bernoulli random variable w/ parameter p reflects a 2-valued experiment with results of success (R = 1) w/ probability p and failure (R = 0) w/ probability 1 - p.

Common Discrete R.V.s: Geometric random variable. A Geometric random variable reflects the number of Bernoulli trials required up to and including the first success: Pr[R = k] = (1 - p)^(k-1) · p for k = 1, 2, …, with E[R] = 1/p.
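Not part of the original slides: the deck's own example (rolls until the first 6, so p = 1/6 and E[R] = 6) can be checked by simulation; the seed and trial count are arbitrary.

```python
# Sketch: geometric pmf Pr[R=k] = (1-p)^(k-1) * p, and a simulation of
# "die rolls until the first 6" (p = 1/6, E[R] = 1/p = 6).
import random

p = 1 / 6
def pmf(k):
    return (1 - p) ** (k - 1) * p

random.seed(7)
def rolls_until_six():
    k = 0
    while True:
        k += 1
        if random.randint(1, 6) == 6:
            return k

trials = 100_000
mean = sum(rolls_until_six() for _ in range(trials)) / trials
print(abs(mean - 6) < 0.1)  # True: sample mean is close to E[R] = 6
```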

Geometric Mass Function [figure: Pr(R = x), R = number of die rolls until a 6 is rolled]

Geometric Cumulative Function [figure: Pr(R <= x), R = number of die rolls until a 6 is rolled]

Common Discrete R.V.s: Binomial random variable. A Binomial random variable w/ parameters (n, p) is the number of successes found in a sequence of n Bernoulli trials w/ parameter p: Pr[R = k] = C(n, k) · p^k · (1 - p)^(n-k) for k = 0, 1, …, n, with E[R] = n·p.
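Not part of the original slides: a sketch of the binomial pmf applied to the deck's example (number of 6's in 12 die rolls), verifying that the pmf sums to 1.

```python
# Sketch: binomial pmf Pr[R=k] = C(n,k) p^k (1-p)^(n-k); 6's in 12 die rolls.
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 12, 1 / 6
total = sum(binom_pmf(k, n, p) for k in range(n + 1))
print(abs(total - 1) < 1e-12)          # True: the pmf sums to 1
print(round(binom_pmf(2, n, p), 4))    # 0.2961 -- probability of exactly two 6's
```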

Binomial Mass Function [figure: Pr(R = x), R = number of 6's rolled in 12 die rolls]

Common Discrete R.V.s: Uniform random variable. A Uniform random variable w/ parameter set {x_1, …, x_N} is one which picks each x_i value with equal probability: Pr[R = x_i] = 1/N.

Common Discrete R.V.s: Poisson random variable. A Poisson random variable w/ parameter λ models the number of arrivals during 1 time unit for a random system whose mean arrival rate is λ arrivals per time unit: Pr[R = k] = e^(-λ) · λ^k / k! for k = 0, 1, 2, …, with E[R] = λ.
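Not part of the original slides: a sketch of the Poisson pmf with the deck's rate of 4 arrivals per second, confirming the mean equals λ.

```python
# Sketch: Poisson pmf Pr[R=k] = e^(-lam) * lam^k / k!, with lam = 4 arrivals/sec.
from math import exp, factorial

lam = 4
def pmf(k):
    return exp(-lam) * lam**k / factorial(k)

# Truncated sums; the tail beyond k = 100 is negligible for lam = 4.
total = sum(pmf(k) for k in range(100))
mean = sum(k * pmf(k) for k in range(100))
print(abs(total - 1) < 1e-9)    # True: pmf sums to 1
print(abs(mean - lam) < 1e-9)   # True: E[R] = lam
```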

Poisson Mass Function [figure: Pr(R = x), number of arrivals per second given an average of λ = 4 arrivals per second]

Continuous R.V.s: Continuous Uniform random variable. A Continuous Uniform random variable is one whose density function is constant over some interval (a, b): f(x) = 1/(b - a) for a <= x <= b, and 0 otherwise.

Exponential random variable. A (Negative) Exponential random variable with parameter λ represents the inter-arrival time between arrivals to a Poisson system: f(x) = λ · e^(-λx) for x >= 0, with F(x) = 1 - e^(-λx).

Exponential random variable. Mean (expected value) and coefficient of variation for the Exponential random variable: E[R] = 1/λ and CV(R) = 1.
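Not part of the original slides: sampling exponential inter-arrival times at the deck's rate λ = 4 and checking E[R] = 1/λ = 0.25 and CV ≈ 1 empirically (seed and sample size are arbitrary).

```python
# Sketch: exponential samples at rate lam = 4; check E[R] = 1/lam and CV = 1.
import random
from math import sqrt

random.seed(3)
lam = 4
samples = [random.expovariate(lam) for _ in range(200_000)]

mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
cv = sqrt(var) / mean

print(abs(mean - 0.25) < 0.01)  # True: E[R] = 1/lam = 0.25
print(abs(cv - 1) < 0.05)       # True: the exponential CV is always 1
```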

Exponential Delay [figure: Pr(R <= x) for a Poisson system with 4 arrivals/unit, so E[R] = 0.25]