Occupancy Problems

m balls are randomly assigned to one of n bins, independently and uniformly. The questions:
- What is the maximum number of balls in any bin?
- What is the expected number of bins with k balls?
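The experiment is easy to simulate as a sanity check (a sketch in Python; the function name `throw_balls` is mine, not from the slides):

```python
import random
from collections import Counter

def throw_balls(m, n, seed=0):
    """Throw m balls independently and uniformly at random into n bins;
    return the list of bin loads."""
    rng = random.Random(seed)
    loads = [0] * n
    for _ in range(m):
        loads[rng.randrange(n)] += 1
    return loads

loads = throw_balls(m=1000, n=1000)
hist = Counter(loads)
print("max load:", max(loads))
print("empty bins:", hist[0])   # expectation is n(1 - 1/n)^m, about n/e here
```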

For arbitrary events E_1, …, E_n, not necessarily independent, the probability of the union of events is no more than the sum of their probabilities (the union bound):

Pr[E_1 ∪ … ∪ E_n] ≤ Pr[E_1] + … + Pr[E_n].

Let m = n. For 1 ≤ i ≤ n, let X_i denote the number of balls in the i-th bin. Then we get E[X_i] = m/n = 1 for all i.
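The union bound is easy to verify empirically; a small sketch with two overlapping events on a single die roll (the events A and B are chosen only for illustration):

```python
import random

rng = random.Random(1)
trials = 100_000
hits_A = hits_B = hits_union = 0
for _ in range(trials):
    x = rng.randint(1, 6)   # one fair die roll
    a = x >= 5              # event A = {5, 6},    Pr[A] = 2/6
    b = x % 2 == 0          # event B = {2, 4, 6}, Pr[B] = 3/6
    hits_A += a
    hits_B += b
    hits_union += a or b
p_A, p_B, p_union = hits_A / trials, hits_B / trials, hits_union / trials
# Union bound: Pr[A ∪ B] <= Pr[A] + Pr[B], with slack because A and B overlap.
print(p_union, "<=", p_A + p_B)
```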

Now we concentrate on analyzing the 1st bin (by symmetry all bins behave identically). Let E_1(k) denote the event that bin 1 has k or more balls in it. Choosing any k of the n balls, the probability that they all land in bin 1 is (1/n)^k, so by the union bound over the choices:

Pr[E_1(k)] ≤ C(n, k) · (1/n)^k ≤ (ne/k)^k · (1/n)^k = (e/k)^k,

from the upper bound for binomial coefficients, C(n, k) ≤ (ne/k)^k.
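A numeric check of the chain of bounds (the exact binomial term C(n, k)(1/n)^k against the (e/k)^k bound):

```python
from math import comb, e

n = 1000
for k in (3, 6, 12):
    term = comb(n, k) * (1 / n) ** k   # exact: C(n,k) (1/n)^k
    bound = (e / k) ** k               # the simplified (e/k)^k bound
    print(f"k={k:2d}: C(n,k)(1/n)^k = {term:.3e} <= (e/k)^k = {bound:.3e}")
```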

Now, let k* = ⌈3 ln n / ln ln n⌉. Applying the union bound over all n bins:

Pr[some bin has k* or more balls] ≤ n · (e/k*)^{k*} ≤ 1/n for sufficiently large n.

Then: with probability at least 1 − 1/n, no bin has more than 3 ln n / ln ln n balls in it!
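Plugging in concrete numbers for k* = 3 ln n / ln ln n, and comparing against one simulated trial (a sketch):

```python
import math
import random

n = 10_000
k_star = 3 * math.log(n) / math.log(math.log(n))   # about 12.4 for n = 10^4

# Union bound over the n bins on the failure probability:
fail = n * (math.e / k_star) ** k_star
print(f"k* = {k_star:.2f}, bound on Pr[some bin has >= k* balls] = {fail:.2e}")

# One simulated trial of n balls into n bins for comparison:
rng = random.Random(0)
loads = [0] * n
for _ in range(n):
    loads[rng.randrange(n)] += 1
print("observed max load:", max(loads))   # typically far below k*
```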

The Birthday Problem

Now n = 365. How large must m be before two people in the group are likely to share a birthday? For 2 ≤ i ≤ m, let E_i denote the event that the i-th ball lands in a bin not containing any of the first i − 1 balls. Then

Pr[E_i | E_2 ∩ … ∩ E_{i−1}] = 1 − (i − 1)/n.

But 1 − x ≤ e^{−x} for all x, so each factor is at most e^{−(i−1)/n}.

So:

Pr[all m balls land in distinct bins] = ∏_{i=2}^{m} (1 − (i − 1)/n) ≤ ∏_{i=2}^{m} e^{−(i−1)/n} = e^{−m(m−1)/(2n)}.

Now we can see that for m(m − 1) ≥ 2n ln 2 (m = 23 when n = 365), the probability that all m balls land in distinct bins is at most 1/2.
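The exact product ∏ (1 − (i−1)/n) and the exponential upper bound e^{−m(m−1)/(2n)} can be compared directly (a sketch; the name `p_all_distinct` is mine):

```python
from math import exp, prod

def p_all_distinct(m, n=365):
    """Exact probability that m balls all land in distinct bins."""
    return prod(1 - i / n for i in range(m))

n, m = 365, 23
exact = p_all_distinct(m, n)
bound = exp(-m * (m - 1) / (2 * n))
print(f"exact = {exact:.4f} <= bound = {bound:.4f}")   # exact is about 0.4927
```

For n = 365 the threshold is sharp: 22 people are not enough, 23 are.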

Markov Inequality

Let Y be a random variable assuming only non-negative values. Then for all t > 0:

Pr[Y ≥ t] ≤ E[Y] / t.

Or, equivalently: Pr[Y ≥ k · E[Y]] ≤ 1/k.

Proof: define the indicator function

f(y) = 1 if y ≥ t, and f(y) = 0 otherwise.

Then f(y) ≤ y/t for all y ≥ 0: if y ≥ t, then f(y) = 1 ≤ y/t; else f(y) = 0 ≤ y/t. Now:

Pr[Y ≥ t] = E[f(Y)] ≤ E[Y/t] = E[Y]/t.
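A quick empirical check of Markov's inequality (a sketch): Y is the sum of two fair dice, so E[Y] = 7, and for t = 10 the bound gives 0.7 while the true tail probability is 6/36 ≈ 0.167 — the bound holds, but loosely.

```python
import random

rng = random.Random(0)
N = 100_000
# Y = sum of two fair dice, a non-negative random variable with E[Y] = 7
samples = [rng.randint(1, 6) + rng.randint(1, 6) for _ in range(N)]
mean = sum(samples) / N                     # close to 7
t = 10
p_tail = sum(s >= t for s in samples) / N   # close to 6/36
print(f"Pr[Y >= {t}] = {p_tail:.3f} <= E[Y]/t = {mean / t:.3f}")
```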

If X is a random variable with expectation μ_X, the variance is defined as

σ_X² = Var(X) = E[(X − μ_X)²].

The standard deviation of X is σ_X = √Var(X).

Chebyshev's Inequality

Let X be a random variable with expectation μ_X and standard deviation σ_X. Then for any t > 0:

Pr[|X − μ_X| ≥ t · σ_X] ≤ 1/t².

Proof: First note that for Y = (X − μ_X)² we get

Pr[|X − μ_X| ≥ t · σ_X] = Pr[Y ≥ t² σ_X²].

Applying the Markov inequality to Y, whose expectation is E[Y] = σ_X², bounds this probability by σ_X² / (t² σ_X²) = 1/t².
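And an empirical check of Chebyshev's inequality (a sketch): samples from a sum of 12 independent uniforms on [0, 1], which has mean 6 and variance 12 · (1/12) = 1, with tail masses compared against 1/t².

```python
import random
import statistics

rng = random.Random(0)
# X = sum of 12 independent Uniform(0,1): mean 6, standard deviation 1
samples = [sum(rng.random() for _ in range(12)) for _ in range(50_000)]
mu = statistics.mean(samples)       # close to 6
sigma = statistics.pstdev(samples)  # close to 1
for t in (1.5, 2.0, 3.0):
    tail = sum(abs(s - mu) >= t * sigma for s in samples) / len(samples)
    print(f"t={t}: Pr[|X - mu| >= t*sigma] = {tail:.4f} <= 1/t^2 = {1/t**2:.4f}")
```

Because this X is close to normal, the observed tails are far smaller than 1/t² — Chebyshev is a worst-case bound over all distributions with the given mean and variance.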