1 Physics 114: Lecture 7 Probability Density Functions
John F. Federici NJIT Physics Department

2 New Jersey TRIVIA QUESTION!
Which actor or actress for the following characters from GAME OF THRONES is from New Jersey? (a) Tyrion Lannister (b) Cersei Lannister (c) Arya Stark (d) Jon Snow (e) Samwell Tarly

3 New Jersey TRIVIA QUESTION!
Which actor or actress for the following characters from GAME OF THRONES is from New Jersey? (a) Tyrion Lannister – Peter Dinklage was born in Morristown, NJ

4 Probability Distribution Functions
Most important – Binomial, Poisson, Gaussian.
Binomial Distribution – Applied in experiments where the small number of FINAL states is important, rather than the details of HOW each state is created. Final-state answers are usually 'yes' or 'no'.
EXAMPLE: If one flips 10 coins, how often do 6 heads and 4 tails occur? Note that the state of an INDIVIDUAL coin is not important.
EXAMPLE: Rolling of dice (6-sided or N-sided). In the game of craps, how often is a "7" rolled with two six-sided dice?
The Poisson and Gaussian distributions are limiting cases of the binomial.
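The craps example above can be checked by brute force. The sketch below is in Python rather than the MatLAB used later in the lecture; it simply enumerates all 36 equally likely outcomes of two six-sided dice:

```python
from itertools import product

# Enumerate all 36 equally likely outcomes of two six-sided dice
outcomes = list(product(range(1, 7), repeat=2))
sevens = sum(1 for a, b in outcomes if a + b == 7)

print(sevens, len(outcomes))   # 6 36
print(sevens / len(outcomes))  # probability 1/6
```

Six of the 36 outcomes sum to 7, so a "7" comes up with probability 6/36 = 1/6, more often than any other total.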

5 Probability Distribution Functions
Most important – Binomial, Poisson, Gaussian.
Poisson Distribution – The average number of 'successes' (e.g. counting something) is much smaller than the possible number of events.
EXAMPLE: Counting the number of alpha particles emitted during radioactive decay.
EXAMPLE: Counting the number of photons (particles of light) emitted by a light source.

6 Probability Distribution Functions
Most important – Binomial, Poisson, Gaussian.
Gaussian Distribution – The number of different observations is large AND the probability of a 'success' (e.g. counting something) is large as well. Used to describe the distribution of random observations for many types of experiments.

7 Binomial Distribution
If you raise the sum of two variables to a power, you get:
(a + b)^0 = 1
(a + b)^1 = a + b
(a + b)^2 = a^2 + 2ab + b^2
(a + b)^3 = a^3 + 3a^2 b + 3ab^2 + b^3
Writing only the coefficients, you begin to see a pattern (Pascal's triangle):
1
1 1
1 2 1
1 3 3 1
February 12, 2010

8 Binomial Distribution
Remarkably, this pattern is also the one that governs the possibilities of tossing n coins. With 3 coins, there are 8 ways for them to land: HHH; HHT, HTH, THH; HTT, THT, TTH; TTT. In general, there are 2^n possible ways for n coins to land. How many permutations are there for a given row, e.g. how many permutations for getting 1 head and 2 tails? Obviously, 3. How many permutations for x heads and n − x tails, for general n and x? The number of combinations in each row is
C(n, x) = n! / [x! (n − x)!]
(n is the TOTAL number of coins, x is the number of tails).

9 Probability
With fair coins, tossing a coin will result in an equal chance of 50%, or ½, of its ending up heads. Let us call this probability p. Obviously, the probability of tossing tails, q, is q = (1 − p). With 3 coins, the probability of getting any single one of the combinations is 1/2^3 = 1/8 (since there are 8 combinations, and each is equally probable). This comes from (½)(½)(½), the product of each probability p = ½ to get a heads. If we want to know the probability of getting, say, 1 heads and 2 tails, we just need to multiply the probability of any one combination (1/8) by the number of ways of getting 1 heads and 2 tails, i.e. 3, for a total probability of 3/8. To be really general, say the coins were not fair, so p ≠ q. Then the probability to get heads, tails, tails would be (p)(q)(q) = p^1 q^2. Finally, the probability P(x; n, p) of getting x heads given n coins, each of which has probability p of landing heads, is
P(x; n, p) = [n! / (x! (n − x)!)] p^x q^(n−x), with q = 1 − p.
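The slide's formula is easy to verify numerically. This is a Python sketch (the lecture itself uses MatLAB); `binom_prob` is a name chosen here for illustration:

```python
from math import comb

def binom_prob(x, n, p):
    """Probability of exactly x successes in n trials, each with success probability p."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# 1 head and 2 tails from 3 fair coins: 3 combinations, each with probability 1/8
print(binom_prob(1, 3, 0.5))  # 0.375 = 3/8
```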

10 Binomial Distribution
This is the binomial distribution, which we write PB:
PB(x; n, p) = [n! / (x! (n − x)!)] p^x (1 − p)^(n−x)
Let's see if it works. For 1 head with a toss of 3 fair coins, x = 1, n = 3, p = ½, we get PB = 3 × (½)(½)^2 = 3/8. For no heads, and all tails, we get PB = (½)^3 = 1/8. Say the coins are not fair, but p = ¼. Then the probability of 2 heads and 1 tails is PB(2; 3, ¼) = 3 × (¼)^2 (¾) = 9/64. You'll show for homework that the sum of all probabilities for this (and any) case is 1, i.e. the probabilities are normalized. Note: 0! ≡ 1.

11 Binomial Distribution
To see the connection of this to the sum of two variables raised to a power, replace a and b with p and q: (p + q)^n. Since p + q = 1, each of these powers equals one on the left side, while the right side expresses how the probabilities are split among the different combinations. When p = q = ½, for example, the binomial triangle becomes
1
1/2 1/2
1/4 2/4 1/4
1/8 3/8 3/8 1/8
1/16 4/16 6/16 4/16 1/16
In MatLAB, use binopdf(x,n,p) to calculate one row of this triangle, e.g. binopdf(0:3,3,0.5) prints 0.125, 0.375, 0.375, 0.125.
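For readers without MatLAB, a small Python stand-in for `binopdf` reproduces a row of the triangle (the function name mirrors the MATLAB call for clarity; it is not a library function):

```python
from math import comb

def binopdf(xs, n, p):
    # Python stand-in for MATLAB's binopdf(x, n, p)
    return [comb(n, x) * p**x * (1 - p)**(n - x) for x in xs]

print(binopdf(range(4), 3, 0.5))  # [0.125, 0.375, 0.375, 0.125]
```

This matches the 1/8, 3/8, 3/8, 1/8 row of the probability triangle above.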

12 Binomial Distribution
Let’s say we toss 10 coins, and ask how many heads we will see. The 10th row of the triangle would be plotted as at right. The binomial distribution applies to yes/no cases, i.e. cases where you want to know the probability of something happening vs. it not happening. Say we want to know the probability of getting a 1 when rolling five 6-sided dice. Then p = 1/6 (the probability of rolling a 1 on one die), and q = 1 − p = 5/6 (the probability of NOT rolling a 1). The binomial distribution applies to this case, with PB(x; 5, 1/6). The plot is shown at right.
>> binopdf(0:5,5,1/6)
ans = 0.4019 0.4019 0.1608 0.0322 0.0032 0.0001

13 Binomial Distribution
>> binopdf(0:5,5,1/6)
ans = 0.4019 0.4019 0.1608 0.0322 0.0032 0.0001
Y = binopdf(X,N,P) computes the binomial pdf:
X: number of '1's in the final state
N: number of dice which are rolled
P: probability of success. For a 6-sided die, there is only one '1', so the probability is 1/6.
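The same numbers can be reproduced in Python (a sketch equivalent to the MATLAB call, not part of the original slides):

```python
from math import comb

# Probability of x ones when rolling five 6-sided dice (p = 1/6 per die),
# mirroring the MATLAB call binopdf(0:5, 5, 1/6)
p = 1 / 6
pmf = [comb(5, x) * p**x * (1 - p)**(5 - x) for x in range(6)]
print([round(v, 4) for v in pmf])
# [0.4019, 0.4019, 0.1608, 0.0322, 0.0032, 0.0001]
```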

14 Binomial Distribution Mean
Let’s say we toss 10 coins N = 100 times. Then we would multiply the PDF by N to find out how many times we would have x heads. The mean of the distribution is, as before:
μ = Σ x PB(x; n, p) = np
For 10 coins, with p = ½, we get μ = np = 5. For 5 dice, with p = 1/6, we get μ = np = 5/6.

15 Binomial Standard Deviation
The standard deviation of the distribution is the “second moment,” given by the variance:
σ² = Σ (x − μ)² PB(x; n, p) = np(1 − p), so σ = √(np(1 − p))
For 10 coins, with p = ½, we get σ = √(10 × ½ × ½) = √2.5 ≈ 1.58. For 5 dice, with p = 1/6, we get σ = √(5 × 1/6 × 5/6) = 5/6 ≈ 0.83.
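The mean and standard deviation formulas can be sanity-checked against a simulation. This Python sketch (not from the slides) tosses 10 fair coins many times and compares the sample statistics to np and √(np(1 − p)):

```python
import random
from math import sqrt

n, p = 10, 0.5
mu = n * p                     # mean of the binomial distribution: 5.0
sigma = sqrt(n * p * (1 - p))  # standard deviation: about 1.581

random.seed(1)  # reproducible simulation
trials = [sum(random.random() < p for _ in range(n)) for _ in range(50_000)]
mean = sum(trials) / len(trials)
var = sum((t - mean) ** 2 for t in trials) / (len(trials) - 1)

print(mu, round(sigma, 3))                   # 5.0 1.581
print(round(mean, 2), round(sqrt(var), 2))   # close to 5 and 1.58
```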

16 Summary of Binomial Distribution
x : number of times the 'die number' (e.g. a "1") comes up on the n dice
n : number of dice
p : probability that the 'die number' comes up on a single die (e.g. 1/M, where M is the number of faces on the die)
The binomial distribution is PB:
PB(x; n, p) = [n! / (x! (n − x)!)] p^x (1 − p)^(n−x)
The mean is μ = np.
The standard deviation is σ = √(np(1 − p)).

17 Example from Exam
Problem 3. Assume you have 3 marked, 6-sided dice (faces are numbered 1 through 6) and roll the collection simultaneously.
a) Assuming fair dice, what are the arguments (x, n and p) of PB(x; n, p) in the case of rolling all 3 dice such that only ONE die shows a “4”?
b) How many different ways are there to roll the dice with only ONE die showing a “4”?
c) On any given roll of all 3 dice, what would be the probability of rolling two “4”s?
d) Given your answer above, if you were to roll the collection of dice 1000 times, how many times would you expect to roll two “4”s?
Let’s go through this example……

18 Example from Exam
Problem 3. Assume you have 3 marked, 6-sided dice (faces are numbered 1 through 6) and roll the collection simultaneously.
a) Assuming fair dice, what are the arguments (x, n and p) of PB(x; n, p) in the case of rolling all 3 dice such that only ONE die shows a “4”?
x = 1 : number of times the ‘correct’ die number (e.g. a “4”) comes up on the n dice.
n = 3 : number of dice thrown SIMULTANEOUSLY (NOT the total number of rolls).
p = 1/6 : for a ‘fair’ 6-sided die.
b) How many different ways are there to roll the dice with only ONE die showing a “4”?

19 Example from Exam
Problem 3. Assume you have 3 marked, 6-sided dice (faces are numbered 1 through 6) and roll the collection simultaneously.
b) How many different ways are there to roll the dice with only ONE die showing a “4”?
Total # of possible combinations: 6^3 = 216.
Number of ways with exactly ONE “4”: 3 choices for which die shows the “4”, times 5 × 5 for the other two dice = 75.
Probability of the “correct” combination: 75/216 ≈ 0.35.

20 Example from Exam
Problem 3. Assume you have 3 marked, 6-sided dice (faces are numbered 1 through 6) and roll the collection simultaneously.
c) On any given roll of all 3 dice, what would be the probability of rolling two “4”s?
For x = 1, n = 3, p = 1/6: PB(1; 3, 1/6) = 75/216 ≈ 0.35.
For x = 2, n = 3, p = 1/6: PB(2; 3, 1/6) = 3 × (1/6)^2 × (5/6) = 15/216 ≈ 0.069.

21 Example from Exam
Problem 3. Assume you have 3 marked, 6-sided dice (faces are numbered 1 through 6) and roll the collection simultaneously.
d) Given your answer above, if you were to roll the collection of dice 1000 times, how many times would you expect to roll two “4”s?
Using PB(2; 3, 1/6) = 15/216 from part c), the expected number is 1000 × 15/216 ≈ 69 times.
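The arithmetic in parts b) through d) can be sketched in a few lines of Python (the slides use MatLAB; this is an equivalent check):

```python
from math import comb

n, p = 3, 1 / 6  # three dice, P(rolling a "4") = 1/6 on each

# (b) ways to get exactly one "4": choose the die (3 ways), times 5*5 for the others
ways_one_four = comb(3, 1) * 5 * 5
print(ways_one_four, 6**3)     # 75 216

# (c) probability of exactly two "4"s on one roll
p_two = comb(n, 2) * p**2 * (1 - p)
print(round(p_two, 4))         # 0.0694  (= 15/216)

# (d) expected number of double-"4" rolls in 1000 tries
print(round(1000 * p_two, 1))  # 69.4
```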

22 Poisson Distribution
An approximation to the binomial distribution is very useful for the case where n is very large (i.e. rolls of a die with an infinite number of sides?) and p is very small; this is called the Poisson distribution. Equivalently, there is a LARGE number of possible outcomes, but the probability of any individual outcome is small. This is the case for counting experiments, such as the decay of radioactive material, or measuring photons at low light levels. To derive it, start with the binomial distribution with n large and p << 1, but with a well-defined mean μ = np. Then
PB(x; n, p) = [n! / (x! (n − x)!)] p^x (1 − p)^(n−x).
The term n!/(n − x)! ≈ n^x because x is small compared to n, so most of the factors cancel, leaving a total of x factors each approximately equal to n (this step can also be justified with Stirling's approximation). This gives
PB ≈ (n^x / x!) p^x (1 − p)^(n−x) = (μ^x / x!) (1 − p)^(n−x).

23 Poisson Distribution
Now, the term (1 − p)^(−x) ≈ 1 for small p, and with some algebra we can show that the term (1 − p)^n ≈ e^(−μ). Thus, the final Poisson distribution depends only on x and μ, and is defined as
PP(x; μ) = (μ^x / x!) e^(−μ)
The text shows that the expectation value of x (i.e. the mean) is <x> = μ. Remarkably, the standard deviation is given by the second moment as σ = √μ. These are a little tedious to prove, but all we need for now is to know that the standard deviation is the square root of the mean.
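The limiting behavior is easy to demonstrate numerically: with n large and p small at fixed μ = np, the binomial pmf is nearly indistinguishable from the Poisson pmf. A Python sketch (not from the slides):

```python
from math import comb, exp, factorial

# Binomial with large n and small p approaches Poisson with mu = n*p
mu, n = 2.0, 10_000
p = mu / n

binom = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(10)]
poisson = [mu**x * exp(-mu) / factorial(x) for x in range(10)]

# Largest pointwise difference between the two distributions
print(max(abs(b - q) for b, q in zip(binom, poisson)))  # very small
```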

24 Example 2.4
Note: SMALL number of counts, so the Poisson distribution is appropriate. Some students measure background counts of cosmic rays. They record the number of counts in their detector for a series of 2-s intervals, and find a mean of 1.69 counts/interval. They can use the standard deviation formula from Chapter 1,
s² = [1/(N − 1)] Σ (xᵢ − x̄)²,
to get a standard deviation directly from the data. They can also estimate the standard deviation by σ ≈ √μ = √1.69 = 1.30. Now they change the length of time they count from 2-s intervals to 15-s intervals, so the mean number of counts in each interval increases. Now they measure a mean of 11.48, which implies σ ≈ √11.48 = 3.39, while they again calculate s directly from their measurements to find s = 3.39. We can plot the theoretical distributions using MatLAB's poisspdf(x,mu), e.g. poisspdf(0:8,1.69) gives
ans = 0.1845 0.3118 0.2635 0.1484 0.0627 0.0212 0.0060 0.0014 0.0003
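For readers without MatLAB, a Python stand-in for `poisspdf` reproduces these values (the function name mirrors the MATLAB call; it is not a library function):

```python
from math import exp, factorial

def poisspdf(xs, mu):
    # Python stand-in for MATLAB's poisspdf(x, mu)
    return [mu**x * exp(-mu) / factorial(x) for x in xs]

pmf = poisspdf(range(9), 1.69)
print([round(v, 4) for v in pmf])
# [0.1845, 0.3118, 0.2635, 0.1484, 0.0627, 0.0212, 0.006, 0.0014, 0.0003]

# The Poisson standard-deviation estimate is just the square root of the mean
print(round(1.69 ** 0.5, 2))  # 1.3
```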

25 Example 2.4, cont’d
The plots of the distributions are shown for these two cases at right. You can see that for a small mean, the distribution is quite asymmetrical. As the mean increases, the distribution becomes somewhat more symmetrical (but is still not symmetrical at 11.48 counts/interval). I have overplotted the mean and standard deviation. You can see that the mean does not coincide with the peak (the most probable value).

26 Example 2.4, cont’d Here is the higher-mean plot with the equivalent Gaussian (normal distribution) overlaid. For large means (high counts), the Poisson distribution approaches the Gaussian distribution, which we will describe further next time.
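The overlay can be reproduced numerically: for a mean this large, a Poisson distribution with mean μ is close to a Gaussian with mean μ and variance μ. A Python sketch (not from the slides):

```python
from math import exp, factorial, pi, sqrt

mu = 11.48  # the higher-mean case from Example 2.4
xs = range(31)

poisson = [mu**x * exp(-mu) / factorial(x) for x in xs]
# Gaussian with the same mean and with variance equal to the mean
gauss = [exp(-(x - mu) ** 2 / (2 * mu)) / sqrt(2 * pi * mu) for x in xs]

# The two curves nearly coincide at every integer point
print(max(abs(p - g) for p, g in zip(poisson, gauss)))
```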

27 Rolling Dice - Electronically
For the homework assignment for this week, you will be ‘rolling dice’ a large number of times to generate some data for analysis. Rather than rolling physical dice, we are going to ‘roll’ the dice in software. Write a MatLAB function that mimics the rolling of a 6-sided die. Use the ‘randi’ function (NOT randn) to generate a vector of 1000 random integers between 1 and 6. The syntax is DieNum=randi(6,[1000,1]); Mimic the rolling of 2 dice: add the results of both dice together. ‘Roll’ the two dice together a total of 100 times. How often does a ‘7’ come up? How does this compare to the probability of rolling a 7? ‘Roll’ the two dice together a total of 1000 times.
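The homework itself asks for MatLAB, but the same workflow can be sketched in Python for comparison (this is an illustration of the randi-style approach, not the assignment solution):

```python
import random

random.seed(42)  # reproducible "rolls"

# Python analogue of MATLAB's randi(6, [1000, 1]) for each of two dice
die1 = [random.randint(1, 6) for _ in range(1000)]
die2 = [random.randint(1, 6) for _ in range(1000)]
sums = [a + b for a, b in zip(die1, die2)]

sevens = sums.count(7)
print(sevens)  # should be near 1000 * 6/36, i.e. about 167
```

With 1000 rolls, the observed count of 7s fluctuates around the expected 167 by roughly ±√(np(1 − p)) ≈ ±12, which is the kind of comparison the homework asks you to make.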

