Chapter 1: Introduction to Statistical Methods


"The true logic of this world is in the calculus of probabilities." (James Clerk Maxwell)

Introductory Remarks This course: the Physics of systems containing HUGE numbers (~$10^{23}$) of particles: solids, liquids, gases, E&M radiation (photons), .... The challenge: describe a system's Macroscopic characteristics from a Microscopic theory. Classical: Newton's Laws ⇒ we would need to solve ~$10^{23}$ coupled differential equations of motion (ABSURD!!). Quantum: Schrödinger's Equation ⇒ we would need to solve it for ~$10^{23}$ particles (ABSURD!!).

Definitions:  Use a Statistical description of such a system.  Talk about Probabilities & Average Properties We are NOT concerned with the detailed behavior of individual particles. Definitions: Microscopic: ~ Atomic dimensions ~ ≤ a few Å Macroscopic: Large enough to be “visible” in the “ordinary” sense

Definitions An Isolated System is in Equilibrium when its Macroscopic parameters are time-independent. This is the usual case in this course! But note: even if its Macroscopic parameters are time-independent, a system's Microscopic parameters can still vary with time!

Section 1.1: Elementary Statistical Concepts; Examples The Random Walk & The Binomial Distribution. Some math preliminaries (& methods) for the next few lectures. To treat statistical physics problems, we must first know something about the mathematics of Probability & Statistics. The following should be a review (?). Keep in mind: whenever we want to describe a situation using probability & statistics, we must consider an assembly of a large number N (in principle, $N \to \infty$) of "similarly prepared systems".

ENSEMBLE This assembly is called an Ensemble ("ensemble" is the French word for assembly). The Probability of occurrence of a particular event is DEFINED with respect to this particular ensemble & is given by the fraction of systems in the ensemble characterized by the occurrence of this event. Example: in throwing a pair of dice, we can give a statistical description by considering that a very large number N of similar pairs of dice are thrown under similar circumstances. Alternatively, we could imagine the same pair of dice thrown N times under similar circumstances. The probability of obtaining two 1's is then given by the fraction of these experiments in which two 1's is the outcome.
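This "fraction of an ensemble" definition is easy to make concrete numerically. Below is a minimal Monte Carlo sketch (variable names are illustrative) that estimates the probability of throwing two 1's as the fraction of many similarly prepared experiments giving that outcome; the exact value is $1/36 \approx 0.0278$.

```python
import random

# Ensemble estimate of P(two 1's): repeat the experiment many times
# and take the fraction of experiments in which both dice show 1.
N_ENSEMBLE = 100_000  # number of "similarly prepared" experiments

hits = 0
for _ in range(N_ENSEMBLE):
    die1, die2 = random.randint(1, 6), random.randint(1, 6)
    if die1 == 1 and die2 == 1:
        hits += 1

print(f"Estimated P(two 1's) = {hits / N_ENSEMBLE:.4f}  (exact: {1/36:.4f})")
```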

The Random Walk Problem Note that this probability depends strongly on the nature of the ensemble to which we are referring (see Reif's flower seed example, p. 5). To introduce probability concepts quantitatively, we use a specific, simple example which is actually much more general than you might first think: The Random Walk Problem.

One-Dimensional Random Walk In its simplest, idealized form, the random walk problem can be stated as follows. A drunk starts out from a lamp post on a street. Each step he takes is of equal length ℓ. The man is SO DRUNK that the direction of each step (right or left) is completely independent of the preceding step. The probability of stepping to the right is p & of stepping to the left is $q = 1 - p$; in general, $q \ne p$. The x axis is along the sidewalk, with the lamp post at $x = 0$. Each step is of length ℓ, so his location on the x axis must be $x = m\ell$, where m is a positive or negative integer.
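The walk is simple to simulate. A short sketch (names illustrative; step length ℓ set to 1): each of N steps is +1 with probability p and −1 with probability $q = 1 - p$, and the final position is $x = m\ell$.

```python
import random

def random_walk_1d(n_steps: int, p: float = 0.5) -> int:
    """Return the final integer position m (x = m*l, with l = 1)
    after n_steps, stepping right with probability p."""
    m = 0
    for _ in range(n_steps):
        m += 1 if random.random() < p else -1
    return m

# One drunk walker taking N = 20 steps with p = q = 1/2:
print(random_walk_1d(20))
```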

Question: After N steps, what is the probability that the man is at a specific location $x = m\ell$ (m specified)? To answer this, we first consider an ensemble of a large number of drunk men starting from similar lamp posts; equivalently, we can repeat the walk with the same drunk man on the same sidewalk many times. This can easily be generalized to 2 dimensions.

The 2-dimensional random walk corresponds to the PHYSICS problem of adding N 2-dimensional vectors of equal length & random directions & asking: What is the probability that the resultant has a certain magnitude & direction?

Physical Examples to which the Random Walk Problem applies 1. Magnetism: N atoms, each with magnetic moment μ & spin ½. By Quantum Mechanics, each magnetic moment can point either "up" or "down". If these are equally likely, what is the net magnetic moment of the N atoms? 2. Diffusion of a Molecule of Gas: a molecule travels in 3 dimensions with a mean distance ℓ between collisions. How far is it likely to have traveled after N collisions? (Answered using Classical Mechanics.)

The Random Walk Problem The Random Walk Problem illustrates some fundamental results of Probability Theory. The techniques used are Powerful & General. They are used repeatedly throughout Statistical Mechanics. So, it’s very important to spend some time on this problem & understand it!

Section 1.2: 1-Dimensional Random Walk Forget the drunk; let's get back to Physics! Think of a particle moving in 1 dimension in steps of length ℓ, with probability p of stepping to the right & $q = 1 - p$ of stepping to the left. After N steps, the particle is at position $x = m\ell$ ($-N \le m \le N$). Let $n_1$ ≡ # of steps to the right (out of N) and $n_2$ ≡ # of steps to the left. Clearly,
$N = n_1 + n_2$   (1)
Clearly also, $x \equiv m\ell = (n_1 - n_2)\ell$, or
$m = n_1 - n_2$   (2)
Combining (1) & (2) gives
$m = 2n_1 - N$   (3)
Thus, if N is odd, so is m, and if N is even, so is m.

A Fundamental Assumption is that successive steps are statistically independent. Let p ≡ the probability of stepping to the right and $q = 1 - p$ ≡ the probability of stepping to the left. Since each step is statistically independent, the probability of a given sequence of $n_1$ steps to the right followed by $n_2$ steps to the left is obtained by multiplying the respective probabilities for each step:
$\underbrace{p \cdot p \cdots p}_{n_1\ \text{factors}} \cdot \underbrace{q \cdot q \cdots q}_{n_2\ \text{factors}} \equiv p^{n_1} q^{n_2}$
But clearly there are also MANY different possible ways of taking N steps such that $n_1$ are to the right & $n_2$ are to the left!

The # of distinct possibilities is the SAME as the # of distinct ways we can place N objects, $n_1$ of one type & $n_2$ of another, in $N = n_1 + n_2$ places:
1st place: can be occupied in any one of N ways
2nd place: can be occupied in any one of N − 1 ways
3rd place: can be occupied in any one of N − 2 ways
...
(N − 1)th place: can be occupied in only 2 ways
Nth place: can be occupied in only 1 way
⇒ All available places can be occupied in $N(N-1)(N-2)(N-3)\cdots(3)(2)(1) \equiv N!$ ways ($N!$ ≡ "N factorial").

Note, however! This analysis doesn't take into account the fact that there are only 2 distinguishable kinds of objects: $n_1$ of the 1st type & $n_2$ of the 2nd type. All $n_1!$ possible permutations of the objects of the 1st type among themselves lead to exactly the same arrangement, and similarly all $n_2!$ permutations of the 2nd type. ⇒ So, we need to divide $N!$ by $n_1! n_2!$ ⇒ The # of distinct ways in which N objects can be arranged with $n_1$ of the 1st type & $n_2$ of the 2nd type is $N!/(n_1! n_2!)$. This is the same as the # of distinct ways of taking N steps with $n_1$ to the right & $n_2$ to the left (a quick numerical check is sketched below).
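The counting result $N!/(n_1! n_2!)$ is just the binomial coefficient, which Python's standard library computes directly; a quick check with illustrative values:

```python
from math import comb, factorial

N, n1 = 5, 2           # illustrative: 5 steps, 2 of them to the right
n2 = N - n1

# N!/(n1! n2!) counts the distinct orderings of n1 right-steps and
# n2 left-steps; it equals the binomial coefficient C(N, n1).
n_ways = factorial(N) // (factorial(n1) * factorial(n2))
assert n_ways == comb(N, n1) == 10
print(n_ways)          # -> 10
```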

(p + q)N = ∑(n1 = 0N)[N!/[n!(N–n1)!]pn1qn2 In Summary: The probability WN(n1) of taking N steps, with n1 to the right & n2 = N - n1 to the left is WN(n1) = [N!/(n1!n2!)]pn1qn2 or WN(n1) = [N!/{n1!(N – n1)!]}pn1(1-p)n2 Often, this is written as WN(n1) = N pn1qn2 n1 This probability distribution is called the Binomial Distribution. This is because the Binomial Expansion has the form (p + q)N = ∑(n1 = 0N)[N!/[n!(N–n1)!]pn1qn2 Remember that q = 1- p

We really want the probability $P_N(m)$ that $x = m\ell$ after N steps. This is really the same as $W_N(n_1)$ with a change of notation: $P_N(m) = W_N(n_1)$. But $m = 2n_1 - N$, so $n_1 = \frac{1}{2}(N + m)$ & $n_2 = N - n_1 = \frac{1}{2}(N - m)$. So the probability $P_N(m)$ that $x = m\ell$ after N steps is:
$P_N(m) = \frac{N!}{[\frac{1}{2}(N+m)]!\,[\frac{1}{2}(N-m)]!}\, p^{(N+m)/2} (1-p)^{(N-m)/2}$
For the common case $p = q = \frac{1}{2}$, this is:
$P_N(m) = \frac{N!}{[\frac{1}{2}(N+m)]!\,[\frac{1}{2}(N-m)]!} \left(\frac{1}{2}\right)^N$
This is the usual form of the Binomial Distribution, probably the most elementary (discrete) probability distribution.
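These two formulas translate directly into code. A minimal sketch (function names are illustrative); note that $P_N(m)$ vanishes unless m has the same parity as N and $|m| \le N$:

```python
from math import comb

def W(N: int, n1: int, p: float) -> float:
    """Binomial probability of n1 right-steps out of N."""
    return comb(N, n1) * p**n1 * (1 - p)**(N - n1)

def P(N: int, m: int, p: float = 0.5) -> float:
    """Probability of net displacement m (x = m*l) after N steps."""
    if abs(m) > N or (N + m) % 2 != 0:
        return 0.0                  # m must have the same parity as N
    return W(N, (N + m) // 2, p)    # n1 = (N + m)/2

print(P(3, 1))   # -> 0.375 = 3/8, matching the N = 3 example below
```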

As a trivial example, suppose that $p = q = \frac{1}{2}$ and $N = 3$ steps:
$P_3(m) = \frac{3!}{[\frac{1}{2}(3+m)]!\,[\frac{1}{2}(3-m)]!} \left(\frac{1}{2}\right)^3$
So $P_3(3) = P_3(-3) = \frac{3!}{3!\,0!}\left(\frac{1}{8}\right) = \frac{1}{8}$ and $P_3(1) = P_3(-1) = \frac{3!}{2!\,1!}\left(\frac{1}{8}\right) = \frac{3}{8}$.
Possible step sequences:
$n_1 = 3$, $n_2 = 0$ ⇒ $m = 3$ (1 sequence)
$n_1 = 2$, $n_2 = 1$ ⇒ $m = 1$ (3 sequences)
$n_1 = 1$, $n_2 = 2$ ⇒ $m = -1$ (3 sequences)
$n_1 = 0$, $n_2 = 3$ ⇒ $m = -3$ (1 sequence)

As another example, suppose that $p = q = \frac{1}{2}$ and $N = 20$:
$P_{20}(m) = \frac{20!}{[\frac{1}{2}(20+m)]!\,[\frac{1}{2}(20-m)]!} \left(\frac{1}{2}\right)^{20}$
Doing this gives a histogram whose envelope is a bell-shaped curve. The significance of this is that, after N random steps, the probability of the particle being a distance of N steps away from the start is very small, while the probability of it being at or near the origin is relatively large:
$P_{20}(20) = \frac{20!}{20!\,0!}\left(\frac{1}{2}\right)^{20} \approx 9.5 \times 10^{-7}$
$P_{20}(0) = \frac{20!}{(10!)^2}\left(\frac{1}{2}\right)^{20} \approx 1.8 \times 10^{-1}$
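Both quoted values are easy to verify; a short self-contained check for $p = q = \frac{1}{2}$ (re-declaring the illustrative function from the previous sketch):

```python
from math import comb

def P(N: int, m: int) -> float:
    """P_N(m) for the symmetric walk, p = q = 1/2."""
    if abs(m) > N or (N + m) % 2 != 0:
        return 0.0
    return comb(N, (N + m) // 2) * 0.5**N

print(f"P_20(20) = {P(20, 20):.2e}")   # ~9.5e-07: all 20 steps to the right
print(f"P_20(0)  = {P(20, 0):.3f}")    # ~0.176 ~ 1.8e-01: back at the origin

# The bell-shaped envelope (even m only, since N = 20 is even):
for m in range(0, 21, 4):
    print(f"m = ±{m:2d}:  P = {P(20, m):.6f}")
```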

Sect. 1.3: General Discussion of Mean Values

The Binomial Distribution is only one example of a probability distribution. Now we'll begin a discussion of a general distribution; most of the following results are valid for ANY probability distribution. Let u be a variable which can take on any of M discrete values $u_1, u_2, u_3, \ldots, u_{M-1}, u_M$ with probabilities $P(u_1), P(u_2), P(u_3), \ldots, P(u_{M-1}), P(u_M)$. The Mean (average) value of u is defined as $\bar{u} \equiv \langle u \rangle \equiv S_2/S_1$, where
$S_1 \equiv P(u_1) + P(u_2) + \cdots + P(u_M) \equiv \sum_i P(u_i)$
$S_2 \equiv u_1 P(u_1) + u_2 P(u_2) + \cdots + u_M P(u_M) \equiv \sum_i u_i P(u_i)$
For a properly normalized distribution, $S_1 = \sum_i P(u_i) = 1$; we assume this from now on.

Sometimes $\bar{u}$ is called the 1st moment of $P(u)$. If $O(u)$ is any function of u, the mean value of $O(u)$ is $\bar{O} \equiv \langle O \rangle \equiv \sum_i O(u_i) P(u_i)$. Some simple mean values are useful for describing the probability distribution $P(u)$: 1. The mean value $\bar{u}$. This is a measure of the central value of u about which the various values $u_i$ are distributed. Consider the quantity $\Delta u \equiv u - \bar{u}$ (deviation from the mean). Its mean is $\langle \Delta u \rangle = \langle u - \bar{u} \rangle = \bar{u} - \bar{u} = 0$ ⇒ The mean value of the deviation from the mean is always zero!

2. Now, let's look at $(\Delta u)^2 = (u - \langle u \rangle)^2$ (the square of the deviation from the mean). Its mean value is:
$\langle (\Delta u)^2 \rangle = \langle (u - \langle u \rangle)^2 \rangle = \langle u^2 - 2u\bar{u} + \bar{u}^2 \rangle = \langle u^2 \rangle - 2\langle u \rangle \langle u \rangle + \langle u \rangle^2 = \langle u^2 \rangle - \langle u \rangle^2$
This is called the "Mean Square Deviation" (from the mean). It also goes by several other (equivalent!) names: the Dispersion, the Variance, or the 2nd Moment of $P(u)$ about the mean. $\langle (\Delta u)^2 \rangle$ is a measure of the spread of the u values about the mean $\bar{u}$. NOTE that $\langle (\Delta u)^2 \rangle = 0$ if & only if $u_i = \bar{u}$ for all i with $P(u_i) \ne 0$. It is easily shown that $\langle (\Delta u)^2 \rangle \ge 0$, i.e. $\langle u^2 \rangle \ge \langle u \rangle^2$.
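The mean and the dispersion are one-liners in code. A sketch for an arbitrary normalized discrete distribution (all names illustrative), checked on a fair six-sided die:

```python
def mean(values, probs):
    """First moment <u> of a normalized discrete distribution."""
    return sum(u * P for u, P in zip(values, probs))

def variance(values, probs):
    """Dispersion <(Δu)^2> = <u^2> - <u>^2, computed directly."""
    u_bar = mean(values, probs)
    return sum((u - u_bar) ** 2 * P for u, P in zip(values, probs))

# Example: a fair six-sided die.
u  = [1, 2, 3, 4, 5, 6]
Pu = [1 / 6] * 6
print(mean(u, Pu))      # 3.5
print(variance(u, Pu))  # ~2.9167 = 35/12
```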

We could also define the nth moment of $P(u)$ about the mean: $\langle (\Delta u)^n \rangle \equiv \langle (u - \langle u \rangle)^n \rangle$. This is rarely used beyond n = 2, and almost never beyond n = 3 or 4. NOTE: knowledge of the probability distribution function $P(u)$ gives complete information about the distribution of the values of u. But knowledge of only a few moments, like knowing just $\bar{u}$ & $\langle (\Delta u)^2 \rangle$, implies only partial (though useful) knowledge of the distribution. Knowledge of only some moments is not enough to uniquely determine $P(u)$. Math Theorem: in order to uniquely determine a distribution $P(u)$, we need to know ALL of its moments, n = 0, 1, 2, 3, ..., ∞.

Section 1.4: Calculation of Mean Values for the Random Walk Problem (Along the way, we'll discuss a few math "tricks" for doing discrete sums!) We've found that the probability in N steps of making $n_1$ to the right & $n_2 = N - n_1$ to the left is the Binomial Distribution:
$W_N(n_1) = \frac{N!}{n_1! n_2!} p^{n_1} q^{n_2}$
where p = the probability of a step to the right & $q = 1 - p$ = the probability of a step to the left. First, let's verify normalization: is $\sum_{n_1=0}^{N} W_N(n_1) = 1$? Recall the binomial expansion:
$(p + q)^N = \sum_{n_1=0}^{N} \frac{N!}{n_1! n_2!} p^{n_1} q^{n_2} = \sum_{n_1=0}^{N} W_N(n_1)$
But $p + q = 1$, so $(p + q)^N = 1$ & therefore $\sum_{n_1=0}^{N} W_N(n_1) = 1$.

Question 1: What is the mean number of steps to the right?
$\langle n_1 \rangle \equiv \sum_{n_1=0}^{N} n_1 W_N(n_1) = \sum_{n_1=0}^{N} n_1 \frac{N!}{n_1!(N-n_1)!} p^{n_1} q^{N-n_1}$   (1)
We can do this sum by looking it up in a table, OR we can use a "trick", as follows. This is a general procedure which usually works, even if it doesn't always have mathematical "rigor". Temporarily treat p & q as arbitrary continuous variables, ignoring the fact that $p + q = 1$. If p is a continuous variable, then clearly:
$n_1 p^{n_1} \equiv p \frac{\partial}{\partial p}\, p^{n_1}$
Now use this in (1), interchanging the sum & the derivative:
$\langle n_1 \rangle = \sum_{n_1=0}^{N} \frac{N!}{n_1!(N-n_1)!}\, p \frac{\partial}{\partial p}\, p^{n_1} q^{N-n_1} = p \frac{\partial}{\partial p} \sum_{n_1=0}^{N} \frac{N!}{n_1!(N-n_1)!} p^{n_1} q^{N-n_1} = p \frac{\partial}{\partial p} (p + q)^N = pN(p + q)^{N-1}$
But for our special case, $p + q = 1$, so $(p + q)^{N-1} = 1$ and $\langle n_1 \rangle = Np$.
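The differentiation trick can be checked symbolically. A sketch using SymPy (assuming it is installed; N = 6 is an illustrative value, the identity holds for any N), treating p & q as independent symbols and setting $p + q = 1$ only at the end:

```python
import sympy as sp

p, q = sp.symbols('p q', positive=True)
N = 6  # illustrative; the identity holds for any N

# <n1> computed directly as the sum over n1 of n1 * C(N, n1) p^n1 q^(N-n1):
mean_n1 = sum(k * sp.binomial(N, k) * p**k * q**(N - k) for k in range(N + 1))

# The trick: n1 p^n1 = p d/dp p^n1, so <n1> = p d/dp (p+q)^N = N p (p+q)^(N-1)
trick = sp.expand(p * sp.diff((p + q)**N, p))
assert sp.simplify(mean_n1 - trick) == 0

# Setting p + q = 1 gives <n1> = N p:
print(sp.simplify(mean_n1.subs(q, 1 - p)))  # -> 6*p
```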

Summary: The mean number of steps to the right is $\langle n_1 \rangle = Np$. (We might have guessed this!) Similarly, we can easily show that the mean number of steps to the left is $\langle n_2 \rangle = Nq$. Of course, $\langle n_1 \rangle + \langle n_2 \rangle = N(p + q) = N$, as it should be! Question 2: What is the mean displacement, $\langle x \rangle = \langle m \rangle \ell$? Clearly, $m = n_1 - n_2$, so $\langle m \rangle = \langle n_1 \rangle - \langle n_2 \rangle = N(p - q)$. So, if $p = q = \frac{1}{2}$, $\langle m \rangle = 0$ and $\langle x \rangle = \langle m \rangle \ell = 0$.

Question 3: What is the dispersion (or variance) $\langle (\Delta n_1)^2 \rangle = \langle (n_1 - \langle n_1 \rangle)^2 \rangle$ in the number of steps to the right? That is, what is the spread in $n_1$ values about $\langle n_1 \rangle$? Our general discussion has shown that $\langle (\Delta n_1)^2 \rangle = \langle n_1^2 \rangle - \langle n_1 \rangle^2$, and we've just seen that $\langle n_1 \rangle = Np$. So we first need to calculate the quantity $\langle n_1^2 \rangle$:
$\langle n_1^2 \rangle = \sum_{n_1=0}^{N} n_1^2 W_N(n_1) = \sum_{n_1=0}^{N} n_1^2 \frac{N!}{n_1!(N-n_1)!} p^{n_1} q^{N-n_1}$   (2)
Use a similar "trick" as before & note that:
$n_1^2 p^{n_1} \equiv \left(p \frac{\partial}{\partial p}\right)^2 p^{n_1}$

(Δ*n1)/<n1> = [Npq]½(Np) = (q½)(pN)½ After algebra (in the book) & using p + q = 1, we find: <(n1)2> = (Np)2 + Npq = (<n1>)2 + Npq So, finally, using <(Δn1)2> = <(n1)2> - (<n1>)2 This is the dispersion or variance of the binomial distribution. The root mean square (rms) deviation from the mean is defined as: (Δ*n1)  [<(Δn1)2>]½ (in general). For the binomial distribution this is (Δ*n1) = [Npq]½  The distribution width Again note that: <n1> = Np. So, the relative width of the distribution is: (Δ*n1)/<n1> = [Npq]½(Np) = (q½)(pN)½ If p = q, this is: (Δ*n1)/<n1> = 1(N)½ = (N)-½  As N increases, the mean value increases  N but the relative width decreases  (N)-½ <(Δn1)2> = Npq

Question 4: What is the dispersion $\langle (\Delta m)^2 \rangle = \langle (m - \langle m \rangle)^2 \rangle$ in the net displacement $x = m\ell$? That is, what is the spread in m values about $\langle m \rangle$? We had $m = n_1 - n_2 = 2n_1 - N$, so $\langle m \rangle = 2\langle n_1 \rangle - N$, and
$\Delta m = m - \langle m \rangle = (2n_1 - N) - (2\langle n_1 \rangle - N) = 2(n_1 - \langle n_1 \rangle) = 2\,\Delta n_1$
Thus $(\Delta m)^2 = 4(\Delta n_1)^2$, so $\langle (\Delta m)^2 \rangle = 4\langle (\Delta n_1)^2 \rangle$. Using $\langle (\Delta n_1)^2 \rangle = Npq$, this becomes:
$\langle (\Delta m)^2 \rangle = 4Npq$
If $p = q = \frac{1}{2}$, $\langle (\Delta m)^2 \rangle = N$.

Summary: 1-Dimensional Random Walk Problem
Probability distribution is binomial: $W_N(n_1) = \frac{N!}{n_1! n_2!} p^{n_1} q^{n_2}$
Mean number of steps to the right: $\langle n_1 \rangle = Np$
Dispersion in $n_1$: $\langle (\Delta n_1)^2 \rangle = Npq$
Relative width: $\frac{\Delta^* n_1}{\langle n_1 \rangle} = \left(\frac{q}{pN}\right)^{1/2}$ ⇒ as N increases, the mean value increases ∝ N while the relative width decreases ∝ $N^{-1/2}$.

Some General Comments about the Binomial Distribution

The Binomial Distribution applies to cases where there are only two possible outcomes: head or tail, success or failure, defective item or good item, etc. Requirements justifying the use of the Binomial Distribution:
1. The experiment must consist of n identical trials.
2. Each trial must result in only one of two possible outcomes.
3. The outcomes of the trials must be statistically independent.
4. All trials must have the same probability for a particular outcome.

Common Notation for the Binomial Distribution r items of one type and (n − r) of a second type can be arranged in $_nC_r$ ways, where
$_nC_r \equiv \binom{n}{r} = \frac{n!}{r!(n-r)!}$
is called the binomial coefficient. In this notation, the probability distribution can be written
$W_n(r) = {}_nC_r\, p^r (1-p)^{n-r}$
≡ the probability of finding r items of one type & (n − r) items of the other type, where p = the probability of a given item being of the first type.

Binomial Distribution Example Problem: A sample of n = 11 electric bulbs is drawn every day from those manufactured at a plant. Bulbs are defective at random & independently of previous results, with probability p = 0.04 that a given bulb is defective.
1. What is the probability of finding exactly three defective bulbs in a sample? (Probability that r = 3?)
2. What is the probability of finding three or more defective bulbs in a sample? (Probability that r ≥ 3?)

Binomial Distribution, n = 11
No. of defective bulbs, r | Probability $_{11}C_r\, p^r (1-p)^{n-r}$
0 | $_{11}C_0 (0.04)^0 (0.96)^{11}$ = 0.6382
1 | $_{11}C_1 (0.04)^1 (0.96)^{10}$ = 0.2925
2 | $_{11}C_2 (0.04)^2 (0.96)^{9}$ = 0.0609
3 | $_{11}C_3 (0.04)^3 (0.96)^{8}$ = 0.0076

Question 1: Probability of finding exactly three defective bulbs in a sample?
P(r = 3 defective bulbs) = $W_{11}(3)$ = 0.0076
Question 2: Probability of finding three or more defective bulbs in a sample?
P(r ≥ 3 defective bulbs) = $1 - W_{11}(0) - W_{11}(1) - W_{11}(2)$ = 1 − 0.638239 − 0.292526 − 0.060943 ≈ 0.0083
(Using the four-decimal table entries above gives 0.0084; the difference is rounding.)
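The whole bulb calculation fits in a short script (reproducing the table values; names illustrative):

```python
from math import comb

n, p = 11, 0.04

def W(r: int) -> float:
    """Probability of exactly r defective bulbs in a sample of n."""
    return comb(n, r) * p**r * (1 - p)**(n - r)

for r in range(4):
    print(f"W({r}) = {W(r):.6f}")                     # 0.638239, 0.292526, ...

print(f"P(r = 3)  = {W(3):.4f}")                      # 0.0076
print(f"P(r >= 3) = {1 - W(0) - W(1) - W(2):.4f}")    # 0.0083
```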

Binomial Distribution, Same Problem, Larger r
No. of defective bulbs, r | Probability $_{11}C_r\, p^r (1-p)^{n-r}$
0 | $_{11}C_0 (0.04)^0 (0.96)^{11}$ = 0.638239
1 | $_{11}C_1 (0.04)^1 (0.96)^{10}$ = 0.292526
2 | $_{11}C_2 (0.04)^2 (0.96)^{9}$ = 0.060943
3 | $_{11}C_3 (0.04)^3 (0.96)^{8}$ = 0.007618
4 | $_{11}C_4 (0.04)^4 (0.96)^{7}$ = 0.000635
5 | $_{11}C_5 (0.04)^5 (0.96)^{6}$ = 0.000037

Binomial Distribution (figures: plots of the distribution; not reproduced here)

Binomial distribution with n = 10, p = 0.5. Draw with your finger an approximate normal curve over the histogram.
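Instead of tracing it by hand, the bell-shaped envelope can be compared against a Gaussian with the matched mean Np and variance Npq; a sketch (the correspondence is made precise by the normal approximation to the binomial):

```python
from math import comb, exp, pi, sqrt

n, p = 10, 0.5
mu, sigma = n * p, sqrt(n * p * (1 - p))  # matched mean and width

for k in range(n + 1):
    binom  = comb(n, k) * p**k * (1 - p)**(n - k)
    # Gaussian with the same mean and variance, evaluated at k:
    normal = exp(-((k - mu) ** 2) / (2 * sigma**2)) / (sigma * sqrt(2 * pi))
    print(f"k = {k:2d}   binomial = {binom:.4f}   normal ~ {normal:.4f}")
```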

“Wandering Photon” Animation found on the Internet!

The "Wandering Photon": walks straight for a random length; stops (is absorbed) with probability γ; turns in a random direction with probability (1 − γ).

One Dimension

After a random length x: with probability γ the photon stops; with probability (1 − γ)/2 it continues in each direction.

What is P(photon absorbed at x)? It is determined by the pdf of the length of the first step; here 1/h is the average step length & γ is the absorption probability.

P(photon absorbed at x) = f(|x|, γ, h)
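A Monte Carlo sketch of the one-dimensional wandering photon. The step-length pdf is not written out in this transcript; the sketch below assumes exponentially distributed step lengths with mean 1/h, which is consistent with "1/h is the average step length" but is still an assumption, and all names are illustrative.

```python
import random

def photon_absorption_point(gamma: float, h: float) -> float:
    """Simulate one 1-D wandering photon and return where it is absorbed.

    Assumes (illustrative) exponentially distributed step lengths with
    mean 1/h. After each step the photon is absorbed with probability
    gamma; otherwise it continues left or right, each with probability
    (1 - gamma)/2.
    """
    x, direction = 0.0, random.choice((-1, 1))
    while True:
        x += direction * random.expovariate(h)  # step of random length
        if random.random() < gamma:
            return x                             # absorbed at x
        direction = random.choice((-1, 1))       # pick a new direction

# Estimate the mean distance from the source at absorption (gamma = 0.1, h = 1):
samples = [photon_absorption_point(0.1, 1.0) for _ in range(10_000)]
print(sum(abs(x) for x in samples) / len(samples))
```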

The sleepy drunk in higher dimensions

After a random length: with probability γ the photon stops; with probability (1 − γ) it picks a random direction & continues.

The sleepy drunk in higher dimensions: at radial distance r, P(absorbed at r) = f(r, γ, h).