Probability & Statistics

1 Probability & Statistics
Brief Overview of Statistical Methods (at most, 2 classes) Note: Many more details on the probability topics discussed here may be found on the Lecture page for my Physics 4302 course:

2 Math of Probability & Statistics.
Brief discussion of the pure Math of Probability & Statistics. "The true logic of this world is in the calculus of probabilities." (James Clerk Maxwell) "Misunderstanding of probability may be the greatest of all impediments to scientific literacy." (Stephen Jay Gould)

3 Relevance of Probability to Physics
In this course, we'll discuss The Physics of systems containing HUGE numbers (>> 10²³) of particles: Solids, Liquids, Gases, EM Radiation (photons & other quantum particles). Challenge: Describe a system's Macroscopic characteristics starting from a Microscopic theory.

4 GOAL: Formulate a theory to describe
a system's Macroscopic characteristics starting from a Microscopic theory. Classical Mechanics: Newton's Laws. Need to solve >> 10²³ coupled Newton's 2nd Law differential equations of motion! (ABSURD!!) Quantum Mechanics: Schrödinger's Equation. Need a solution for >> 10²³ particles! (ABSURD!!)

5 Average System Properties
Historically, this led to the use of a Statistical description of such a system. So, we'll talk about Probabilities & Average System Properties. We AREN'T concerned with the detailed behavior of individual particles.

6 Microscopic: ~ Atomic sized; ~ ≤ a few Å
Definitions: Microscopic: ~ Atomic sized; ~ ≤ a few Å. Macroscopic: Large enough to be "visible" in the "ordinary" sense. An Isolated System is in Equilibrium when its Macroscopic parameters are time-independent. The usual case in this course! But, note! Even if its Macroscopic parameters are time-independent, a system's Microscopic parameters can & probably will still vary with time!

7 Now, some Basic Math of Probability & Statistics
"The most important questions of life are, for the most part, really only problems of probability." Pierre Simon Laplace, "Théorie Analytique des Probabilités", 1812

8 The Binomial Probability Distribution
The following should hopefully be a review! (?) Keep in mind: Whenever we want to describe a situation using probability & statistics, we must consider an assembly of a large number N (in principle, N → ∞) of "similarly prepared systems". This assembly is called an ENSEMBLE ("Ensemble" = French word for Assembly). The Probability of an occurrence of a particular event is DEFINED with respect to this particular ensemble & is given by the fraction of systems in the ensemble characterized by the occurrence of this event.

9 Example: In throwing a pair of dice, give a statistical description by considering that a very large number N of similar pairs of dice are thrown under similar circumstances. Alternatively, we could imagine the same pair of dice thrown N times under similar circumstances. The probability of obtaining two 1’s is then given by the fraction of these experiments in which two 1’s is the outcome. Note that this probability depends strongly on the type of ensemble to which we are referring.
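A minimal Monte Carlo sketch of this ensemble idea (the trial count & seed below are illustrative choices, not from the slides): simulate a large number of similar dice-pair throws & take the fraction in which two 1's is the outcome.

```python
# Estimate P(two 1's) as the fraction of an ensemble of simulated throws.
import random

random.seed(0)            # illustrative seed, for reproducibility
N = 1_000_000             # number of "similarly prepared systems"
hits = 0
for _ in range(N):
    d1 = random.randint(1, 6)
    d2 = random.randint(1, 6)
    if d1 == 1 and d2 == 1:
        hits += 1
print(f"Estimated P(two 1's) = {hits / N:.4f}  (exact: 1/36 = {1/36:.4f})")
```

As N grows, the fraction converges to the ensemble probability 1/36.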

10 The Binomial Probability Distribution
To quantitatively introduce probability concepts, we use a specific, simple example, which is much more general than you might think. This is: The Binomial Probability Distribution This is illustrated in Ch. 1 of the supplemental book by Reif, where he discusses the “1 Dimensional Random Walk Problem” It is also discussed in a HUGE number of other books & in many places on the web. In the following, details of derivations will, for the most part, not be shown. Instead, the results will be summarized. See other sources for derivation details.

11 The random walk problem can be viewed as in the figure
The One-Dimensional Random Walk. In its simplest, crudest, most idealized form, the random walk problem can be viewed as in the figure. The classic story: a drunk starts out from a lamp post on a street. Obviously, he wants to move down the sidewalk to get somewhere!!

12 So the drunk starts out from a lamp post on a street.
Each step he takes is of equal length ℓ. He is SO DRUNK that the direction of each step (right or left) is completely independent of the preceding step. The (assumed known) probability of stepping to the right is p & of stepping to the left is q = 1 – p. In general, q ≠ p. The x axis is along the sidewalk & the lamp post is at x = 0. Each step is of length ℓ, so his location on the x axis must be x = mℓ, where m = a positive or negative integer.

13 Question: After N steps, what is the probability that the man is at a specific location x = mℓ (m specified)? To answer, we first consider an ensemble of a large number of drunk men (distinct from the step number N) starting from similar lamp posts!! Or, repeat this with the same drunk man walking on the sidewalk a large number of times!! (See the sketch below.)
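A sketch of how this ensemble question could be answered numerically; p, the step number N, & the target m below are illustrative values, not fixed by the slides, & the "exact" line anticipates the binomial formula derived on later slides.

```python
# Simulate an ensemble of N-step walks & count the fraction ending at x = m*l.
import random
from math import comb

random.seed(0)                     # illustrative seed
p, N, trials = 0.5, 20, 200_000    # illustrative parameters
m_target = 4                       # ask for P(x = 4*l) after N = 20 steps

count = 0
for _ in range(trials):
    n1 = sum(1 for _ in range(N) if random.random() < p)  # steps to the right
    m = 2 * n1 - N                                        # net displacement
    if m == m_target:
        count += 1

exact = comb(N, (N + m_target) // 2) \
        * p**((N + m_target) // 2) * (1 - p)**((N - m_target) // 2)
print(f"Ensemble estimate: {count / trials:.4f}   Exact binomial: {exact:.4f}")
```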

14 “What is the probability that the resultant has a certain magnitude
This is "easily generalized" to 2 dimensions, as shown schematically in the figure. The 2-dimensional random walk corresponds to a PHYSICS problem of adding N 2-dimensional vectors of equal length (figure) & random directions & asking: "What is the probability that the resultant has a certain magnitude & a certain direction?"
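A minimal sketch of this 2-dimensional version (N & the trial count are illustrative): add N unit-length vectors with random directions & look at the magnitude of the resultant. The √N comparison is the standard random-walk result, quoted here for orientation rather than derived on this slide.

```python
# Add N random-direction unit vectors; examine the resultant's magnitude R.
import math
import random

random.seed(0)                  # illustrative seed
N, trials = 100, 10_000         # illustrative parameters
R_squared_sum = 0.0
for _ in range(trials):
    x = y = 0.0
    for _ in range(N):
        theta = random.uniform(0.0, 2.0 * math.pi)   # random direction
        x += math.cos(theta)
        y += math.sin(theta)
    R_squared_sum += x * x + y * y

rms_R = math.sqrt(R_squared_sum / trials)
print(f"RMS resultant after {N} unit steps: {rms_R:.2f} "
      f"(compare sqrt(N) = {math.sqrt(N):.2f})")
```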

15 Physical Examples to which the Random Walk Problem applies:
1. Magnetism (Quantum Treatment) N atoms, each with magnetic moment μ. Each has spin ½. By Quantum Mechanics, each magnetic moment can point either “up” or “down”. If these are equally likely, what is the Net magnetic moment of the N atoms? 2. Diffusion of a Molecule of Gas (Classical) A molecule travels in 3 dimensions with a mean distance ℓ between collisions. How far is it likely to have traveled after N collisions? Answer using Classical Mechanics.
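For example 1, a hedged numerical sketch (N, μ, & the trial count are illustrative choices; the √N·μ comparison is the standard result for independent, equally likely spins, not derived on this slide):

```python
# Net moment of N independent spin-1/2 moments, each +mu or -mu with
# probability 1/2.
import random

random.seed(0)                   # illustrative seed
N, mu, trials = 1000, 1.0, 20_000
moments = []
for _ in range(trials):
    n_up = sum(1 for _ in range(N) if random.random() < 0.5)
    moments.append((2 * n_up - N) * mu)     # (n_up - n_down) * mu

mean = sum(moments) / trials
rms = (sum(M * M for M in moments) / trials) ** 0.5
print(f"Mean net moment ~ {mean:.2f} mu;  RMS ~ {rms:.1f} mu "
      f"(compare sqrt(N) mu = {N ** 0.5:.1f} mu)")
```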

16 Statistical Mechanics.
The Random Walk Problem is a simple example which illustrates some fundamental results of Probability Theory. The techniques used are Powerful & General. They are used repeatedly throughout Statistical Mechanics. So, it's important to spend a bit of time on this problem & to understand it!

17 1-Dimensional Random Walk
Forget the drunk! Go back to Physics! Think of a particle moving in 1 dimension in steps of length ℓ, with 1. Probability p of stepping to the right & 2. Probability q = 1 – p of stepping to the left. After N steps, the particle is at position x = mℓ (m = integer; –N ≤ m ≤ N). Let n1 ≡ # of steps to the right (out of N) & n2 ≡ # of steps to the left.

18 After N steps, x = mℓ (- N ≤ m ≤ N).
Let n1 ≡ # of steps to the right (out of N). Let n2 ≡ # of steps to the left. Clearly, N = n1 + n2 (1) Clearly also, x ≡ mℓ = (n1 – n2)ℓ or, m = n1 – n2 (2) Combining (1) & (2) gives: m = 2n1 – N (3) So, if N is odd, so is m, & if N is even, so is m.
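A quick numerical check of Eqs. (1)-(3) (the choice N = 5 is arbitrary): for every split of N steps into n1 right & n2 left, m = n1 – n2 equals 2n1 – N & has the same parity as N.

```python
# Verify Eqs. (1)-(3) for all possible step splits of an N-step walk.
N = 5                          # illustrative; any positive integer works
for n1 in range(N + 1):
    n2 = N - n1                # Eq. (1): N = n1 + n2
    m = n1 - n2                # Eq. (2): m = n1 - n2
    assert m == 2 * n1 - N     # Eq. (3): m = 2*n1 - N
    assert (m + N) % 2 == 0    # m and N have the same parity
print(f"Eqs. (1)-(3) verified for every n1 with N = {N}")
```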

19 A Fundamental Assumption is that Successive Steps are
Statistically Independent. Let p ≡ the probability of stepping to the right & q = 1 – p ≡ the probability of stepping to the left (p + q = 1). Since each step is statistically independent, the probability of a given sequence of n1 steps to the right followed by n2 steps to the left is given by multiplying the respective probabilities for each step: p·p···p · q·q···q = p^n1 q^n2. (N = n1 + n2)

20 W_N(n1) = [N!/(n1! n2!)] p^n1 q^n2
A Detailed Derivation of the Probability Distribution Gives: The probability W_N(n1) of taking N steps, n1 to the right & n2 (= N – n1) to the left, is W_N(n1) = [N!/(n1! n2!)] p^n1 q^n2 or W_N(n1) = {N!/[n1!(N – n1)!]} p^n1 (1 – p)^(N – n1) Often, this is written with the binomial coefficient: W_N(n1) = (N choose n1) p^n1 q^n2. Remember that q = 1 – p.
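The formula transcribes directly into code; a minimal sketch using Python's math.comb for the binomial coefficient N!/[n1!(N – n1)!]:

```python
# W(N, n1, p): probability of n1 right-steps (and N - n1 left-steps)
# out of N statistically independent steps.
from math import comb

def W(N: int, n1: int, p: float) -> float:
    return comb(N, n1) * p**n1 * (1.0 - p)**(N - n1)

print(W(3, 2, 0.5))   # 3 * (1/2)^3 = 0.375, i.e. 3/8
```

The value 3/8 reappears as P_3(1) in the N = 3 example a few slides below.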

21 (p + q)^N = Σ_(n1=0..N) [N!/(n1!(N – n1)!)] p^n1 q^(N – n1)
W_N(n1) = {N!/[n1!(N – n1)!]} p^n1 (1 – p)^(N – n1) This probability distribution is called the Binomial Distribution. This is because the Binomial Expansion has the form (p + q)^N = Σ_(n1=0..N) [N!/(n1!(N – n1)!)] p^n1 q^(N – n1) Change variables & ask for the probability P_N(m) that x = mℓ after N steps: P_N(m) = W_N(n1). But m = 2n1 – N, so n1 = (½)(N + m) & n2 = N – n1 = (½)(N – m). Results on the next slide:
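The normalization implied by the binomial expansion (with p + q = 1) is easy to verify numerically; this sketch repeats the small W() helper from above so it runs on its own:

```python
# Check that W_N(n1) sums to (p + q)^N = 1 over n1 = 0..N.
from math import comb

def W(N, n1, p):
    return comb(N, n1) * p**n1 * (1.0 - p)**(N - n1)

for p in (0.5, 0.3):             # illustrative values of p
    total = sum(W(20, n1, p) for n1 in range(21))
    print(f"p = {p}: sum of W_20(n1) = {total:.12f}")   # -> 1.0
```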

22 Binomial Distribution
P_N(m) = {N!/([(½)(N + m)]! [(½)(N – m)]!)} p^[(½)(N + m)] (1 – p)^[(½)(N – m)] For the common case of p = q = ½, this is: P_N(m) = {N!/([(½)(N + m)]! [(½)(N – m)]!)} (½)^N This is the usual form of the Binomial Distribution, which is probably the most elementary (discrete) probability distribution.

23 P_3(m) = {3!/([(½)(3 + m)]! [(½)(3 – m)]!)} (½)^3
As a trivial example, suppose that p = q = ½ & N = 3 steps. This gives P_3(m) = {3!/([(½)(3 + m)]! [(½)(3 – m)]!)} (½)^3 So P_3(3) = P_3(–3) = [3!/(3!0!)](⅛) = ⅛ & P_3(1) = P_3(–1) = [3!/(2!1!)](⅛) = ⅜
Table of Possible Step Sequences (N = 3):
n1  n2  m = n1 – n2
 3   0    3
 2   1    1
 1   2   –1
 0   3   –3
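A short sketch reproducing this N = 3 example via the change of variables n1 = (½)(N + m); the W() helper is repeated so the snippet is self-contained:

```python
# Compute P_3(m) for the four reachable positions m = -3, -1, 1, 3.
from math import comb

def W(N, n1, p):
    return comb(N, n1) * p**n1 * (1.0 - p)**(N - n1)

N, p = 3, 0.5
for m in (-3, -1, 1, 3):
    n1 = (N + m) // 2                       # n1 = (1/2)(N + m)
    print(f"P_3({m:+d}) = {W(N, n1, p)}")   # 0.125, 0.375, 0.375, 0.125
```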

24 P_20(m) = {20!/([(½)(20 + m)]! [(½)(20 – m)]!)} (½)^20
Another example: let p = q = ½ & N = 20. This gives P_20(m) = {20!/([(½)(20 + m)]! [(½)(20 – m)]!)} (½)^20 Calculation gives the histogram results in the figure. P_20(20) = [20!/(20!0!)](½)^20 ≈ 9.5 × 10^–7 P_20(0) = [20!/(10!)²](½)^20 ≈ 1.8 × 10^–1
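The two quoted values check out numerically (again repeating the W() helper so the snippet stands alone):

```python
# Reproduce P_20(20) and P_20(0) for p = q = 1/2.
from math import comb

def W(N, n1, p):
    return comb(N, n1) * p**n1 * (1.0 - p)**(N - n1)

N, p = 20, 0.5
print(f"P_20(20) = {W(N, 20, p):.2e}")   # ~ 9.5e-07
print(f"P_20(0)  = {W(N, 10, p):.2e}")   # ~ 1.8e-01
```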

25 P_20(m) = {20!/([(½)(20 + m)]! [(½)(20 – m)]!)} (½)^20
For this same case: P_20(m) = {20!/([(½)(20 + m)]! [(½)(20 – m)]!)} (½)^20 Notice: The "envelope" of the histogram is a "bell-shaped" curve. The significance of this is that, after N random steps, the probability of the particle being a distance of N steps away from the start becomes very small, & the probability of it being at or near the origin becomes relatively large: P_20(20) = [20!/(20!0!)](½)^20 ≈ 9.5 × 10^–7 P_20(0) = [20!/(10!)²](½)^20 ≈ 1.8 × 10^–1

