Introduction to Statistical Methods


1 Introduction to Statistical Methods
Chapter 1 Introduction to Statistical Methods

2 The Math of Probability & Statistics
Chapter 1. Some pure math discussion for a while: the math of probability & statistics. "The true logic of this world is in the calculus of probabilities." (James Clerk Maxwell)

3 Relevance of Probability to Physics
In this course, we'll discuss the Physics of systems containing HUGE numbers (>> 10²³) of particles: solids, liquids, gases, EM radiation (photons & other quantum particles), ... The Challenge: Describe a system's Macroscopic characteristics starting from a Microscopic theory.

4 From Microscopic Theory to Macroscopic Behavior
GOAL: Formulate a theory to describe a system's Macroscopic characteristics starting from a Microscopic theory. Classical Mechanics: Newton's Laws. We'd need to solve >> 10²³ coupled differential equations of motion from Newton's 2nd Law! (ABSURD!!) Quantum Mechanics: Schrödinger's Equation. We'd need a solution for >> 10²³ particles! (ABSURD!!)

5 Average System Properties
Historically, this led to the use of a Statistical description of such a system.  So, we’ll talk about Probabilities & Average System Properties We are NOT concerned with the detailed behavior of individual particles.

6 Definitions
Microscopic: ~ atomic dimensions (≤ a few Å). Macroscopic: large enough to be "visible" in the "ordinary" sense.

7 Definitions: An Isolated System is in Equilibrium when its Macroscopic parameters are time-independent. This is the usual case in this course! But, note! Even if its Macroscopic parameters are time-independent, a system's Microscopic parameters can & probably will still vary with time!

8 Now, some Basic (Simple) Math of Probability & Statistics

9 Random Walk → Binomial Distribution
Section 1.1 Elementary Statistical Concepts & Examples. Math preliminaries (methods) for the next few lectures. To treat statistical physics problems, we must first know something about the mathematics of Probability & Statistics. The following should hopefully be a review! (?) Keep in mind: Whenever we want to describe a situation using probability & statistics, we must consider an assembly of a large number N (in principle, N → ∞) of "similarly prepared systems".

10 This assembly is called an ENSEMBLE
(“Ensemble” = French word for Assembly). The Probability of an occurrence of a particular event is DEFINED with respect to this particular ensemble & is given by the fraction of systems in the ensemble characterized by the occurrence of this event.

11 Example In throwing a pair of dice, we can give a statistical description by considering a very large number N of similar pairs of dice thrown under similar circumstances. Alternatively, we could imagine the same pair of dice thrown N times under similar circumstances. The probability of obtaining two 1's is then given by the fraction of these experiments in which the outcome is two 1's.
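The ensemble definition above is easy to illustrate numerically. The short Python sketch below (not part of the original slides; the function name and trial count are illustrative) estimates the probability of two 1's as the fraction of a large ensemble of similarly prepared throws; it should approach the exact value (1/6)² = 1/36 ≈ 0.028 as the number of throws grows.

```python
import random

def fraction_double_ones(n_trials, seed=0):
    """Estimate P(two 1's) as the fraction of an ensemble of
    n_trials similarly prepared pair-of-dice throws."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        d1, d2 = rng.randint(1, 6), rng.randint(1, 6)
        if d1 == 1 and d2 == 1:
            hits += 1
    return hits / n_trials

# Exact probability: (1/6)*(1/6) = 1/36 ≈ 0.0278
print(fraction_double_ones(100_000))
```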

12 “The Random Walk Problem”
Note that this probability depends strongly on the type of ensemble to which we are referring. See Reif’s flower seed example (p. 5). To quantitatively introduce probability concepts, we use a specific, simple example, which is actually much more general than you first might think. The example is called “The Random Walk Problem”

13 The 1-Dimensional Random Walk
"The most important questions of life are indeed, for the most part, really only problems of probability." Pierre Simon Laplace, "Théorie Analytique des Probabilités", 1812

14 The One-Dimensional Random Walk
In its simplest, crudest, most idealized form, the random walk problem can be viewed as in the figure. The usual story is of a drunk who starts out from a lamp post on a street. Obviously, he wants to move down the sidewalk to get somewhere!!

15 So the drunk starts out from a lamp post on a street.
Each step he takes is of equal length ℓ. He is SO DRUNK, that the direction of each step (right or left) is completely independent of the preceding step. The (assumed known) probability of stepping to the right is p & of stepping to the left is q = 1 – p. In general, q ≠ p. The x axis is along the sidewalk, the lamp post is at x = 0. Each step is of length ℓ, so his location on the x axis must be x = mℓ where m = a positive or negative integer.

16 Question: After N steps, what is the probability that the man is at a specific location x = mℓ (m specified)? To answer, we first consider an ensemble of a large number of drunk men starting from similar lamp posts!! Or repeat this with the same drunk man walking on the sidewalk a large number of times!!
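This ensemble prescription can be sketched directly in Python (an illustration, not from the original slides; function names and the walker count are my choices): simulate many independent walkers and count the fraction ending at x = mℓ.

```python
import random

def walk_position(n_steps, p=0.5, step=1.0, rng=None):
    """Final position x = m*l of one 1-D walker after n_steps:
    each step is to the right with probability p, left with q = 1 - p."""
    rng = rng or random
    m = sum(1 if rng.random() < p else -1 for _ in range(n_steps))
    return m * step

def prob_at(m_target, n_steps, n_walkers=200_000, p=0.5, seed=0):
    """Ensemble estimate of P_N(m): the fraction of n_walkers
    similarly prepared walkers that end at x = m_target * l."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_walkers)
               if walk_position(n_steps, p, 1.0, rng) == m_target)
    return hits / n_walkers

# For N = 4, p = 1/2, the exact result is P_4(0) = C(4,2)(1/2)^4 = 6/16 = 0.375
print(prob_at(0, 4))
```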

17 The 2-Dimensional Random Walk
This is "easily generalized" to 2 dimensions, as shown schematically in the figure. The 2-dimensional random walk corresponds to a PHYSICS problem of adding N 2-dimensional vectors of equal length (figure) & random directions & asking: "What is the probability that the resultant has a certain magnitude & a certain direction?"

18 Physical Examples to which the Random Walk Problem applies:
1. Magnetism (Quantum Treatment) N atoms, each with magnetic moment μ. Each has spin ½. By Quantum Mechanics, each magnetic moment can point either “up” or “down”. If these are equally likely, what is the Net magnetic moment of the N atoms?

19 Physical Examples to which the Random Walk Problem applies:
2. Diffusion of a Molecule of Gas (Classical Treatment) A molecule travels in 3 dimensions with a mean distance ℓ between collisions. How far is it likely to have traveled after N collisions? Answer using Classical Mechanics.

20 The Random Walk Problem:
Is a simple example which illustrates some fundamental results of Probability Theory. The techniques used are Powerful & General. They are used repeatedly throughout Statistical Mechanics. So, it’s very important to spend some time on this problem & to understand it!

21 Section 1.2: 1-Dimensional Random Walk
Forget the drunk, let's get back to Physics! Think of a particle moving in 1 dimension in steps of length ℓ, with 1. Probability p of stepping to the right & 2. Probability q = 1 – p of stepping to the left. After N steps, the particle is at position: x = mℓ (-N ≤ m ≤ N). Let n1 ≡ # of steps to the right (out of N). Let n2 ≡ # of steps to the left.

22 After N steps, x = mℓ (-N ≤ m ≤ N).
Let n1 ≡ # of steps to the right (out of N). Let n2 ≡ # of steps to the left. Clearly,  N = n1 + n2 (1) Clearly also, x ≡ mℓ = (n1 – n2)ℓ or, m = n1 – n2 (2) Combining (1) & (2) gives:  m = 2n1 – N (3) If N is odd, so is m. If N is even, so is m.

23 A Fundamental Assumption is that Successive Steps are
Statistically Independent Let p ≡ the probability of stepping to the right & q = 1 – p ≡ the probability of stepping to the left. Since each step is statistically independent, the probability of a given sequence of n1 steps to the right followed by n2 steps to the left is given by multiplying the respective probabilities for each step:

24 The Probability of a Given Sequence of Steps
p ≡ probability of stepping to the right & q = 1 – p ≡ probability of stepping to the left. Since each step is statistically independent, the probability of a given sequence of n1 steps to the right followed by n2 steps to the left is given by multiplying the respective probabilities for each step: p·p·p· ··· ·p (n1 factors) × q·q·q· ··· ·q (n2 factors) ≡ p^n1 q^n2. But, also, clearly, there are MANY different possible ways of taking N steps so that n1 are to the right & n2 are to the left!

25 Counting the Arrangements
The # of distinct possibilities is the SAME as counting the # of distinct ways we can place N objects, n1 of one type & n2 of another, in N = n1 + n2 places: 1st place: can be occupied any one of N ways. 2nd place: can be occupied any one of N – 1 ways. 3rd place: can be occupied any one of N – 2 ways. ... (N – 1)th place: can be occupied only 2 ways. Nth place: can be occupied only 1 way.

26 N! Ways to Fill the Places
 So, all available places can be occupied in: N(N–1)(N–2)(N–3)(N–4) ······ (3)(2)(1) ≡ N! ways. Here, N! ≡ "N factorial".

27 Note However!
This analysis doesn't take into account the fact that there are only 2 distinguishable kinds of objects: n1 of the 1st type & n2 of the 2nd type. All n1! possible permutations of the objects of the 1st type among themselves lead to exactly the same arrangement, & similarly, all n2! possible permutations of the objects of the 2nd type also lead to the same arrangement.  So, we need to divide the N! result by n1!n2!

28  So, the # of distinct ways in which N objects can be arranged with n1 of the 1st type & n2 of the 2nd type is ≡ N!/(n1!n2!) This is the same as the # of distinct ways of taking N steps, with n1 to the right & n2 to the left.

29 Summary: The Binomial Probability
The probability WN(n1) of taking N steps, n1 to the right & n2 (= N – n1) to the left, is: WN(n1) = [N!/(n1!n2!)] p^n1 q^n2 or WN(n1) = [N!/{n1!(N – n1)!}] p^n1 (1 – p)^(N – n1). Often this is written with the binomial coefficient C(N, n1) ≡ N!/[n1!(N – n1)!]: WN(n1) = C(N, n1) p^n1 q^n2. Remember that q = 1 – p.
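The formula for WN(n1) is straightforward to compute; a minimal sketch (the function name is mine; `math.comb` supplies the binomial coefficient N!/[n1!(N – n1)!]), which also checks the normalization Σ WN(n1) = (p + q)^N = 1:

```python
from math import comb

def W(N, n1, p=0.5):
    """Binomial probability W_N(n1) = C(N, n1) p^n1 q^(N - n1), q = 1 - p."""
    return comb(N, n1) * p**n1 * (1 - p)**(N - n1)

# Normalization: summing over all n1 gives (p + q)^N = 1, even for p != 1/2.
assert abs(sum(W(10, n, 0.3) for n in range(11)) - 1.0) < 1e-12

print(W(3, 2))  # [3!/(2!1!)] (1/2)^3 = 3/8 = 0.375
```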

30 (p + q)N = ∑(n1 = 0N)[N!/[n!(N–n1)!]pn1qn2
WN(n1) = [N!/{n1!(N – n1)!]}pn1(1-p)n2 Often, this is written as WN(n1)  N pn1qn2 n1 This probability distribution is called the Binomial Distribution. This is because the Binomial Expansion has the form (p + q)N = ∑(n1 = 0N)[N!/[n!(N–n1)!]pn1qn2 q = 1- p

31 The Probability PN(m)
We really want the probability PN(m) that x = mℓ after N steps. This is really the same as WN(n1) if we change notation: PN(m) = WN(n1). But m = 2n1 – N, so n1 = (½)(N + m) & n2 = N – n1 = (½)(N – m). So the probability PN(m) that x = mℓ after N steps is: PN(m) = {N!/([(½)(N + m)]![(½)(N – m)]!)} p^[(½)(N + m)] (1 – p)^[(½)(N – m)]. For the common case of p = q = ½, this is: PN(m) = {N!/([(½)(N + m)]![(½)(N – m)]!)} (½)^N

32 Binomial Distribution
The probability PN(m) that x = mℓ after N steps is: PN(m) = {N!/([(½)(N + m)]![(½)(N – m)]!)} p^[(½)(N + m)] (1 – p)^[(½)(N – m)]. For the common case of p = q = ½, this is: PN(m) = {N!/([(½)(N + m)]![(½)(N – m)]!)} (½)^N. This is the usual form of the Binomial Distribution, which is probably the most elementary (discrete) probability distribution.

33 Example: N = 3, p = q = ½
As a trivial example, suppose that p = q = ½, N = 3 steps. This gives: P3(m) = {3!/([(½)(3 + m)]![(½)(3 – m)]!)} (½)^3. So P3(3) = P3(-3) = {3!/[3!0!]}(⅛) = ⅛ & P3(1) = P3(-1) = {3!/[2!1!]}(⅛) = ⅜.
Table of Possible Step Sequences:
n1  n2  m = n1 – n2
3   0   3
2   1   1
1   2   -1
0   3   -3
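The N = 3 values above are quick to verify in code. This sketch (function name mine) evaluates PN(m) by converting m to n1 = (½)(N + m) and reusing the binomial formula; it also returns 0 when N and m have opposite parity, since those positions are unreachable:

```python
from math import comb

def P(N, m, p=0.5):
    """P_N(m) = W_N(n1) with n1 = (N + m)/2; zero if N, m differ in parity."""
    if (N + m) % 2:
        return 0.0
    n1 = (N + m) // 2
    return comb(N, n1) * p**n1 * (1 - p)**(N - n1)

print(P(3, 3), P(3, 1))  # 0.125 0.375, i.e. 1/8 and 3/8 as on the slide
```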

34 Example: N = 20, p = q = ½
Another example: let p = q = ½, N = 20. This gives: P20(m) = {20!/([(½)(20 + m)]![(½)(20 – m)]!)} (½)^20. Calculation gives the histogram results in the figure: P20(20) = [20!/(20!0!)](½)^20 ≈ 9.5 × 10^-7 & P20(0) = [20!/(10!)²](½)^20 ≈ 1.8 × 10^-1.

35 The Bell-Shaped Envelope
For this same case: P20(m) = {20!/([(½)(20 + m)]![(½)(20 – m)]!)} (½)^20. Notice: the "envelope" of the histogram is a "bell-shaped" curve. The significance of this is that, after N random steps, the probability of a particle being a distance of N steps away from the start is very small, & the probability of it being at or near the origin is relatively large: P20(20) = [20!/(20!0!)](½)^20 ≈ 9.5 × 10^-7, P20(0) = [20!/(10!)²](½)^20 ≈ 1.8 × 10^-1.
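The two N = 20 endpoints quoted above can be reproduced with a few lines (a sketch, not from the slides; `math.comb` gives the binomial coefficient):

```python
from math import comb

def P(N, m):
    """P_N(m) for p = q = 1/2: C(N, (N + m)/2) (1/2)^N."""
    n1 = (N + m) // 2
    return comb(N, n1) * 0.5**N

print(f"P20(20) = {P(20, 20):.2e}")  # ~ 9.5e-07: far endpoint, very unlikely
print(f"P20(0)  = {P(20, 0):.3f}")   # ~ 0.176: near the origin, most likely
```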

36 Laplace on Probability
"It is remarkable that a science which began with the consideration of games of chance should have become the most important object of human knowledge." Pierre Simon Laplace, "Théorie Analytique des Probabilités", 1812

