Introduction to Statistical Methods


Chapter 1: Introduction to Statistical Methods

Chapter 1
Some pure math discussion for a while! The math of probability & statistics.
"The true logic of this world is in the calculus of probabilities." – James Clerk Maxwell

Relevance of Probability to Physics
In this course, we'll discuss the Physics of systems containing HUGE numbers (>> 10²³) of particles: solids, liquids, gases, EM radiation (photons & other quantum particles), ...
The Challenge: Describe a system's Macroscopic characteristics starting from a Microscopic theory.

GOAL: Formulate a theory to describe a system's Macroscopic characteristics starting from a Microscopic theory.
Classical Mechanics: Newton's Laws. We'd need to solve >> 10²³ coupled Newton's 2nd Law differential equations of motion! (ABSURD!!)
Quantum Mechanics: Schrödinger's Equation. We'd need a solution for >> 10²³ particles! (ABSURD!!)

Historically, this led to the use of a Statistical description of such systems. So, we'll talk about Probabilities & Average System Properties. We are NOT concerned with the detailed behavior of individual particles.

Definitions:
Microscopic: ~ atomic dimensions, ≤ a few Å.
Macroscopic: large enough to be "visible" in the "ordinary" sense.

Definitions: An Isolated System is in Equilibrium when its Macroscopic parameters are time-independent. This is the usual case in this course! But note: even if its Macroscopic parameters are time-independent, a system's Microscopic parameters can & probably will still vary with time!

Now, some Basic (Simple) Math of Probability & Statistics

Section 1.1: Elementary Statistical Concepts & Examples
Random Walk → Binomial Distribution
Math preliminaries (methods) for the next few lectures. To treat statistical physics problems, we must first know something about the mathematics of Probability & Statistics. The following should hopefully be a review! (?)
Keep in mind: Whenever we want to describe a situation using probability & statistics, we must consider an assembly of a large number N (in principle, N → ∞) of "similarly prepared systems".

This assembly is called an ENSEMBLE (“Ensemble” = French word for Assembly). The Probability of an occurrence of a particular event is DEFINED with respect to this particular ensemble & is given by the fraction of systems in the ensemble characterized by the occurrence of this event.

Example: In throwing a pair of dice, we can give a statistical description by considering that a very large number N of similar pairs of dice are thrown under similar circumstances. Alternatively, we could imagine the same pair of dice thrown N times under similar circumstances. The probability of obtaining two 1's is then given by the fraction of these experiments in which two 1's are the outcome.

Note that this probability depends strongly on the type of ensemble to which we are referring. See Reif's flower seed example (p. 5). To introduce probability concepts quantitatively, we use a specific, simple example, which is actually much more general than you might first think. The example is called "The Random Walk Problem".

The 1-Dimensional Random Walk
"The most important questions of life are indeed, for the most part, really only problems of probability." – Pierre Simon Laplace, "Théorie Analytique des Probabilités", 1812

The One-Dimensional Random Walk
In its simplest, crudest, most idealized form, the random walk problem can be viewed as in the figure. The usual story is of a drunk who starts out from a lamp post on a street. Obviously, he wants to move down the sidewalk to get somewhere!!

So the drunk starts out from a lamp post on a street. Each step he takes is of equal length ℓ. He is SO DRUNK that the direction of each step (right or left) is completely independent of the preceding step. The (assumed known) probability of stepping to the right is p & of stepping to the left is q = 1 – p. In general, q ≠ p. The x axis is along the sidewalk & the lamp post is at x = 0. Each step is of length ℓ, so his location on the x axis must be x = mℓ, where m is a positive or negative integer.

Question: After N steps, what is the probability that the man is at a specific location x = mℓ (m specified)? To answer, we first consider an ensemble of a large number of drunk men starting from similar lamp posts!! Or repeat this with the same drunk man walking on the sidewalk a large number of times!!
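The ensemble definition of probability suggests a direct numerical experiment: simulate many independent N-step walks & count the fraction that end at x = mℓ. Here is a minimal Python sketch (illustrative, not from the slides; the function name & parameters are my own):

```python
import random

def random_walk_probability(N, m, p=0.5, trials=100_000, seed=1):
    """Estimate P_N(m): the fraction of an ensemble of `trials`
    independent N-step walks (step +1 with probability p, else -1)
    that end at position m."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # One walker: sum of N independent +1/-1 steps.
        position = sum(1 if rng.random() < p else -1 for _ in range(N))
        if position == m:
            hits += 1
    return hits / trials

# Symmetric walk, N = 20: ending near the origin is far more likely
# than ending all 20 steps away.
print(random_walk_probability(20, 0))   # roughly 0.18
print(random_walk_probability(20, 20))  # essentially 0
```

With 100,000 trials the estimate fluctuates around the exact binomial value; the later slides compute that value in closed form.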

This is "easily generalized" to 2 dimensions, as shown schematically in the figure. The 2-dimensional random walk corresponds to a PHYSICS problem of adding N 2-dimensional vectors of equal length (figure) & random directions & asking: "What is the probability that the resultant has a certain magnitude & a certain direction?"

Physical Examples to which the Random Walk Problem applies: 1. Magnetism (Quantum Treatment) N atoms, each with magnetic moment μ. Each has spin ½. By Quantum Mechanics, each magnetic moment can point either “up” or “down”. If these are equally likely, what is the Net magnetic moment of the N atoms?

Physical Examples to which the Random Walk Problem applies: 2. Diffusion of a Molecule of Gas (Classical Treatment) A molecule travels in 3 dimensions with a mean distance ℓ between collisions. How far is it likely to have traveled after N collisions? Answer using Classical Mechanics.

The Random Walk Problem: Is a simple example which illustrates some fundamental results of Probability Theory. The techniques used are Powerful & General. They are used repeatedly throughout Statistical Mechanics. So, it’s very important to spend some time on this problem & to understand it!

Section 1.2: The 1-Dimensional Random Walk
Forget the drunk; let's get back to Physics! Think of a particle moving in 1 dimension in steps of length ℓ, with: 1. Probability p of stepping to the right & 2. Probability q = 1 – p of stepping to the left. After N steps, the particle is at position x = mℓ (-N ≤ m ≤ N). Let n1 ≡ # of steps to the right (out of N) & n2 ≡ # of steps to the left.

After N steps, x = mℓ (-N ≤ m ≤ N). Let n1 ≡ # of steps to the right (out of N) & n2 ≡ # of steps to the left. Clearly, N = n1 + n2 (1). Clearly also, x ≡ mℓ = (n1 – n2)ℓ, or m = n1 – n2 (2). Combining (1) & (2) gives m = 2n1 – N (3). If N is odd, so is m; if N is even, so is m.

A Fundamental Assumption is that Successive Steps are Statistically Independent Let p ≡ the probability of stepping to the right & q = 1 – p ≡ the probability of stepping to the left. Since each step is statistically independent, the probability of a given sequence of n1 steps to the right followed by n2 steps to the left is given by multiplying the respective probabilities for each step:

p ≡ probability of stepping to the right; q = 1 – p ≡ probability of stepping to the left. Since each step is statistically independent, the probability of a given sequence of n1 steps to the right followed by n2 steps to the left is the product of the individual step probabilities:
p·p·p· ··· ·p (n1 factors) × q·q·q· ··· ·q (n2 factors) ≡ p^(n1) q^(n2)
But, also, clearly, there are MANY different possible ways of taking N steps so that n1 are to the right & n2 are to the left!

The # of distinct possibilities is the SAME as the # of distinct ways we can place N objects, n1 of one type & n2 of another, in N = n1 + n2 places:
1st place: can be occupied in any one of N ways
2nd place: can be occupied in any one of N – 1 ways
3rd place: can be occupied in any one of N – 2 ways
…
(N – 1)th place: can be occupied in only 2 ways
Nth place: can be occupied in only 1 way

So, all available places can be occupied in N(N-1)(N-2)(N-3) ··· (3)(2)(1) ≡ N! ways. Here, N! ≡ "N-Factorial".

Note, however: this analysis doesn't take into account the fact that there are only 2 distinguishable kinds of objects: n1 of the 1st type & n2 of the 2nd type. All n1! possible permutations of the 1st type of object among themselves lead to exactly the same arrangement of the N objects. Similarly, all n2! possible permutations of the 2nd type of object among themselves also lead to the same arrangement. So, we need to divide the N! result by n1!n2!

So, the # of distinct ways in which N objects can be arranged, with n1 of the 1st type & n2 of the 2nd type, is N!/(n1!n2!). This is the same as the # of distinct ways of taking N steps, with n1 to the right & n2 to the left.
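The counting result can be sanity-checked by brute force: enumerate all N-step sequences & count those with exactly n1 right-steps. An illustrative Python check (small N only, since the enumeration grows as 2^N):

```python
from itertools import product
from math import factorial, comb

def distinct_sequences(N, n1):
    """Brute-force count of N-step sequences with exactly n1 right-steps."""
    return sum(1 for seq in product("RL", repeat=N) if seq.count("R") == n1)

N, n1 = 6, 2
n2 = N - n1
# The enumeration agrees with N!/(n1! n2!), i.e. the binomial coefficient.
assert distinct_sequences(N, n1) == factorial(N) // (factorial(n1) * factorial(n2)) == comb(N, n1)
print(distinct_sequences(N, n1))  # 15
```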

Summary: The probability W_N(n1) of taking N steps, n1 to the right & n2 (= N – n1) to the left, is
W_N(n1) = [N!/(n1!n2!)] p^(n1) q^(n2)
or
W_N(n1) = [N!/{n1!(N – n1)!}] p^(n1) (1 – p)^(N – n1)
Often this is written using the binomial coefficient (N choose n1):
W_N(n1) = (N choose n1) p^(n1) q^(n2)
Remember that q = 1 – p.

This probability distribution is called the Binomial Distribution, because the Binomial Expansion has the form
(p + q)^N = Σ (n1 = 0 to N) [N!/{n1!(N – n1)!}] p^(n1) q^(N – n1)
Since q = 1 – p, the left-hand side equals 1^N = 1, so the W_N(n1) are properly normalized.
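In code, the distribution & its normalization can be checked with the standard library's math.comb; this is an illustrative sketch, not part of the slides:

```python
from math import comb

def W(N, n1, p):
    """Binomial probability W_N(n1) of n1 right-steps out of N."""
    return comb(N, n1) * p**n1 * (1 - p)**(N - n1)

# Normalization: summing over all n1 reproduces (p + q)^N = 1^N = 1.
N, p = 20, 0.3
total = sum(W(N, n1, p) for n1 in range(N + 1))
print(abs(total - 1.0) < 1e-12)  # True
```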

We really want the probability P_N(m) that x = mℓ after N steps. This is the same as W_N(n1) with a change of notation: P_N(m) = W_N(n1). But m = 2n1 – N, so n1 = (½)(N + m) & n2 = N – n1 = (½)(N – m). So the probability P_N(m) that x = mℓ after N steps is:
P_N(m) = {N!/([(½)(N + m)]! [(½)(N – m)]!)} p^((½)(N + m)) (1 – p)^((½)(N – m))

Binomial Distribution
The probability P_N(m) that x = mℓ after N steps is:
P_N(m) = {N!/([(½)(N + m)]! [(½)(N – m)]!)} p^((½)(N + m)) (1 – p)^((½)(N – m))
For the common case of p = q = ½, this is:
P_N(m) = {N!/([(½)(N + m)]! [(½)(N – m)]!)} (½)^N
This is the usual form of the Binomial Distribution, probably the most elementary (discrete) probability distribution.

As a trivial example, suppose p = q = ½ and N = 3 steps. This gives
P_3(m) = {3!/([(½)(3 + m)]! [(½)(3 – m)]!)} (½)^3
So P_3(3) = P_3(-3) = {3!/(3!0!)}(⅛) = ⅛
P_3(1) = P_3(-1) = {3!/(2!1!)}(⅛) = ⅜
Table of Possible Step Sequences:
n1 = 3, n2 = 0 → m = 3
n1 = 2, n2 = 1 → m = 1
n1 = 1, n2 = 2 → m = -1
n1 = 0, n2 = 3 → m = -3
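These N = 3 values can be confirmed exactly with rational arithmetic; an illustrative check:

```python
from fractions import Fraction
from math import comb

def P(N, m):
    """P_N(m) for the symmetric walk p = q = 1/2.
    Zero unless |m| <= N and m has the same parity as N."""
    if (N + m) % 2 or abs(m) > N:
        return Fraction(0)
    n1 = (N + m) // 2          # n1 = (1/2)(N + m)
    return Fraction(comb(N, n1), 2**N)

print(P(3, 3), P(3, 1), P(3, -1), P(3, -3))  # 1/8 3/8 3/8 1/8
```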

Another example: let p = q = ½ and N = 20. This gives:
P_20(m) = {20!/([(½)(20 + m)]! [(½)(20 – m)]!)} (½)^20
Calculation gives the histogram results in the figure.
P_20(20) = [20!/(20!0!)](½)^20 ≈ 9.5 × 10^-7
P_20(0) = [20!/(10!)²](½)^20 ≈ 1.8 × 10^-1

For this same case, notice: the "envelope" of the histogram is a "bell-shaped" curve. The significance of this is that, after N random steps, the probability of the particle being a distance of N steps away from the start is very small (P_20(20) ≈ 9.5 × 10^-7), while the probability of it being at or near the origin is relatively large (P_20(0) ≈ 1.8 × 10^-1).
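The two quoted N = 20 values follow directly from the formula; an illustrative check:

```python
from math import comb

def P20(m):
    """Symmetric-walk probability P_20(m) with p = q = 1/2."""
    if (20 + m) % 2 or abs(m) > 20:
        return 0.0             # m must share N's parity and satisfy |m| <= N
    return comb(20, (20 + m) // 2) / 2**20

print(f"P_20(20) = {P20(20):.2e}")  # 9.54e-07
print(f"P_20(0)  = {P20(0):.3f}")   # 0.176
# The bell-shaped envelope: P20(m) falls off rapidly away from m = 0.
```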

"It is remarkable that a science which began with the consideration of games of chance should have become the most important object of human knowledge." – Pierre Simon Laplace, "Théorie Analytique des Probabilités", 1812