SOME BASIC IDEAS OF STATISTICAL PHYSICS Mr. Anil Kumar Associate Professor Physics Department Govt. College for Girls, Sector -11, Chandigarh.


Introduction  Statistics is a branch of science which deals with the collection, classification and interpretation of numerical facts or data. The aim of this science is to bring out a certain order in the data by the use of the laws of probability. The application of statistical concepts to Physics has given rise to a new branch of physics known as Statistical Physics.  Statistical Physics establishes the relation between the macroscopic behaviour (bulk properties) of a system and its microscopic behaviour (individual properties). It is not concerned with the actual motions or interactions of the individual particles but explores the most probable behaviour of an assembly of particles.

Probability  If a is the number of cases in which an event occurs and b the number of cases in which it fails, then
Probability of occurrence of the event = a/(a + b)
Probability of failure of the event = b/(a + b)
The sum of these two probabilities, i.e. the total probability, is always one, since the event must either occur or fail.

Some Probability Considerations  Tossing of a coin : If we toss a coin, either the head or the tail comes upward, i.e. P_H = P_T = 1/2.  Tossing of two coins : The following combinations of Heads (H) and Tails (T) facing upwards are possible : HH, HT, TH, TT. The chance of any one of them taking place (say that of HH) is P_HH = 1/4 = P_H · P_H.

Independent events Two or more events are said to be independent if the occurrence of one is not influenced by the occurrence of the others. Consider two independent events which occur simultaneously or in succession. Let P_1 and P_2 be the probabilities of the individual events. The probability of occurrence of the composite event is P = P_1 × P_2. Similarly, for n independent events to take place together the probability is P = P_1 · P_2 … P_n. This is known as the multiplicative law of probability.

Principle of equal a priori probability  The principle of assuming equal probability for events which are equally likely is known as the principle of equal a priori probability.  A priori really means something which exists in our mind prior to and independently of the observation we are going to make.

Distribution of 4 different Particles in two Compartments of equal size Both the compartments are exactly alike. The particles are distinguishable from one another. Let the four particles be called a, b, c and d. The total number of particles in the two compartments is 4, i.e. n_1 + n_2 = 4. The meaningful ways in which these four particles can be distributed between the two compartments are shown in the table.

Contd..
Macrostate (n_1, n_2) — Microstates (compartment 1 | compartment 2) — No. of microstates
(4, 0) — abcd | – — 1
(3, 1) — abc | d, abd | c, acd | b, bcd | a — 4
(2, 2) — ab | cd, ac | bd, ad | bc, bc | ad, bd | ac, cd | ab — 6
(1, 3) — a | bcd, b | acd, c | abd, d | abc — 4
(0, 4) — – | abcd — 1
Total : 16 microstates in 5 macrostates.

Microstate and Macrostate  Microstate The distinct arrangement of the particles of a system is called its microstate. For example, if four distinguishable particles are distributed in two compartments, then sixteen microstates are possible. If n particles are to be distributed in 2 compartments, the number of microstates is 2^n.  Macrostate The arrangement of the particles of a system without distinguishing them from one another is called a macrostate of the system. For example, if four particles are to be distributed in two compartments without distinguishing among the particles, then there are five possible macrostates. If n particles are to be distributed in 2 compartments, the number of macrostates is (n + 1).
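Both counts can be checked by brute force; a minimal Python sketch (particle and compartment labels are illustrative):

```python
from itertools import product
from collections import Counter

particles = "abcd"
# A microstate assigns each distinguishable particle to compartment 1 or 2.
microstates = list(product((1, 2), repeat=len(particles)))
print(len(microstates))   # 2**4 = 16 microstates

# A macrostate only records how many particles sit in compartment 1.
macrostates = Counter(state.count(1) for state in microstates)
print(len(macrostates))   # n + 1 = 5 macrostates
```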

 Thermodynamic probability or frequency The number of microstates in a given macrostate is called the thermodynamic probability or frequency of that macrostate. For the distribution of 4 particles in 2 identical compartments: W(4, 0) = 1, W(3, 1) = 4, W(2, 2) = 6, W(1, 3) = 4, W(0, 4) = 1. W depends upon the distinguishable or indistinguishable nature of the particles. For indistinguishable particles, W = 1 for every macrostate.

 It must be noted that  All the microstates of a system have equal a priori probability.  Probability of a microstate, p = 1 / (total number of microstates) = 1/2^n.  Probability of a macrostate = (number of microstates in that macrostate) × (probability of one microstate), i.e. P = W × p = (thermodynamic probability) × (probability of each microstate).
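For the four-particle example, these probabilities can be tabulated with binomial coefficients; a sketch (variable names are illustrative):

```python
from math import comb

n = 4
# W(n1, n2) = number of microstates in macrostate (n1, n2) = C(n, n1).
W = {(n1, n - n1): comb(n, n1) for n1 in range(n + 1)}
p = 1 / 2**n                      # equal a priori probability of one microstate
P = {macro: w * p for macro, w in W.items()}
print(P[(2, 2)])                  # 6/16 = 0.375
print(sum(P.values()))            # total probability = 1.0
```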

Constraints  The set of conditions that must be obeyed by a system is called the constraints on the system.  All those macrostates/microstates which are allowed by the constraints on the system are known as accessible macrostates/microstates, and the macrostates/microstates forbidden on account of the constraints are known as inaccessible macrostates/microstates.  The constraints on the system play an important role in determining the number of accessible macrostates/microstates. The greater the number of constraints, the smaller the number of accessible microstates.

Distribution of n Particles in 2 Compartments The various macrostates (distributions) of the system are : (0, n), (1, n − 1), (2, n − 2), …, (n, 0), i.e., (n + 1) in number. Out of these macrostates, let us consider a particular macrostate (n_1, n_2) such that n_1 + n_2 = n. The n particles can be arranged among themselves in a total number of ways nP_n = n! These arrangements include meaningful as well as meaningless arrangements. Total number of ways = (Number of meaningful arrangements) × (Number of meaningless arrangements)

The n_1 particles in compartment 1 and n_2 particles in compartment 2 can be arranged in their respective compartments in n_1! × n_2! meaningless ways. The number of meaningful arrangements (i.e., the number of microstates) in the macrostate (n_1, n_2) is therefore W(n_1, n_2) = n! / (n_1! n_2!). The total number of microstates = Σ W(n_1, n_2) = 2^n.
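The formula W(n_1, n_2) = n!/(n_1! n_2!) can be checked against direct enumeration; a small sketch (the macrostate (3, 1) is chosen only for illustration):

```python
from math import factorial
from itertools import product
from collections import Counter

n1, n2 = 3, 1
n = n1 + n2
# Meaningful arrangements = total permutations / meaningless in-compartment ones.
W_formula = factorial(n) // (factorial(n1) * factorial(n2))

# Brute force: count microstates with exactly n1 particles in compartment 1.
counts = Counter(s.count(1) for s in product((1, 2), repeat=n))
print(W_formula, counts[n1])   # both 4
```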

Deviation from the state of Maximum probability When n particles are distributed in two compartments, the number of macrostates comes out to be (n + 1). The macrostate (r, n − r) is of maximum probability if r = n/2, provided n is even. The probability of the macrostate (r, n − r) is P(r, n − r) = n! / (r! (n − r)! 2^n). The probability of the most probable macrostate is P_max = n! / [(n/2)!]^2 × 1/2^n.

The probability of a macrostate (n/2 + x, n/2 − x), with x << n, slightly different from the most probable state is P_x = P(n/2 + x, n/2 − x) ….(1) Using Stirling's formula, ln n! = n ln n − n, and Taylor's theorem, ln(1 + y) = y − y²/2 + …, valid for |y| < 1, (1) on simplification gives P_x = P_max e^(−2x²/n) = P_max e^(−n f²/2), where f = x/(n/2) = 2x/n is the fractional deviation from the most probable number of particles in a compartment.
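The Stirling-formula approximation can be compared with the exact binomial ratio; a sketch (the values n = 1000, x = 50 are chosen only for illustration):

```python
from math import comb, exp

n = 1000
x = 50                         # deviation from n/2
f = 2 * x / n                  # fractional deviation, here 0.1
exact = comb(n, n // 2 + x) / comb(n, n // 2)
approx = exp(-2 * x**2 / n)    # = exp(-n * f**2 / 2)
print(exact, approx)           # both close to e**-5, within a few per cent
```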

Discussion For f = 10^−3, P_x/P_max = e^(−n × 10^−6 / 2). The ratios for different values of n are:
n = 10^6 : P_x/P_max = e^(−0.5) ≈ 0.61
n = 10^8 : P_x/P_max = e^(−50) ≈ 2 × 10^−22
n = 10^10 : P_x/P_max = e^(−5000) ≈ 0
It is apparent that, as n increases, the probability of a 0.1% deviation from the most probable state decreases very rapidly.
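Ratios of this kind can be regenerated directly from P_x/P_max = e^(−n f²/2); a sketch:

```python
from math import exp

f = 1e-3                                  # 0.1% fractional deviation
ratios = {n: exp(-n * f**2 / 2) for n in (10**6, 10**8, 10**10)}
for n, r in ratios.items():
    print(n, r)
# the ratio falls from about 0.61 at n = 10**6 to essentially 0 at n = 10**10
```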

Thus, as n increases, the probability of a macrostate falls off more and more rapidly even for a slight deviation from the most probable macrostate. The probability distribution curve (P_x/P_max versus f) becomes narrower and narrower as n increases (figure). When n is very large, macrostates deviating even by a very slight amount from the most probable macrostate become extremely improbable, and the system may be expected to exist practically in the most probable macrostate. Therefore, the properties of the system will be the same as those deduced from the most probable state. [Figure: P_x/P_max versus f = 2x/n for three values n_1 > n_2 > n_3; the curve sharpens about f = 0 as n grows.]

Static and Dynamic systems Static systems: If the constituent particles of a system remain at rest in a particular microstate, it is called a static system. Dynamic systems: If the constituent particles of a system can move, so that the system passes from one microstate to another, it is called a dynamic system.

Time spent by a dynamic system in a Particular Macrostate A dynamic system continuously changes from one microstate to another. All microstates of a system have equal a priori probability. Therefore, the system should spend the same amount of time in each microstate. Let t_0 be the time of observation. Then, on the average, we can assume that t_0 ∝ total number of microstates of the system, or t_0 ∝ N (say). Let t be the time spent by the system in a particular macrostate. Then t ∝ number of microstates in the macrostate. But the number of microstates = thermodynamic probability W of the macrostate, so t ∝ W, and t/t_0 = W/N = P, the probability of the macrostate. That is, the fraction of the time spent by a dynamic system in a macrostate is equal to the probability of that macrostate.
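This can be illustrated with a toy dynamic system that hops to a random microstate at each time step; a sketch, assuming 4 particles in 2 compartments (step count and seed are arbitrary):

```python
import random
from math import comb

random.seed(1)
n, steps = 4, 100_000
target = 2                      # watch the macrostate (2, 2)
hits = sum(
    1 for _ in range(steps)
    if sum(random.randint(0, 1) for _ in range(n)) == target
)
fraction = hits / steps
print(fraction)                 # close to W(2,2)/2**4 = 6/16 = 0.375
```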

Equilibrium state of a dynamic system The macrostate having maximum probability is termed the most probable state. For a dynamic system consisting of a large number of particles, the probability of deviation from the most probable state decreases very rapidly, so the system spends the majority of its time in the most probable state. If the system is disturbed, it again tends to return towards the most probable state, because the probability of staying in the disturbed state is very small. Thus, the most probable state behaves as the equilibrium state to which the system returns again and again.

Distribution of n Distinguishable Particles in k Compartments of unequal sizes  For the macrostate (n_1, n_2, …, n_k) the thermodynamic probability is W = n! / (n_1! n_2! … n_k!).  If the space is divided into cells, let g_1, g_2, …, g_k be the number of cells in compartments 1, 2, …, k respectively. The thermodynamic probability for the macrostate (n_1, n_2, …, n_k) is then W = [n! / (n_1! n_2! … n_k!)] × g_1^(n_1) g_2^(n_2) … g_k^(n_k).
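A small helper makes the two formulas concrete; a sketch (the function name and example macrostates are illustrative):

```python
from math import factorial, prod

def thermo_prob(ns, gs=None):
    """W = n!/(n1!...nk!) * g1**n1 ... gk**nk for macrostate ns."""
    n = sum(ns)
    W = factorial(n) // prod(factorial(ni) for ni in ns)
    if gs is not None:          # unequal compartments with gs[i] cells each
        W *= prod(g**ni for g, ni in zip(gs, ns))
    return W

print(thermo_prob([2, 2]))          # 4 particles, 2 equal compartments: 6
print(thermo_prob([2, 1], [2, 3]))  # 3!/(2! 1!) * 2**2 * 3**1 = 36
```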

Thanks