
Statistics

Large Systems

Macroscopic systems involve large numbers of particles.
- Microscopic determinism → macroscopic phenomena
- The basis is in the mechanics of individual molecules, classical and quantum.
- Statistical thermodynamics provides the bridge between the two levels.
Consider 1 g of He as an ideal gas.
- N = 1.5 × 10^23 atoms
- Use only position and momentum: 6 coordinates per atom
- Total 9 × 10^23 variables, requiring about 4 × 10^9 PB of storage
Find the total kinetic energy.
- K = Σ (p_x² + p_y² + p_z²) / 2m
- About 100 ops per collision
- At 100 GFlops: 9 × 10^14 s, i.e. one set of collisions in 3 × 10^7 yr
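These estimates can be reproduced with quick back-of-envelope arithmetic. A minimal sketch; single-precision (4-byte) storage and 100 operations per variable per collision sweep are assumptions used to recover the slide's figures, not statements from the lecture:

```python
AVOGADRO = 6.022e23
N = AVOGADRO / 4                    # atoms in 1 g of He (molar mass 4 g/mol), ~1.5e23
variables = 6 * N                   # 3 position + 3 momentum coordinates per atom
print(f"{variables:.1e}")           # ~9.0e23 variables

storage_pb = 4 * variables / 1e15   # 4-byte floats, expressed in petabytes
print(f"{storage_pb:.1e}")          # ~3.6e9 PB, i.e. about 4e9 PB

ops = 100 * variables               # ~100 ops per variable per collision sweep
years = ops / 100e9 / 3.15e7        # at 100 GFlops; 3.15e7 s per year
print(f"{years:.1e}")               # ~2.9e7 yr, about 3e7 yr
```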

Ensemble

Computing time averages for large systems is infeasible. Instead, imagine a large number of similar systems:
- Prepared identically
- Independent
This ensemble of systems can be used to derive the theoretical properties of a single system.

Probability

Probability is often stated before the fact.
- A priori assertion (theoretical): 50% probability for heads on a coin.
Probability can also reflect the statistics of many events.
- 25% probability that 10 coins show 5 heads; fluctuations occur where the fraction of heads is not 50%.
Probability can also be used after the fact to describe a measurement.
- A posteriori assertion (experimental): the fraction of coins that were heads in a series of samples.

Head Count

Take a set of experimental trials.
- N: the number of trials
- n: the number of values (bins)
- i: a specific trial (1 … N)
- j: a specific value (1 … n)
Use 10 coins and 20 trials. (The table of heads per trial is omitted here.)
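The experiment is easy to simulate. A minimal sketch; the seed is arbitrary, and this generates fresh data rather than reproducing the slide's table:

```python
import random

random.seed(1)
N_TRIALS, N_COINS = 20, 10
# Each trial: toss 10 fair coins and count the heads.
heads = [sum(random.randint(0, 1) for _ in range(N_COINS))
         for _ in range(N_TRIALS)]
print(heads)   # 20 head counts, each between 0 and 10
```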

Distribution

Sorting trials by value forms a distribution.
- The distribution function f(x) counts the occurrences in each bin.
The mean is a measure of the center of the distribution.
- Mean: the mathematical average; coin distribution mean = 4.95
- Median: the midway value; coin median = 5
- Mode: the most frequent value
(Histogram of f(x) vs. x omitted.)
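All three measures of center can be computed directly. A sketch with hypothetical head counts, chosen only so the summary statistics match the slide's values (mean 4.95, median 5) rather than taken from the lecture's table:

```python
import statistics

# Hypothetical data for 20 trials of 10 coins (illustrative, not the lecture's table).
heads = [5, 6, 4, 5, 7, 3, 5, 6, 4, 5, 6, 5, 4, 6, 5, 3, 7, 5, 4, 4]

print(statistics.mean(heads))    # 4.95
print(statistics.median(heads))  # 5.0
print(statistics.mode(heads))    # 5
```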

Probability Distribution

The distribution function sums to the number of trials N:
Σ_x f(x) = N
A probability distribution p normalizes the distribution function by N:
p(x) = f(x)/N, so that Σ_x p(x) = 1
The mean can then be expressed in terms of the probability:
⟨x⟩ = Σ_x x p(x)
(Plot of P(x) vs. x omitted.)
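Normalizing a count distribution into a probability distribution is one line per bin. A sketch using hypothetical head counts chosen to match the slide's mean of 4.95:

```python
from collections import Counter

heads = [5, 6, 4, 5, 7, 3, 5, 6, 4, 5, 6, 5, 4, 6, 5, 3, 7, 5, 4, 4]
N = len(heads)

f = Counter(heads)                      # distribution function: sum(f.values()) == N
p = {x: c / N for x, c in f.items()}    # probability distribution: sums to 1
mean = sum(x * px for x, px in p.items())

print(round(sum(p.values()), 10))       # 1.0
print(round(mean, 10))                  # 4.95
```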

Subsample

Subsamples of the data may differ in their central value. For the first five trials:
- Mean 6.0
- Median 6
- Mode 5 and 6, not unique
Experimental probability depends on the sample; theoretical probability predicts the result for an infinitely large sample. (Trial table omitted.)

Deviation

Individual trials differ from the mean. The deviation is the difference between a trial and the mean:
d_i = x_i − ⟨x⟩
- The mean deviation is zero.
The fluctuation is the mean of the squared deviations:
σ² = ⟨(x − ⟨x⟩)²⟩
- The fluctuation is the variance, the square of the standard deviation.
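Both statements are easy to verify numerically. A sketch using hypothetical head counts (illustrative data, not the lecture's table):

```python
import statistics

heads = [5, 6, 4, 5, 7, 3, 5, 6, 4, 5, 6, 5, 4, 6, 5, 3, 7, 5, 4, 4]
mu = statistics.mean(heads)

deviations = [x - mu for x in heads]
print(round(sum(deviations) / len(heads), 10))   # mean deviation: 0.0

fluctuation = sum(d * d for d in deviations) / len(heads)
print(round(fluctuation, 4))                     # 1.2475
print(round(statistics.pvariance(heads), 4))     # 1.2475 — same as the variance
```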

Correlation

Events may not be random; they may be related to other events.
- "Time" here is measured by trial number.
The correlation function measures the mean of the product of related deviations:
C_τ = ⟨d_i d_{i+τ}⟩
- The autocorrelation at zero lag, C₀, is the variance.
Different variables can also be correlated.

Independent Trials

The autocorrelation within a sample at zero lag is the variance C₀.
The nearest-neighbor correlation C₁ tests for randomness.
- In the coin experiment, C₁ is much less than C₀, so the ratio C₁/C₀ is small.
Periodic systems have a peak in C_τ for some period τ. (Trial table omitted.)
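A sketch of the autocorrelation test on simulated independent trials (the seed and sample size are arbitrary; a larger sample than the slide's 20 trials makes the C₁ ≪ C₀ behavior easier to see):

```python
import random
import statistics

def autocorrelation(xs, lag):
    """Mean product of deviations separated by `lag`; lag 0 gives the variance."""
    mu = statistics.mean(xs)
    d = [x - mu for x in xs]
    products = [d[i] * d[i + lag] for i in range(len(xs) - lag)]
    return sum(products) / len(products)

random.seed(2)
# Simulated independent trials: 1000 throws of 10 fair coins.
heads = [sum(random.randint(0, 1) for _ in range(10)) for _ in range(1000)]

c0 = autocorrelation(heads, 0)
c1 = autocorrelation(heads, 1)
print(round(c0, 2))   # close to the binomial variance Npq = 2.5
print(c1 / c0)        # nearest-neighbor ratio, much smaller than 1
```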

Correlation Measure

For independent trials, the correlation function should peak strongly at lag 0.
- No connection to subsequent events
- No periodic behavior
A contrasting example, from nist.gov: "This sample autocorrelation plot shows that the time series is not random, but rather has a high degree of autocorrelation between adjacent and near-adjacent observations."

Continuous Distribution

Continuously distributed data is treated with integrals.
- The probability is still normalized to 1: ∫ p(x) dx = 1
The mean and variance are given as moments.
- First moment (mean): ⟨x⟩ = ∫ x p(x) dx
- Second moment about the mean (variance): σ² = ∫ (x − ⟨x⟩)² p(x) dx
Correlation uses a time integral.
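The moment integrals can be evaluated numerically. A sketch using the midpoint rule on a uniform distribution over [0, 1] (an illustrative choice, where the exact answers are mean 1/2 and variance 1/12):

```python
def moment(pdf, g, a, b, steps=100_000):
    """Midpoint-rule approximation of the integral of g(x) * pdf(x) over [a, b]."""
    h = (b - a) / steps
    return sum(g(a + (i + 0.5) * h) * pdf(a + (i + 0.5) * h)
               for i in range(steps)) * h

pdf = lambda x: 1.0                              # uniform distribution on [0, 1]
mean = moment(pdf, lambda x: x, 0, 1)            # first moment
var = moment(pdf, lambda x: (x - mean) ** 2, 0, 1)  # second moment about the mean

print(round(mean, 6))   # 0.5
print(round(var, 6))    # 0.083333, i.e. 1/12
```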

Joint Probability

The probabilities of two systems may be related.
The intersection A ∩ B indicates that both conditions are true.
- For independent events: P(A ∩ B) = P(A) P(B)
The union A ∪ B indicates that either condition is true.
- P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
- P(A ∪ B) = P(A) + P(B), if the events are mutually exclusive
(Venn diagram omitted.)

Joint Tosses

Define two classes from the coin toss experiment.
- A = { x < 5 }
- B = { 2 < x < 8 }
The individual probabilities are unions of discrete bins.
- P(A) = 0.25, P(B) = 0.80
- P(A ∪ B) = 0.95
Dependent sets do not follow the product rule.
- P(A ∩ B) = 0.1 ≠ P(A) P(B)
(Plot of P(x) omitted.)

Conditional Probability

The probability of an occurrence within a subset is a conditional probability.
- Probability with respect to the subset: P(A | B) = P(A ∩ B) / P(B)
Using the same subsets as the coin toss example:
- P(A | B) = 0.10 / 0.80 ≈ 0.13
(Venn diagram omitted.)
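The joint and conditional probabilities can be checked numerically. A sketch; the bin probabilities p(x) below are hypothetical, chosen only to reproduce the slide's values P(A) = 0.25, P(B) = 0.80, P(A ∩ B) = 0.10 (the actual distribution is in the omitted plot):

```python
# Hypothetical p(x) for the 10-coin experiment (not the lecture's real data).
p = {0: 0.05, 1: 0.05, 2: 0.05, 3: 0.05, 4: 0.05,
     5: 0.30, 6: 0.25, 7: 0.15, 8: 0.03, 9: 0.01, 10: 0.01}

A = {x for x in p if x < 5}        # A = {x < 5}
B = {x for x in p if 2 < x < 8}    # B = {2 < x < 8}

def P(S):
    """Probability of an event: sum of the probabilities of its bins."""
    return sum(p[x] for x in S)

print(round(P(A), 3))              # 0.25
print(round(P(B), 3))              # 0.8
print(round(P(A & B), 3))          # 0.1  (intersection)
print(round(P(A | B), 3))          # 0.95 (union)
print(round(P(A & B) / P(B), 3))   # 0.125, the conditional P(A|B) ~ 0.13
```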

Combinatorics

The probability that n specific occurrences happen is the product of the individual probabilities; other events do not matter, and non-occurrences carry a separate probability q = 1 − p.
- Exactly n specific events happen, each with probability p: pⁿ
- No events happen except the n specific events: pⁿ q^(N−n)
An arbitrary choice of events requires counting permutations. The number of ways to select n arbitrary events from a pool of N identical types:
N! / [n! (N − n)!]
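The standard library computes this count directly; N = 10 and n = 5 below are illustrative values:

```python
import math

# Number of ways to select n arbitrary events from a pool of N: N!/(n!(N-n)!)
print(math.comb(10, 5))   # 252

# The same value built from factorials:
print(math.factorial(10) // (math.factorial(5) * math.factorial(5)))   # 252
```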

Binomial Distribution

Treat events as a Bernoulli process with discrete trials.
- N separate trials
- Trials are independent
- Each trial has a binary outcome
- The probability is the same for all trials
The general form is the binomial distribution:
P(n) = [N! / (n! (N − n)!)] pⁿ q^(N−n), with q = 1 − p
- The terms are the same as in the binomial expansion of (p + q)^N.
- The probabilities are normalized: Σ P(n) = 1.
(Figure: mathworld.wolfram.com)
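A minimal sketch of the distribution and its normalization, using 10 fair coins as the illustrative Bernoulli process:

```python
import math

def binomial_pmf(n, N, p):
    """P(exactly n successes in N independent trials with success probability p)."""
    return math.comb(N, n) * p**n * (1 - p)**(N - n)

N, p = 10, 0.5                      # e.g. counting heads among 10 fair coins
pmf = [binomial_pmf(n, N, p) for n in range(N + 1)]

print(round(sum(pmf), 10))          # 1.0 — probabilities are normalized
print(binomial_pmf(5, N, p))        # 0.24609375 — the most probable count
```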

Mean and Standard Deviation

The mean μ of the binomial distribution:
μ = Σ_n n P(n) = Np
- To derive it, consider the expansion of (px + q)^N for arbitrary x, differentiate with respect to x, and set x = 1.
The standard deviation σ of the binomial distribution:
σ² = ⟨n²⟩ − μ² = Npq
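Both results can be verified by summing over the distribution directly; a sketch with the illustrative values N = 10, p = 0.5:

```python
import math

N, p = 10, 0.5
q = 1 - p
pmf = [math.comb(N, n) * p**n * q**(N - n) for n in range(N + 1)]

mean = sum(n * P for n, P in enumerate(pmf))            # Σ n P(n)
var = sum((n - mean) ** 2 * P for n, P in enumerate(pmf))

print(round(mean, 10), N * p)       # 5.0 5.0 — mean equals Np
print(round(var, 10), N * p * q)    # 2.5 2.5 — variance equals Npq
```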

Poisson Distribution

Many processes are marked by rare occurrences.
- Large N, small n, small p
In this limit the binomial distribution becomes the Poisson distribution:
P(n) = μⁿ e^(−μ) / n!
- The probability depends on only one parameter, μ = Np.
- It is normalized when summed from n = 0 to ∞.
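The rare-event limit is easy to see numerically. A sketch comparing the two distributions; N = 1000 and p = 0.002 (so μ = 2) are arbitrary illustrative values:

```python
import math

def poisson_pmf(n, mu):
    return mu**n * math.exp(-mu) / math.factorial(n)

def binomial_pmf(n, N, p):
    return math.comb(N, n) * p**n * (1 - p)**(N - n)

# Large N, small p, fixed mu = N*p: the binomial approaches the Poisson.
N, p = 1000, 0.002
mu = N * p
for n in range(5):
    print(n, round(binomial_pmf(n, N, p), 4), round(poisson_pmf(n, mu), 4))
```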

Poisson Properties

The mean and standard deviation are simply related.
- Mean μ = Np; variance σ² = μ, so σ = √μ.
Unlike the binomial distribution, the Poisson function has nonzero values for n > N.

Poisson Away From Zero

The Poisson distribution is based on the mean μ = Np.
- Assumed N ≫ 1 and N ≫ n.
Now also assume n ≫ 1 and μ large, so that P_n is appreciable only over a narrow range.
Let x = n − μ and use Stirling's formula. This generates the normal, or Gaussian, distribution:
P(x) = e^(−x²/2μ) / √(2πμ)
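The convergence can be checked numerically; a sketch comparing the Poisson distribution at large mean with its Gaussian limit (μ = 100 is an arbitrary illustrative choice, and logs are used to avoid overflow in n!):

```python
import math

mu = 100

def poisson(n):
    # mu^n e^(-mu) / n!, computed in log space via lgamma(n+1) = ln(n!)
    return math.exp(n * math.log(mu) - mu - math.lgamma(n + 1))

def gauss(x):
    # Gaussian limit with variance mu, centered at x = n - mu = 0
    return math.exp(-x * x / (2 * mu)) / math.sqrt(2 * math.pi * mu)

for n in (90, 100, 110):
    print(n, round(poisson(n), 5), round(gauss(n - mu), 5))
```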

Normal Distribution

The full normal distribution separates the mean μ and standard deviation σ as parameters:
P(x) = e^(−(x−μ)²/2σ²) / (σ√(2π))
Tables provide the integral of the distribution function. Useful benchmarks:
- P(|x − μ| < 1σ) = 0.683
- P(|x − μ| < 2σ) = 0.954
- P(|x − μ| < 3σ) = 0.997
(Plot of P(x) vs. x omitted.)
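The tabulated benchmarks follow from the error function, since P(|x − μ| < kσ) = erf(k/√2):

```python
import math

# Probability that a normal variate lies within k standard deviations of the mean.
for k in (1, 2, 3):
    print(k, round(math.erf(k / math.sqrt(2)), 3))
# 1 0.683
# 2 0.954
# 3 0.997
```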