R.Kass/F02 P416 Lecture 1, p. 1

Lecture 1: Probability and Statistics

Introduction:
- The understanding of many physical phenomena depends on statistical and probabilistic concepts:
  Statistical Mechanics (the physics of systems composed of many parts: gases, liquids, solids)
    1 mole of anything contains 6×10^23 particles (Avogadro's number).
    Even though the force between particles is given by Newton's laws, it is impossible to keep track of all 6×10^23 particles even with the fastest computer imaginable.
    We must resort to learning about the group properties of all the particles: use the partition function to calculate the average energy, entropy, pressure... of a system.
  Quantum Mechanics (physics at the atomic or smaller scale)
    The wavefunction is a probability amplitude: we talk about the probability of an electron being located at (x, y, z) at a certain time.
- Our understanding/interpretation of experimental data depends on statistical and probabilistic concepts:
    How do we extract the best value of a quantity from a set of measurements?
    How do we decide if our experiment is consistent/inconsistent with a given theory?
    How do we decide if our experiment is internally consistent?
    How do we decide if our experiment is consistent with other experiments?
  In this course we will concentrate on these experimental issues!

R.Kass/F02 P416 Lecture 1, p. 2

How do we define probability?

Definition of probability by example:
- Suppose we have N trials and a specified event occurs r times.
  Example: the trial could be rolling a die and the event could be rolling a 6.
  Define the probability P of an event E occurring as:
    P(E) = r/N  as N → ∞
  Examples:
    Six-sided die: P(6) = 1/6; for an honest die, P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6.
    Coin toss: P(heads) = P(tails) = 0.5.
    P(heads) should approach 0.5 the more times you toss the coin; for a single coin toss we can never get P(heads) = 0.5!
- By definition, probability P is a non-negative real number bounded by 0 ≤ P ≤ 1.
    If P = 0 the event never occurs; if P = 1 the event always occurs.
    Let A and B be subsets of the sample space S; then P(A) ≥ 0 and P(B) ≥ 0.
    Events are independent if: P(A ∩ B) = P(A)·P(B)
      Coin tosses are independent events: the result of the next toss does not depend on the previous toss.
    Events are mutually exclusive (disjoint) if: P(A ∩ B) = 0, in which case P(A ∪ B) = P(A) + P(B)
      In tossing a coin we either get a head or a tail.
    The sum (or integral) of all probabilities, if they are mutually exclusive, must equal 1.
  (∩ denotes intersection; ∪ denotes union.)
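To see the frequentist definition P(E) = r/N in action, here is a minimal simulation (an addition to the slides; the sample sizes are arbitrary) that tosses a fair coin N times and prints r/N, which should approach 0.5 as N grows:

```python
import random

# Estimate P(heads) = r/N for increasing numbers of trials N.
# The frequentist definition says r/N should approach 0.5 as N grows.
for n in [10, 100, 1000, 10000, 100000]:
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"N = {n:6d}   r/N = {heads / n:.4f}")
```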

R.Kass/F02 P416 Lecture 1, p. 3

- Probability can be a discrete or a continuous variable.
- Discrete probability: P can take on certain values only.
  Examples:
    Tossing a six-sided die: P(x_i) = P_i, where x_i = 1, 2, 3, 4, 5, 6 and P_i = 1/6 for all x_i.
    Tossing a coin: only 2 choices, heads or tails.
  (Notation: x_i is called a random variable.)
  For both of the above discrete examples (and in general), when we sum over all mutually exclusive possibilities:
    Σ_i P(x_i) = 1
- Continuous probability: P can be any number between 0 and 1.
  Define a "probability density function" (pdf) f(x), with x a continuous variable.
  The probability for x to be in the range a ≤ x ≤ b is the "area under the curve":
    P(a ≤ x ≤ b) = ∫_a^b f(x) dx
  Just like the discrete case, the sum of all probabilities must equal 1; we say that f(x) is normalized to one:
    ∫_{−∞}^{+∞} f(x) dx = 1
  The probability for x to be exactly some number is zero, since:
    ∫_a^a f(x) dx = 0
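As an illustration of the "area under the curve" idea, the sketch below (an addition, assuming SciPy is available) integrates the standard Gaussian pdf numerically to check its normalization and to find the probability of landing in −1 ≤ x ≤ 1:

```python
import math
from scipy.integrate import quad

# Standard Gaussian pdf: f(x) = exp(-x^2 / 2) / sqrt(2*pi)
def f(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

norm, _ = quad(f, -math.inf, math.inf)  # normalization: should be 1
p, _ = quad(f, -1, 1)                   # area under the curve between -1 and 1

print(f"integral of f over all x = {norm:.6f}")
print(f"P(-1 <= x <= 1) = {p:.4f}")     # ~0.6827
```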

R.Kass/F02 P416 Lecture 1, p. 4

- Examples of some common P(x)'s and f(x)'s:

    Discrete = P(x):     binomial, Poisson
    Continuous = f(x):   uniform (i.e. constant), Gaussian, exponential, chi square

- How do we describe a probability distribution?
    Mean, mode, median, and variance.
    For a continuous distribution, these quantities are defined by:
      mean:     μ = ∫ x f(x) dx
      mode:     the x at which f(x) is a maximum (∂f/∂x = 0)
      median:   the x_med for which ∫_{−∞}^{x_med} f(x) dx = 1/2
      variance: σ² = ∫ (x − μ)² f(x) dx
    For a discrete distribution, the mean and variance are defined by:
      mean:     μ = (1/n) Σ_i x_i
      variance: σ² = (1/n) Σ_i (x_i − μ)²
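A quick illustrative check (added here, not from the slides): for one roll of a fair die, μ = Σ x_i P_i with equal probabilities P_i = 1/n reduces to the (1/n) Σ x_i form above, giving a mean of 3.5 and a variance of 35/12:

```python
# Mean and variance of one roll of a fair six-sided die.
# With equal probabilities P_i = 1/6, sum(x_i * P_i) = (1/n) * sum(x_i).
faces = [1, 2, 3, 4, 5, 6]
p = 1 / 6

mu = sum(x * p for x in faces)                # 3.5
var = sum((x - mu) ** 2 * p for x in faces)   # 35/12 ~ 2.9167

print(f"mean = {mu}, variance = {var:.4f}")
```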

R.Kass/F02 P416 Lecture 1, p. 5

- Some continuous pdfs:
  For a Gaussian pdf, the mean, mode, and median all occur at the same x.
  For most pdfs, the mean, mode, and median are in different places.

R.Kass/F02 P416 Lecture 1, p. 6

- Calculation of mean and variance:
  Example: a discrete data set consisting of three numbers, {1, 2, 3}.
  The average (μ) is just:
    μ = (1/n) Σ_i x_i = (1 + 2 + 3)/3 = 2
  Complication: suppose some measurements are more precise than others. Let each measurement x_i have a weight w_i associated with it; then the "weighted average" is:
    μ = Σ_i w_i x_i / Σ_i w_i
  The variance (σ²), or average squared deviation from the mean, is just:
    σ² = (1/n) Σ_i (x_i − μ)²
  (σ is called the standard deviation; the variance describes the width of the pdf!)
  Note: the n in the denominator would be n − 1 if we determined the average (μ) from the data itself.
  We can rewrite the above expression by expanding the summations:
    σ² = (1/n) Σ_i x_i² − μ²
  This is sometimes written as:
    σ² = ⟨x²⟩ − ⟨x⟩²
  with ⟨ ⟩ denoting the average of whatever is inside the brackets.

R.Kass/F02 P416 Lecture 1 7 using the definition of  from above we have for our example of {1,2,3}: The case where the measurements have different weights is more complicated: Here  is the weighted mean If we calculated  from the data,  2 gets multiplied by a factor n/(n  1). u example: a continuous probability distribution, This “pdf” has two modes! It has same mean and median, but differ from the mode(s).

R.Kass/F02 P416 Lecture 1, p. 8

  For continuous probability distributions, the mean, mode, and median are calculated using either integrals or derivatives:
    mean:   μ = ∫ x f(x) dx
    mode:   solve ∂f/∂x = 0 for x
    median: solve ∫_{−∞}^{x_med} f(x) dx = 1/2 for x_med
- Example: the Gaussian distribution function, a continuous probability distribution.
  (In this class you should feel free to use a table of integrals and/or derivatives.)
  Note: f(x) = sin²x is not a true pdf, since it is not normalized:
    ∫_0^{2π} sin²x dx = π ≠ 1
  f(x) = (1/π) sin²x on [0, 2π] is a normalized pdf.
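A numerical check of these statements (an addition, assuming SciPy): the integrals below confirm that (1/π) sin²x is normalized on [0, 2π] and that its mean and median both sit at x = π, even though the modes are at π/2 and 3π/2:

```python
import math
from scipy.integrate import quad

# Normalized pdf f(x) = (1/pi) * sin^2(x) on [0, 2*pi].
def f(x):
    return math.sin(x) ** 2 / math.pi

norm, _ = quad(f, 0, 2 * math.pi)                    # should be 1
mean, _ = quad(lambda x: x * f(x), 0, 2 * math.pi)   # should be pi, by symmetry
half, _ = quad(f, 0, math.pi)                        # area up to x = pi

print(f"normalization = {norm:.6f}")
print(f"mean = {mean:.6f}  (pi = {math.pi:.6f})")
print(f"P(x <= pi) = {half:.6f}")  # ~0.5, so the median is at x = pi
```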

R.Kass/F02 P416 Lecture 1, p. 9

Accuracy and Precision:
- Accuracy: the accuracy of an experiment refers to how close the experimental measurement is to the true value of the quantity being measured.
- Precision: this refers to how well the experimental result has been determined, without regard to the true value of the quantity being measured.
    Just because an experiment is precise does not mean it is accurate!!
    Example: measurements of the neutron lifetime over the years show a steady increase in precision, but are any of these measurements accurate?
    [Figure: various measurements of the neutron lifetime over the years; the size of each bar reflects the precision of the experiment.]

R.Kass/F02 P416 Lecture 1, p. 10

Measurement Errors (or uncertainties)
- Use results from probability and statistics as a way of calculating how "good" a measurement is.
  The most common quality indicator is the relative precision:
    relative precision = [uncertainty of measurement] / measurement
  Example: we measure a table to be 10 inches with an uncertainty of 1 inch.
    relative precision = 1/10 = 0.1, or 10% (% relative precision)
  The uncertainty in a measurement is usually the square root of the variance:
    σ = standard deviation
  σ is usually calculated using the technique of "propagation of errors".

Statistical and Systematic Errors
- Results from experiments are often presented as:
    N ± XX ± YY
  N: the value of the quantity measured (or determined) by the experiment.
  XX: the statistical error, usually assumed to be from a Gaussian distribution.
    With the assumption of Gaussian statistics we can say (calculate) something about how well our experiment agrees with other experiments and/or theories.
    Expect a ~68% chance that the true value is between N − XX and N + XX.
  YY: the systematic error. Hard to estimate, and the distribution of errors is usually not known.
  Examples:
    mass of proton = … ± … GeV (only the statistical error given)
    mass of W boson = 80.8 ± 1.5 ± 2.4 GeV (both statistical and systematic errors given)
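The "~68% within one standard deviation" statement is easy to verify by simulation; a minimal sketch (an addition to the slides, with arbitrary values for the mean and sample count) follows, along with the relative precision of the table example:

```python
import random

# Simulate Gaussian measurements and count the fraction that land
# within one standard deviation of the mean (expected ~68%).
mu, sigma, n = 10.0, 1.0, 100_000
inside = sum(abs(random.gauss(mu, sigma) - mu) <= sigma for _ in range(n))
print(f"fraction within 1 sigma = {inside / n:.4f}")  # ~0.6827

# Relative precision for the table example: 1 inch on a 10 inch measurement.
print(f"relative precision = {1 / 10:.1%}")           # 10.0%
```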

R.Kass/F02 P416 Lecture 1, p. 11

What's the difference between statistical and systematic errors?
- Statistical errors are "random" in the sense that if we repeat the measurement enough times, XX → 0.
- Systematic errors do not → 0 with repetition.
  Examples of sources of systematic errors:
    a voltmeter not calibrated properly
    a ruler not the length we think it is (a meter stick might really be < 1 meter!)
  Because of systematic errors, an experimental result can be precise, but not accurate!
- How do we combine systematic and statistical errors to get one estimate of precision? BIG PROBLEM!
  Two choices:
    σ_tot = XX + YY             (add them linearly)
    σ_tot = (XX² + YY²)^{1/2}   (add them in quadrature)
- Some other ways of quoting experimental results:
    lower limit: "the mass of particle X is > 100 GeV"
    upper limit: "the mass of particle X is < 100 GeV"
    asymmetric errors: mass of particle X = … (different upward and downward uncertainties)
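A small sketch (added here) shows how different the two choices can be, using the W boson errors quoted on the previous page:

```python
import math

# Combine the W boson errors from above (80.8 +/- 1.5 +/- 2.4 GeV) both ways.
stat, syst = 1.5, 2.4

linear = stat + syst                        # 3.9 GeV
quadrature = math.sqrt(stat**2 + syst**2)   # ~2.83 GeV

print(f"linear sum:    {linear:.2f} GeV")
print(f"in quadrature: {quadrature:.2f} GeV")
```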