
Probability and Statistics (880.P20, Winter 2006, Richard Kass)

Many of the processes involved in the detection of particles are statistical in nature:
- the number of ion pairs created when a proton goes through 1 cm of gas
- the energy lost by an electron going through 1 mm of lead

The understanding and interpretation of all experimental data depend on statistical and probabilistic concepts ("The result of the experiment was inconclusive so we had to use statistics"):
- How do we extract the best value of a quantity from a set of measurements?
- How do we decide if our experiment is consistent/inconsistent with a given theory?
- How do we decide if our experiment is internally consistent?
- How do we decide if our experiment is consistent with other experiments?
- How do we decide if we have a signal (i.e. evidence for a new particle)?

Our pentaquark example from the SPring-8 (LEPS) experiment (a bump in the K+n system attributed to a five-quark state, quark content uudds̄): signal of 19 events, significance 4.6σ. Assuming Gaussian statistics, the probability of a 4.6σ effect is ~4×10⁻⁶.
What are the authors trying to say here? If this bump is accidental, then the accident rate is 1 in 4 million; or, if I repeated the experiment 4 million times, I would get a bump this big or bigger.
What do the authors want you to think? Since the accident rate is so low it must not be an accident, therefore it is physics!
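
As a quick cross-check of the quoted number (my addition, not part of the original slides), the Gaussian tail probability beyond 4.6σ can be computed directly with SciPy; the slide's ~4×10⁻⁶ corresponds to the two-sided tail.

    # Probability of a fluctuation at least as large as 4.6 sigma, assuming Gaussian statistics.
    from scipy.stats import norm

    n_sigma = 4.6
    p_one_sided = norm.sf(n_sigma)        # survival function: 1 - CDF, ~2.1e-06
    p_two_sided = 2 * p_one_sided         # ~4.2e-06, i.e. roughly "1 in 4 million"
    print(p_one_sided, p_two_sided)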

Probability & Statistics & Reality

Sometimes it is not a question of statistical significance! Again, the pentaquark state Θ⁺(1540) gives a great example. Consider the CLAS experiment at JLab (the 2003/2004 data versus the 2005 g11 run):
- 2003/4: report a 7.8σ effect (probability ~6×10⁻¹⁵ according to MATHEMATICA)
- 2005: report NO signal! (a better experiment)
What size signal should we expect?
Lesson: this is not a statistics issue, but one of experiment design and implementation.

How Do We Define Probability?

Definition of probability by example ("empirical"): suppose we have N trials and a specified event occurs r times.
Example: the trial could be rolling a die and the event could be rolling a 6.
Define the probability P of an event E occurring as P(E) = r/N as N → ∞.
Examples:
- six-sided die: P(6) = 1/6; for an honest die, P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6
- coin toss: P(heads) = P(tails) = 0.5. P(heads) should approach 0.5 the more times you toss the coin; for a single coin toss we can never get P(heads) = 0.5!

By definition, probability P is a non-negative real number bounded by 0 ≤ P ≤ 1:
- if P = 0 the event never occurs
- if P = 1 the event always occurs
Let A and B be subsets of the sample space S; then P(A) ≥ 0 and P(B) ≥ 0 (∩ denotes intersection, ∪ denotes union).
- Events are independent if P(A ∩ B) = P(A)P(B). Coin tosses are independent events: the result of the next toss does not depend on the previous toss.
- Events are mutually exclusive (disjoint) if P(A ∩ B) = 0, in which case P(A ∪ B) = P(A) + P(B). In tossing a coin we either get a head or a tail.
The sum (or integral) of all probabilities, if they are mutually exclusive, must equal 1.

Notation: p(x) = probability distribution function (pdf).
One can find four definitions of probability: mathematical, empirical, objective, and subjective (see STATISTICS by Barlow). The limiting-frequency r/N definition above is the "objective"/"empirical" one; the axiomatic statements correspond to "mathematical" probability.
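
A quick numerical illustration of the empirical definition (my own sketch, not from the slides): simulate coin tosses and watch r/N approach 0.5 as N grows.

    # Empirical probability: the fraction of heads approaches 0.5 as N -> infinity.
    import random

    random.seed(12345)
    for n in (10, 1_000, 100_000, 1_000_000):
        heads = sum(random.random() < 0.5 for _ in range(n))
        print(f"N = {n:>9,d}   r/N = {heads / n:.4f}")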

Two Types of Probability

Probability can be associated with a discrete or a continuous variable.

Discrete probability: P can take certain values only. Examples:
- tossing a six-sided die: P(x_i) = P_i, with x_i = 1, 2, 3, 4, 5, 6 and P_i = 1/6 for all x_i
- tossing a coin: only 2 choices, heads or tails
For both of the above discrete examples (and in general), when we sum over all mutually exclusive possibilities: Σ_i P(x_i) = 1.
Notation: x_i is called a random variable.

Continuous probability: P can be any number between 0 and 1. Define a "probability density function" (pdf), f(x), with x a continuous variable. The probability for x to be in the range a ≤ x ≤ b is
  P(a ≤ x ≤ b) = ∫_a^b f(x) dx   (probability = "area under the curve").
Just as in the discrete case, the sum of all probabilities must equal 1: ∫ f(x) dx = 1 over the full range of x; we say that f(x) is normalized to one. The probability for x to be exactly some number is zero, since ∫_a^a f(x) dx = 0.

Note: in the above example the pdf depends on only one variable, x. In general the pdf can depend on many variables, i.e. f = f(x, y, z, ...); in these cases the probability is calculated from a multi-dimensional integration.
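
A small sketch of "probability = area under the curve" (my addition; the exponential pdf is an assumed example, not one from this slide):

    # Integrate a normalized pdf numerically to get probabilities.
    import numpy as np
    from scipy.integrate import quad

    f = lambda x: np.exp(-x)              # f(x) = exp(-x) for x >= 0

    norm, _ = quad(f, 0, np.inf)          # should be 1: f is normalized
    p_ab, _ = quad(f, 1.0, 2.0)           # P(1 <= x <= 2) = exp(-1) - exp(-2) ~ 0.233
    print(norm, p_ab)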

Some Common Probability Distributions

Examples of some common P(x)'s and f(x)'s:
- Discrete P(x): binomial, Poisson
- Continuous f(x): uniform (i.e. f(x) = constant), Gaussian, exponential, chi-square

How do we describe a probability distribution? By its mean, mode, median, and variance.
For a continuous distribution these quantities are defined by:
  mean: μ = ∫ x f(x) dx
  mode: the x at which f(x) is maximum
  median: the x for which ∫_{-∞}^{x} f(x') dx' = 1/2
  variance: σ² = ∫ (x − μ)² f(x) dx
For a discrete distribution the mean and variance are defined by:
  μ = Σ_i x_i P(x_i)   and   σ² = Σ_i (x_i − μ)² P(x_i)
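
A quick check of these descriptors for two of the distributions named above (my own illustration, using SciPy's standard parameterizations):

    # Mean, median, variance for a continuous and a discrete distribution.
    from scipy import stats

    expon = stats.expon()                     # f(x) = exp(-x), x >= 0
    print(expon.mean(), expon.median(), expon.var())   # 1.0, ln(2) ~ 0.693, 1.0

    pois = stats.poisson(mu=3.0)              # discrete example
    print(pois.mean(), pois.var())            # 3.0, 3.0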

Some Continuous Probability Distributions

[Figure: plots of the chi-square distribution and the Student t distribution; for the Student t, ν = ∞ corresponds to a Gaussian and ν = 1 to the Cauchy (Breit-Wigner) distribution.]

For a Gaussian pdf the mean, mode, and median are all at the same x. For many pdfs the mean, mode, and median are in different places.
Remember: probability is the area under these curves! For many pdfs the integral cannot be done in closed form; use a table to calculate the probability.
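
Today the "table lookup" is usually done by a library; a minimal sketch (my addition) of the area under some of these curves:

    # Areas under a few of the pdfs above.
    from scipy import stats

    print(stats.norm.cdf(1) - stats.norm.cdf(-1))   # Gaussian: P(-1 < x < 1) ~ 0.683
    print(stats.chi2(df=5).sf(11.07))               # chi-square, 5 dof: P(chi2 > 11.07) ~ 0.05
    print(stats.t(df=1).cdf(1.0))                   # Student t, nu = 1 (Cauchy): ~ 0.75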

Uniform Distribution and Random Numbers

What is a uniform probability distribution p(x)?
  p(x) = constant (c) for a ≤ x ≤ b
  p(x) = 0 everywhere else
Therefore p(x₁)dx₁ = p(x₂)dx₂ if dx₁ = dx₂: equal intervals give equal probabilities.
For a uniform distribution with a = 0, b = 1 we have p(x) = 1.

What is a random number generator? A number picked at random from a uniform distribution with limits [0,1]. All major computer languages (FORTRAN, C) come with a random number generator; in FORTRAN: RAN(iseed).
The following FORTRAN program generates 5 random numbers:

      iseed=12345
      do i=1,5
        y=ran(iseed)      ! uniform random number between 0 and 1
        type *, y         ! print it (DEC-style; standard Fortran would use print *, y)
      enddo
      end

If we generate "a lot" of random numbers, all equal intervals should contain about the same number of them. For example, generate 10⁶ random numbers; expect ~10⁵ of them in [0.0, 0.1] and ~10⁵ in [0.45, 0.55].
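
The same check in Python (a sketch I have added, not from the slides): generate 10⁶ uniform random numbers and count how many land in two equal-width intervals.

    # Uniform random numbers: equal intervals should contain ~equal counts.
    import numpy as np

    rng = np.random.default_rng(12345)
    x = rng.random(1_000_000)                 # uniform in [0, 1)

    print(np.sum((x >= 0.0) & (x < 0.1)))     # expect ~100,000
    print(np.sum((x >= 0.45) & (x < 0.55)))   # expect ~100,000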

Uniform Random Numbers & Monte Carlo

The uniform pdf is the basis of all Monte Carlo calculations! The Monte Carlo method is commonly used to simulate experiments. A Google search yielded 3,960,000 hits for "Monte Carlo method". From the first reference:
"The Monte Carlo method provides approximate solutions to a variety of mathematical problems by performing statistical sampling experiments on a computer. The method applies to problems with no probabilistic content as well as to those with inherent probabilistic structure. Among all numerical methods that rely on N-point evaluations in M-dimensional space to produce an approximate solution, the Monte Carlo method has absolute error of estimate that decreases as N^(-1/2) whereas, in the absence of exploitable special structure, all others have errors that decrease as N^(-1/M) at best."
"The method is called after the city in the Monaco principality, because of a roulette, a simple random number generator. The name and the systematic development of Monte Carlo methods dates from about 1944."
"The real use of Monte Carlo methods as a research tool stems from work on the atomic bomb during the second world war. This work involved a direct simulation of the probabilistic problems concerned with random neutron diffusion in fissile material."
Basically, it is a way to do really complicated integrals!

Exercise: a certain molecule always has a rectangular shape, but the length of each side varies uniformly between 0.5 and 1 Å. Calculate the probability that the area of a molecule is ≥ 0.5 Å². Now suppose I want the probability that the volume is ≥ 0.5 Å³?
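
A minimal Monte Carlo sketch of the rectangle exercise (my own illustration; it assumes the question is P(area ≥ 0.5 Å²) with each side independent and uniform on [0.5, 1] Å):

    # Monte Carlo estimate of P(area >= 0.5) and P(volume >= 0.5).
    import numpy as np

    rng = np.random.default_rng(1)
    n = 1_000_000
    a = 0.5 + 0.5 * rng.random(n)             # side 1, uniform in [0.5, 1.0] angstrom
    b = 0.5 + 0.5 * rng.random(n)             # side 2, uniform in [0.5, 1.0] angstrom

    print(np.mean(a * b >= 0.5))              # P(area >= 0.5 A^2)

    c = 0.5 + 0.5 * rng.random(n)             # third side, for the volume question
    print(np.mean(a * b * c >= 0.5))          # P(volume >= 0.5 A^3)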

Uniform Random Numbers & Monte Carlo (continued)

Given the uniform pdf we can generate all other pdfs! For example: if RAN is uniform in (0,1), then a + (b−a)·RAN is uniform in (a,b).

"Inversion": suppose we want to generate random numbers according to a pdf p(x) starting from our uniform random numbers. Let r₀ be a random number uniform in (0,1); we want to find the x that satisfies
  ∫_{-∞}^{x} p(x') dx' = r₀
Whether or not we can actually invert this equation depends on whether or not p(x) can be integrated in closed form.
- It works for the exponential pdf: p(x) = e^(−x) for x ≥ 0 gives x = −ln(1 − r₀).
- It does not work for the Gaussian, since the integral cannot be done in closed form. BUT there are lots of other clever ways of generating pdfs, e.g. for the Gaussian: g = sin(2πr₁)·(−2 ln r₂)^(1/2), with r₁, r₂ uniform in (0,1).

When all else fails, one can use "acceptance-rejection":
1) normalize p(x): use p(x)/p_max, so the maximum of the function is 1
2) pick a random number to represent x and calculate p(x)/p_max
3) pick another random number y
4) if y < p(x)/p_max accept x, otherwise reject it
5) repeat 2)-4) lots of times...
Problem: this algorithm can be very inefficient.
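
A short sketch of both techniques (my addition): inversion for the exponential pdf, and acceptance-rejection for the sin² pdf that appears later in these slides.

    # Inversion: exponential pdf p(x) = exp(-x), x >= 0  ->  x = -ln(1 - r)
    import numpy as np

    rng = np.random.default_rng(0)
    r = rng.random(100_000)
    x_exp = -np.log(1.0 - r)
    print(x_exp.mean())                       # mean of exp(-x) is 1

    # Acceptance-rejection for p(x) = (1/pi) sin^2(x) on [0, 2*pi]  (p_max = 1/pi)
    def accept_reject(n):
        out = []
        while len(out) < n:
            x = 2 * np.pi * rng.random()      # candidate x, uniform on [0, 2*pi]
            y = rng.random()                  # uniform on [0, 1]
            if y < np.sin(x) ** 2:            # p(x)/p_max = sin^2(x)
                out.append(x)                 # accept; otherwise reject
        return np.array(out)

    print(accept_reject(5))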

Discrete Probability Distributions

Calculation of mean and variance. Example: a discrete data set consisting of three numbers, {1, 2, 3}. The average (μ) is just:
  μ = (1/n) Σ_i x_i = (1 + 2 + 3)/3 = 2
Complication: suppose some measurements are more precise than others. Let each measurement x_i have a weight w_i associated with it; then the "weighted average" is:
  μ = Σ_i w_i x_i / Σ_i w_i
(Real-life example: 3 measurements of the branching fraction of B⁻ → D⁰K*⁻.)

The variance (σ²), or average squared deviation from the mean, is:
  σ² = (1/n) Σ_i (x_i − μ)²
σ is called the standard deviation. The variance describes the width of the pdf!
Rewrite the above expression by expanding the summations:
  σ² = (1/n) Σ_i x_i² − μ²
This is sometimes written as ⟨x²⟩ − ⟨x⟩², with ⟨ ⟩ denoting the average of whatever is in the brackets.
Note: the n in the denominator would be n − 1 if we determined the average (μ) from the data itself.
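
A numerical version of the {1, 2, 3} example plus a weighted average (my own illustration; the weights are hypothetical):

    # Mean and variance of {1, 2, 3}, and a weighted average.
    import numpy as np

    x = np.array([1.0, 2.0, 3.0])
    mu = x.mean()
    var = np.mean((x - mu) ** 2)              # n in the denominator
    print(mu, var)                            # 2.0, 2/3

    w = np.array([1.0, 4.0, 1.0])             # hypothetical weights, e.g. w_i = 1/sigma_i^2
    print(np.sum(w * x) / np.sum(w))          # weighted average = 2.0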

Discrete & Continuous Probability Distributions

Using the definition of σ² from above, we have for our example of {1, 2, 3}:
  σ² = (1/3)[(1−2)² + (2−2)² + (3−2)²] = 2/3
The case where the measurements have different weights is more complicated:
  σ² = Σ_i w_i (x_i − μ)² / Σ_i w_i
where μ here is the weighted mean. If we calculated μ from the data, σ² gets multiplied by a factor n/(n−1).

Example of a continuous probability distribution: f(x) = sin²x (taking 0 ≤ x ≤ 2π).
- f(x) = sin²x is not a true pdf since it is not normalized! f(x) = (1/π) sin²x is a normalized pdf (c = 1/π).
- This "pdf" has two modes! It has the same mean and median, but they differ from the mode(s).
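
A quick numerical check of these statements (my addition, assuming the interval is 0 ≤ x ≤ 2π, which is what makes c = 1/π the right normalization):

    # Check that f(x) = (1/pi) sin^2(x) is normalized on [0, 2*pi], and find its mean.
    import numpy as np
    from scipy.integrate import quad

    f = lambda x: np.sin(x) ** 2 / np.pi
    print(quad(f, 0, 2 * np.pi)[0])                      # 1.0  (normalized)
    print(quad(lambda x: x * f(x), 0, 2 * np.pi)[0])     # mean = pi (also the median, by symmetry)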

Probability, Set Theory and Stuff

The relationships and results from set theory are essential to the understanding of probability. Below are some definitions and examples that illustrate the connection between set theory, probability, and statistics.
We define an experiment as a process that generates "observations" and a sample space (S) as the set of all possible outcomes of the experiment:
- simple event: only one possible outcome
- compound event: more than one outcome

As an example of simple and compound events, consider particles (e.g. protons, neutrons) made of u ("up"), d ("down"), and s ("strange") quarks. The u quark has electric charge Q = 2/3|e| (e = charge of the electron) while the d and s quarks have charge −1/3|e|. Let the experiment be the ways we can combine 3 quarks to make a Q = 0, 1, or 2 state:
  Event A: Q = 0 {ssu, ddu, sdu}   (note: a neutron is a ddu state)
  Event B: Q = 1 {suu, duu}        (note: a proton is a duu state)
  Event C: Q = 2 {uuu}
For this example, events A and B are compound while event C is simple.

The following definitions from set theory are used all the time in the discussion of probability. Let A and B be events in a sample space S.
Union: the union of A and B (A ∪ B) is the event consisting of all outcomes in A or B.
Intersection: the intersection of A and B (A ∩ B) is the event consisting of all outcomes in A and B.
Complement: the complement of A (A′) is the set of outcomes in S not contained in A.
Mutually exclusive: if A and B have no outcomes in common, they are mutually exclusive.
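
A tiny enumeration of this sample space (my own sketch; note it lists all charge states, not just Q = 0, 1, 2):

    # Enumerate 3-quark combinations of {u, d, s} and group them by total charge.
    from itertools import combinations_with_replacement
    from fractions import Fraction

    charge = {"u": Fraction(2, 3), "d": Fraction(-1, 3), "s": Fraction(-1, 3)}

    events = {}
    for combo in combinations_with_replacement("uds", 3):
        q = sum(charge[c] for c in combo)
        events.setdefault(q, []).append("".join(combo))

    for q in sorted(events, reverse=True):
        print(f"Q = {q}: {events[q]}")
    # Q = 2: ['uuu']; Q = 1: ['uud', 'uus']; Q = 0: ['udd', 'uds', 'uss']; Q = -1: ['ddd', 'dds', 'dss', 'sss']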

Probability, Set Theory and Stuff (continued)

Returning to our example of particles containing 3 quarks ("baryons"):
- The event consisting of charged particles with Q = 1, 2 is the union of B and C: B ∪ C.
- The events A, B, C are mutually exclusive since they do not have any particles in common.
A common and useful way to visualize union, intersection, and mutual exclusivity is a Venn diagram of sets A and B defined in the space S. [The slide shows four Venn diagrams: A and B in S; the intersection A ∩ B; the union A ∪ B; and A, B mutually exclusive (non-overlapping).]

The axioms of probability (P):
a) For any event A, P(A) ≥ 0 (no negative probabilities allowed).
b) P(S) = 1.
c) If A₁, A₂, ..., Aₙ is a collection of mutually exclusive events, then P(A₁ ∪ A₂ ∪ ... ∪ Aₙ) = Σᵢ P(Aᵢ) (the collection can be infinite, n = ∞).

From the above axioms we can prove the following useful propositions:
a) For any event A: P(A) = 1 − P(A′).
b) If A and B are mutually exclusive, then P(A ∩ B) = 0.
c) For any two events A and B: P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
Propositions b and c are "obvious" from their Venn diagrams.

Probability, Set Theory and Stuff: an Example

Everyone likes pizza. Assume the probability of having pizza for lunch is 40%, the probability of having pizza for dinner is 70%, and the probability of having pizza for lunch and dinner is 30%. Also, this person always skips breakfast. We can recast this example using:
  P(A) = probability of having pizza for lunch = 40%
  P(B) = probability of having pizza for dinner = 70%
  P(A ∩ B) = 30% (pizza for lunch and dinner)

1) What is the probability that pizza is eaten at least once a day?
The key words are "at least once"; this means we want the union of A and B. Using proposition c):
  P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = 0.4 + 0.7 − 0.3 = 0.8

2) What is the probability that pizza is not eaten on a given day?
Not eating pizza (Z′) is the complement of eating pizza (Z), so P(Z) + P(Z′) = 1 (proposition a):
  P(Z′) = 1 − P(Z) = 1 − 0.8 = 0.2

3) What is the probability that pizza is eaten only once a day?
This can be visualized with the Venn diagram: we need to exclude the overlap (intersection) region. The non-overlapping "lunch" area is pizza for lunch but no pizza for dinner; the non-overlapping "dinner" area is pizza for dinner but no pizza for lunch.
  P(A ∪ B) − P(A ∩ B) = 0.8 − 0.3 = 0.5

Conditional Probability

Frequently we must calculate a probability assuming something else has already occurred. This is called conditional probability. Here is an example:
Suppose a day of the week is chosen at random. The probability that the day is Thursday is 1/7: P(Thursday) = 1/7.
Suppose we also know the day is a weekday. Now the probability is conditional: P(Thursday|weekday) = 1/5.
The notation P(A|B) means the probability of A given that B has occurred (here, the probability of it being Thursday given that it is a weekday).

Formally, we define the conditional probability of A given that B has occurred as:
  P(A|B) = P(A ∩ B)/P(B)
We can use this definition to calculate the intersection of A and B:
  P(A ∩ B) = P(A|B) P(B)
For the case where the Aᵢ's are both mutually exclusive and exhaustive, we have (the law of total probability):
  P(B) = Σᵢ P(B|Aᵢ) P(Aᵢ)
For our example let B = the day is a Thursday, A₁ = weekday, A₂ = weekend; then:
  P(Thursday) = P(Thursday|weekday)P(weekday) + P(Thursday|weekend)P(weekend) = (1/5)(5/7) + (0)(2/7) = 1/7

Bayes's Theorem

Bayes's theorem relates conditional probabilities. It is widely used in many areas of the physical and social sciences.
Let A₁, A₂, ..., Aₙ be a collection of mutually exclusive and exhaustive events with P(Aᵢ) > 0 for all i. Then for any other event B with P(B) > 0 we have:
  P(Aⱼ|B) = P(B|Aⱼ) P(Aⱼ) / Σᵢ P(B|Aᵢ) P(Aᵢ)
We call:
  P(Aⱼ) the a priori probability of Aⱼ occurring
  P(Aⱼ|B) the posterior probability that Aⱼ will occur given that B has occurred (e.g. the probability it is a weekday given that it is Thursday)
  P(B|Aⱼ) the likelihood (e.g. the probability it is Thursday given that it is a weekday)

Independence has a special meaning in probability: events A and B are said to be independent if P(A|B) = P(A). Using the definition of conditional probability, A and B are independent iff P(A ∩ B) = P(A)P(B).

Independence Example

Let's consider a situation where we are trying to determine the number of events (N) we have in our data sample. This is a "classic" problem that comes up in many different situations:
- ancient times: scanning emulsion or bubble/spark chamber photos
- modern times: using computer algorithms to find rare events (top, GZK, B decays, etc.)
Consider the case where "scan 1" finds N₁ events, "scan 2" finds N₂ events, and the number of events found by both scans is N₁₂. If the scans are independent then P(1 ∩ 2) = P(1)P(2). What can we say about the total number of events, N? (See the sketch below.)
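
The slide leaves the answer to be worked out; the standard argument (my addition, not spelled out in the transcript) is: with P(1) = N₁/N, P(2) = N₂/N and independence giving P(1 ∩ 2) = N₁₂/N = P(1)P(2), one finds N ≈ N₁N₂/N₁₂, with scan efficiencies ε₁ = N₁₂/N₂ and ε₂ = N₁₂/N₁. A minimal numerical check with hypothetical numbers:

    # Estimate the total number of events N from two independent scans.
    N1, N2, N12 = 900, 800, 720          # hypothetical scan results

    N_est = N1 * N2 / N12                # N ~ N1*N2/N12 = 1000
    eff1 = N12 / N2                      # efficiency of scan 1 ~ 0.9
    eff2 = N12 / N1                      # efficiency of scan 2 ~ 0.8
    print(N_est, eff1, eff2)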

Example of Bayes's Theorem

While Bayes's theorem is very useful in physics, perhaps the best illustration of its use is in medical statistics, especially drug testing. Assume a certain drug test:
- gives a positive result 97% of the time when the drug is present: P(positive test|drug present) = 0.97
- gives a positive result 0.4% of the time if the drug is not present ("false positive"): P(positive test|drug not present) = 0.004
Let's assume that the drug is present in 0.5% of the population (1 out of 200 people):
  P(drug present) = 0.005
  P(drug not present) = 1 − P(drug present) = 0.995
What is the probability that the drug is not present given that you have a positive test, P(drug not present|positive test)? Bayes's theorem gives:
  P(drug not present|positive test)
    = P(positive|not present) P(not present) / [P(positive|not present) P(not present) + P(positive|present) P(present)]
    = (0.004)(0.995) / [(0.004)(0.995) + (0.97)(0.005)] ≈ 0.45
Thus there is a ~45% chance that a person with a positive test is actually drug free! The real-life consequence of this large probability is that drug tests are often administered twice.
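
The same arithmetic as a reusable sketch (my addition):

    # Posterior probability of being drug free given a positive test (Bayes's theorem).
    p_pos_given_drug = 0.97
    p_pos_given_nodrug = 0.004
    p_drug = 0.005
    p_nodrug = 1.0 - p_drug

    p_pos = p_pos_given_drug * p_drug + p_pos_given_nodrug * p_nodrug   # total probability
    p_nodrug_given_pos = p_pos_given_nodrug * p_nodrug / p_pos
    print(round(p_nodrug_given_pos, 2))                                 # ~0.45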

Another Example of Bayes's Theorem

Assume we are trying to separate pions from kaons using (e.g.) a Cherenkov counter (CC):
- 99% of the time when a pion goes through our CC we get a signal: P(signal|pion) = 0.99
- 5.5% of the time when a kaon goes through our CC we get a signal: P(signal|kaon) = 0.055
Suppose we are interested in looking for a rare τ⁻ decay with a π⁻ in the final state. This τ-lepton decay has never been observed; it is an example of a second-class current (wrong "G-parity"). Theory says its branching fraction is BF ≈ 1.5×10⁻⁵ (predictions range at the 10⁻⁵ level). There is a similar decay, with a K⁻ in place of the π⁻, that has been measured with BF = 27×10⁻⁵.
Suppose we have an event where our CC gives us a signal. What is the probability that the event is the rare π⁻ mode? Bayes's theorem gives:
  P(π mode|signal) = P(signal|π)·BF(π mode) / [P(signal|π)·BF(π mode) + P(signal|K)·BF(K mode)]
Thus there is only a ~49% chance that an event with a CC signal is the rare π⁻ signal mode.

Bayes's Theorem Continued

How can we improve the signal to noise (S/N)? Suppose we add (or use) another independent particle detector, where:
- 90% of the time when a pion goes through it we get a signal
- 15% of the time when a kaon goes through it we get a signal
This is not such a great detector, but it really helps. Combining the two independent detectors:
  P(signal|pion) = (0.99)(0.90) = 0.89
  P(signal|kaon) = (0.055)(0.15) = 0.00825
Thus there is now a ~85% chance that an event with a combined signal is the rare π⁻ mode. Adding particle ID has improved our S/N:
  No particle ID:      S/N = 1.5/27 = 0.06
  One PID detector:    S/N = 0.49/0.51 = 0.96
  Two PID detectors:   S/N = 0.85/0.15 = 5.7
It is usually more practical to build two moderately good PID detectors than one super-high-quality PID detector. In BaBar we do PID with a Cherenkov counter and the drift chamber.
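
A small sketch (my addition) reproducing these posterior probabilities with Bayes's theorem, taking the branching fractions above at their central values; the results are very close to the ~49% and ~85% quoted in the slides.

    # Posterior probability that a selected event is the rare pi mode,
    # for one PID detector and for two independent PID detectors.
    def posterior(p_sig_pi, p_sig_k, bf_pi=1.5e-5, bf_k=27e-5):
        num = p_sig_pi * bf_pi
        return num / (num + p_sig_k * bf_k)

    print(posterior(0.99, 0.055))                    # one detector:  ~0.50
    print(posterior(0.99 * 0.90, 0.055 * 0.15))      # two detectors: ~0.86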

Accuracy and Precision

Accuracy: the accuracy of an experiment refers to how close the experimental measurement is to the true value of the quantity being measured.
Precision: this refers to how well the experimental result has been determined, without regard to the true value of the quantity being measured.
Just because an experiment is precise does not mean it is accurate!
Example: measurements of the neutron lifetime over the years. [The figure on this slide shows various measurements of the neutron lifetime over the years; the size of each error bar reflects the precision of the experiment.] There has been a steady increase in the precision of the neutron lifetime, but are any of these measurements accurate?

Measurement Errors (or Uncertainties)

We use results from probability and statistics to calculate how "good" a measurement is. The most common quality indicator is the relative precision:
  relative precision = [uncertainty of measurement] / measurement
Example: we measure a table to be 10 inches with an uncertainty of 1 inch; relative precision = 1/10 = 0.1, or 10% (percent relative precision).
The uncertainty in a measurement is usually the square root of the variance: σ = standard deviation. σ is usually calculated using the technique of "propagation of errors". However, this σ is not what most people think it is! We will discuss this in more detail soon.

Statistical and Systematic Errors
Results from experiments are often presented as N ± XX ± YY, where:
  N: value of the quantity measured (or determined) by the experiment
  XX: statistical error, usually assumed to come from a Gaussian distribution. With the assumption of Gaussian statistics we can say (calculate) something about how well our experiment agrees with other experiments and/or theories. Expect a ~68% chance that the true value is between N − XX and N + XX.
  YY: systematic error. Hard to estimate; the distribution of errors is usually not known.
Examples:
  mass of proton = ... ± ... GeV (only a statistical error given)
  mass of W boson = 80.8 ± 1.5 ± 2.4 GeV (both statistical and systematic errors given)

Measurement Errors (or Uncertainties), continued

What's the difference between statistical and systematic errors?
- Statistical errors are "random" in the sense that if we repeat the measurement enough times, XX → 0.
- Systematic errors do not → 0 with repetition. Examples of sources of systematic errors: a voltmeter not calibrated properly; a ruler not the length we think it is (the meter stick might really be < 1 meter!).
Because of systematic errors, an experimental result can be precise but not accurate!

How do we combine systematic and statistical errors to get one estimate of the precision? This can be a problem! Two choices:
  σ_tot = XX + YY              (add them linearly)
  σ_tot = (XX² + YY²)^(1/2)    (add them in quadrature)
We will discuss a detailed averaging procedure next week.

Some other ways of quoting experimental results:
  lower limit: "the mass of particle X is > 100 GeV"
  upper limit: "the mass of particle X is < 100 GeV"
  asymmetric errors: the mass of a particle quoted with different + and − uncertainties

Example: the error in the mean, σ_m. If we repeat a measurement n times and each measurement has uncertainty σ, then:
  σ_m = σ/√n
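
A quick simulation (my addition) illustrating the two points above: the statistical error on the mean shrinks like 1/√n, and independent errors combine in quadrature.

    # (1) error on the mean shrinks as sigma/sqrt(n); (2) quadrature combination.
    import numpy as np

    rng = np.random.default_rng(7)
    sigma = 1.0
    for n in (10, 100, 1000):
        means = rng.normal(0.0, sigma, size=(2000, n)).mean(axis=1)
        print(n, means.std(), sigma / np.sqrt(n))    # observed spread of the mean vs sigma/sqrt(n)

    stat, syst = 1.5, 2.4                            # e.g. the W-mass errors quoted above
    print(np.hypot(stat, syst))                      # quadrature sum ~ 2.83 GeV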