Statistical Methods in Particle Physics, Day 1: Introduction
Center for High Energy Physics, Tsinghua University, 12-16 April 2010
Glen Cowan, Physics Department, Royal Holloway, University of London

Outline of lectures
Day 1: Introduction. Review of probability and Monte Carlo; review of statistics: parameter estimation.
Day 2: Multivariate methods (I). Event selection as a statistical test; cut-based, linear discriminant, neural networks.
Day 3: Multivariate methods (II). More multivariate classifiers: BDT, SVM, ...
Day 4: Significance tests for discovery and limits, including systematics using the profile likelihood.
Day 5: Bayesian methods. Bayesian parameter estimation and model selection.

Day 1: outline
Probability and its role in data analysis: definition and interpretation of probability; Bayes' theorem.
Random variables and their properties; a catalogue of distributions.
The Monte Carlo method.
Parameter estimation: method of maximum likelihood; method of least squares.

Some statistics books, papers, etc.
G. Cowan, Statistical Data Analysis, Clarendon Press, Oxford, 1998.
R.J. Barlow, Statistics: A Guide to the Use of Statistical Methods in the Physical Sciences, Wiley, 1989; see also hepwww.ph.man.ac.uk/~roger/book.html
L. Lyons, Statistics for Nuclear and Particle Physics, CUP, 1986.
F. James, Statistical and Computational Methods in Experimental Physics, 2nd ed., World Scientific, 2006.
S. Brandt, Statistical and Computational Methods in Data Analysis, Springer, New York, 1998 (with program library on CD).
C. Amsler et al. (Particle Data Group), Review of Particle Physics, Physics Letters B667 (2008) 1; see also pdg.lbl.gov, sections on probability, statistics, and Monte Carlo.

Data analysis in particle physics
We observe events of a certain type and measure characteristics of each event (particle momenta, number of muons, energy of jets, ...). Theories (e.g. the Standard Model) predict the distributions of these properties up to free parameters, e.g., $\alpha$, $G_F$, $M_Z$, $\alpha_s$, $m_H$, ...
Some tasks of data analysis: estimate (measure) the parameters; quantify the uncertainty of the parameter estimates; test the extent to which the predictions of a theory are in agreement with the data (→ presence of New Physics?).

A definition of probability
Consider a set $S$ with subsets $A, B, \ldots$. Kolmogorov axioms (1933):
(1) $P(A) \ge 0$ for all $A \subset S$;
(2) $P(S) = 1$;
(3) if $A \cap B = \emptyset$, then $P(A \cup B) = P(A) + P(B)$.
Also define the conditional probability $P(A|B) = P(A \cap B)/P(B)$.

Interpretation of probability
I. Relative frequency: $A, B, \ldots$ are outcomes of a repeatable experiment (cf. quantum mechanics, particle scattering, radioactive decay, ...).
II. Subjective probability: $A, B, \ldots$ are hypotheses (statements that are true or false).
Both interpretations are consistent with the Kolmogorov axioms. In particle physics the frequency interpretation is often the most useful, but subjective probability can provide a more natural treatment of non-repeatable phenomena: systematic uncertainties, the probability that the Higgs boson exists, ...

Bayes' theorem
From the definition of conditional probability we have $P(A|B) = P(A \cap B)/P(B)$ and $P(B|A) = P(B \cap A)/P(A)$, but $P(A \cap B) = P(B \cap A)$, so
$P(A|B) = \frac{P(B|A)\,P(A)}{P(B)}$ (Bayes' theorem).
First published (posthumously) by the Reverend Thomas Bayes (1702-1761): An essay towards solving a problem in the doctrine of chances, Philos. Trans. R. Soc. 53 (1763) 370; reprinted in Biometrika 45 (1958) 293.

The law of total probability
Consider a subset $B$ of the sample space $S$, with $S$ divided into disjoint subsets $A_i$ such that $\cup_i A_i = S$. Then
$P(B) = \sum_i P(B \cap A_i) = \sum_i P(B|A_i)\,P(A_i)$ (law of total probability),
and Bayes' theorem becomes
$P(A|B) = \frac{P(B|A)\,P(A)}{\sum_i P(B|A_i)\,P(A_i)}$.
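Purely as an illustration of the two formulas above (the numbers are invented for this sketch, not taken from the lecture), one can combine the law of total probability and Bayes' theorem numerically; here "signal" and "background" play the role of the disjoint hypotheses $A_i$ and "pass" is the observed event $B$.

```python
# Hypothetical numbers, purely illustrative: a signal process occurs in 0.1% of
# events (prior), and a selection fires for 95% of signal and 5% of background.
p_signal = 0.001                      # P(signal)
p_pass_given_signal = 0.95            # P(pass | signal)
p_pass_given_background = 0.05        # P(pass | background)

# Law of total probability: P(pass)
p_pass = (p_pass_given_signal * p_signal
          + p_pass_given_background * (1.0 - p_signal))

# Bayes' theorem: P(signal | pass)
p_signal_given_pass = p_pass_given_signal * p_signal / p_pass
print(f"P(signal | pass) = {p_signal_given_pass:.4f}")   # roughly 0.019
```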

Random variables and probability density functions
A random variable is a numerical characteristic assigned to an element of the sample space; it can be discrete or continuous.
Suppose the outcome of an experiment is a continuous value $x$. Then $f(x)$ is the probability density function (pdf), with $P(x \text{ found in } [x, x+dx]) = f(x)\,dx$ and normalization $\int_{-\infty}^{\infty} f(x)\,dx = 1$ ("x must be somewhere").
For a discrete outcome $x_i$ with, e.g., $i = 1, 2, \ldots$, we have the probability mass function $P(x_i)$, with $\sum_i P(x_i) = 1$ ("x must take on one of its possible values").

Cumulative distribution function
The probability to have an outcome less than or equal to $x$ is the cumulative distribution function
$F(x) = \int_{-\infty}^{x} f(x')\,dx'$.
Alternatively one can define the pdf from the cdf via $f(x) = \partial F(x)/\partial x$.

Other types of probability densities
The outcome of an experiment may be characterized by several values, e.g. an $n$-component vector $(x_1, \ldots, x_n)$ → joint pdf $f(x_1, \ldots, x_n)$.
Sometimes we want only the pdf of some (or one) of the components → marginal pdf, e.g. for two components $f_1(x_1) = \int f(x_1, x_2)\,dx_2$.
Sometimes we want to consider some components as constant → conditional pdf, e.g. $f(x_2|x_1) = f(x_1, x_2)/f_1(x_1)$.
$x_1$ and $x_2$ are independent if $f(x_1, x_2) = f_1(x_1)\,f_2(x_2)$.

Expectation values
Consider a continuous r.v. $x$ with pdf $f(x)$. Define the expectation (mean) value as $E[x] = \int x\,f(x)\,dx$. Notation (often): $E[x] = \mu$; roughly the "centre of gravity" of the pdf.
For a function $y(x)$ with pdf $g(y)$: $E[y] = \int y\,g(y)\,dy = \int y(x)\,f(x)\,dx$ (equivalent).
Variance: $V[x] = E[(x-\mu)^2] = E[x^2] - \mu^2$. Notation: $V[x] = \sigma^2$.
Standard deviation: $\sigma = \sqrt{\sigma^2}$; roughly the width of the pdf, same units as $x$.

Covariance and correlation
Define the covariance $\mathrm{cov}[x,y]$ (also written in matrix notation as $V_{xy}$) as
$\mathrm{cov}[x,y] = E[(x-\mu_x)(y-\mu_y)] = E[xy] - \mu_x \mu_y$.
The correlation coefficient (dimensionless) is defined as $\rho_{xy} = \mathrm{cov}[x,y]/(\sigma_x \sigma_y)$.
If $x$ and $y$ are independent, i.e. $f(x,y) = f_x(x)\,f_y(y)$, then $E[xy] = \mu_x \mu_y$ and $\mathrm{cov}[x,y] = 0$, so $x$ and $y$ are 'uncorrelated'. N.B. the converse is not always true.

Correlation (cont.)
[Figure slide: scatter plots illustrating samples with different correlation coefficients.]

Some distributions
Distribution / pdf        Example use in HEP
Binomial                  Branching ratio
Multinomial               Histogram with fixed N
Poisson                   Number of events found
Uniform                   Monte Carlo method
Exponential               Decay time
Gaussian                  Measurement error
Chi-square                Goodness-of-fit
Cauchy (Breit-Wigner)     Mass of resonance
Landau                    Ionization energy loss
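As a practical aside (not part of the lecture), all of the standard distributions in this table except the Landau are available in scipy.stats, which is convenient for quick numerical checks of the formulas on the following slides.

```python
# A minimal sketch: evaluating a few of the catalogued pdfs/pmfs with scipy.stats.
from scipy import stats

print(stats.binom.pmf(3, n=10, p=0.2))          # binomial P(n=3; N=10, p=0.2)
print(stats.poisson.pmf(5, mu=4.2))             # Poisson P(n=5; nu=4.2)
print(stats.expon.pdf(1.0, scale=2.0))          # exponential f(x=1; xi=2)
print(stats.norm.pdf(0.5, loc=0.0, scale=1.0))  # standard Gaussian at x=0.5
print(stats.chi2.pdf(3.0, df=5))                # chi-square f(z=3; n=5)
print(stats.cauchy.pdf(0.0, loc=0.0, scale=1.0))# Cauchy pdf at x=0
```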

Binomial distribution
Consider $N$ independent experiments (Bernoulli trials): the outcome of each is 'success' or 'failure', and the probability of success on any given trial is $p$. Define the discrete r.v. $n$ = number of successes ($0 \le n \le N$).
The probability of a specific outcome (in order), e.g. 'ssfsf', is $p^n (1-p)^{N-n}$. But the order is not important; there are $\binom{N}{n} = \frac{N!}{n!(N-n)!}$ ways (permutations) to get $n$ successes in $N$ trials, and the total probability for $n$ is the sum of the probabilities for each permutation.

Binomial distribution (2)
The binomial distribution is therefore
$f(n; N, p) = \frac{N!}{n!(N-n)!}\,p^n (1-p)^{N-n}$,
with random variable $n$ and parameters $N$, $p$. For the expectation value and variance we find $E[n] = Np$ and $V[n] = Np(1-p)$.

Binomial distribution (3)
[Figure: binomial distribution for several values of the parameters N and p.]
Example: observe $N$ decays of $W^\pm$; the number $n$ of them that are $W \to \mu\nu$ is a binomial r.v., with $p$ = branching ratio.

Multinomial distribution
Like the binomial, but now with $m$ outcomes instead of two; the probabilities are $\vec{p} = (p_1, \ldots, p_m)$ with $\sum_{i=1}^{m} p_i = 1$.
For $N$ trials we want the probability to obtain $n_1$ of outcome 1, $n_2$ of outcome 2, ..., $n_m$ of outcome $m$. This is the multinomial distribution for $\vec{n} = (n_1, \ldots, n_m)$:
$f(\vec{n}; N, \vec{p}) = \frac{N!}{n_1! n_2! \cdots n_m!}\,p_1^{n_1} p_2^{n_2} \cdots p_m^{n_m}$.

Multinomial distribution (2)
Now consider outcome $i$ as 'success' and all others as 'failure' → each $n_i$ is individually binomial with parameters $N$, $p_i$, so $E[n_i] = N p_i$ and $V[n_i] = N p_i (1 - p_i)$ for all $i$.
One can also find the covariance to be $V_{ij} = \mathrm{cov}[n_i, n_j] = -N p_i p_j$ for $i \ne j$.
Example: $\vec{n} = (n_1, \ldots, n_m)$ represents a histogram with $m$ bins and $N$ total entries, all entries independent.

Poisson distribution
Consider the binomial $n$ in the limit $N \to \infty$, $p \to 0$, with $E[n] = Np \to \nu$ held constant. Then $n$ follows the Poisson distribution:
$f(n; \nu) = \frac{\nu^n}{n!} e^{-\nu}$, $n = 0, 1, 2, \ldots$, with $E[n] = \nu$ and $V[n] = \nu$.
Example: the number of scattering events $n$ with cross section $\sigma$ found for a fixed integrated luminosity, with $\nu = \sigma \int L\,dt$.

Uniform distribution
Consider a continuous r.v. $x$ with $-\infty < x < \infty$. The uniform pdf is
$f(x; \alpha, \beta) = \frac{1}{\beta - \alpha}$ for $\alpha \le x \le \beta$ (and 0 otherwise),
with $E[x] = \frac{1}{2}(\alpha + \beta)$ and $V[x] = \frac{1}{12}(\beta - \alpha)^2$.
N.B. for any r.v. $x$ with cumulative distribution $F(x)$, $y = F(x)$ is uniform in [0, 1].
Example: for $\pi^0 \to \gamma\gamma$, $E_\gamma$ is uniform in $[E_{\min}, E_{\max}]$, with $E_{\min} = \frac{1}{2}E_\pi(1-\beta)$ and $E_{\max} = \frac{1}{2}E_\pi(1+\beta)$.

Exponential distribution
The exponential pdf for the continuous r.v. $x$ is defined by
$f(x; \xi) = \frac{1}{\xi} e^{-x/\xi}$ for $x \ge 0$, with $E[x] = \xi$ and $V[x] = \xi^2$.
Example: the proper decay time $t$ of an unstable particle, $f(t; \tau) = \frac{1}{\tau} e^{-t/\tau}$ ($\tau$ = mean lifetime).
Lack of memory (unique to the exponential): $f(t - t_0 \mid t \ge t_0) = f(t)$.

Gaussian distribution
The Gaussian (normal) pdf for a continuous r.v. $x$ is defined by
$f(x; \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$, with $E[x] = \mu$ and $V[x] = \sigma^2$.
Special case $\mu = 0$, $\sigma^2 = 1$ ('standard Gaussian'): $\varphi(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}$.
(N.B. $\mu$ and $\sigma^2$ often denote the mean and variance of any r.v., not only a Gaussian.)
If $y$ is Gaussian with $\mu$, $\sigma^2$, then $x = (y - \mu)/\sigma$ follows $\varphi(x)$.

Gaussian pdf and the Central Limit Theorem
The Gaussian pdf is so useful because almost any random variable that is the sum of a large number of small contributions follows it. This follows from the Central Limit Theorem: for $n$ independent r.v.s $x_i$ with finite variances $\sigma_i^2$ but otherwise arbitrary pdfs, consider the sum $y = \sum_{i=1}^{n} x_i$. In the limit $n \to \infty$, $y$ is a Gaussian r.v. with $E[y] = \sum_i \mu_i$ and $V[y] = \sum_i \sigma_i^2$.
Measurement errors are often the sum of many contributions, so frequently measured values can be treated as Gaussian r.v.s.
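A small numerical illustration (not from the lecture) makes the CLT statement concrete: sums of uniform random variables quickly acquire a mean and variance equal to the sums of the individual means and variances, and a histogram of the sums looks Gaussian.

```python
# CLT demo: sum 12 uniform [0,1] variables; each has mu_i = 0.5, sigma_i^2 = 1/12.
import numpy as np

rng = np.random.default_rng(1)
n_terms, n_samples = 12, 100_000

y = rng.uniform(0.0, 1.0, size=(n_samples, n_terms)).sum(axis=1)

print("mean of y:", y.mean(), "(CLT prediction:", n_terms * 0.5, ")")
print("var  of y:", y.var(),  "(CLT prediction:", n_terms / 12.0, ")")
```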

Central Limit Theorem (2)
The CLT can be proved using characteristic functions (Fourier transforms); see, e.g., SDA Chapter 10.
Good example: the velocity component $v_x$ of air molecules.
OK example: the total deflection due to multiple Coulomb scattering (rare large-angle deflections give a non-Gaussian tail).
Bad example: the energy loss of a charged particle traversing a thin gas layer (rare collisions make up a large fraction of the energy loss, cf. the Landau pdf).
For finite $n$, the theorem is approximately valid to the extent that the fluctuation of the sum is not dominated by one (or a few) terms. Beware of measurement errors with non-Gaussian tails.

Multivariate Gaussian distribution
The multivariate Gaussian pdf for the vector $\vec{x} = (x_1, \ldots, x_n)$ is
$f(\vec{x}; \vec{\mu}, V) = \frac{1}{(2\pi)^{n/2} |V|^{1/2}} \exp\!\left[-\tfrac{1}{2}(\vec{x}-\vec{\mu})^T V^{-1} (\vec{x}-\vec{\mu})\right]$,
where $\vec{x}$, $\vec{\mu}$ are column vectors, $\vec{x}^T$, $\vec{\mu}^T$ are transpose (row) vectors, $E[x_i] = \mu_i$ and $\mathrm{cov}[x_i, x_j] = V_{ij}$.
For $n = 2$ this is
$f(x_1, x_2) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}} \exp\!\left\{-\frac{1}{2(1-\rho^2)}\left[\frac{(x_1-\mu_1)^2}{\sigma_1^2} + \frac{(x_2-\mu_2)^2}{\sigma_2^2} - \frac{2\rho(x_1-\mu_1)(x_2-\mu_2)}{\sigma_1\sigma_2}\right]\right\}$,
where $\rho = \mathrm{cov}[x_1, x_2]/(\sigma_1\sigma_2)$ is the correlation coefficient.

Chi-square ($\chi^2$) distribution
The chi-square pdf for the continuous r.v. $z$ ($z \ge 0$) is defined by
$f(z; n) = \frac{1}{2^{n/2}\Gamma(n/2)}\,z^{n/2-1} e^{-z/2}$,
with $n = 1, 2, \ldots$ = number of 'degrees of freedom' (dof) and $E[z] = n$.
For independent Gaussian $x_i$, $i = 1, \ldots, n$, with means $\mu_i$ and variances $\sigma_i^2$, the quantity
$z = \sum_{i=1}^{n} \frac{(x_i - \mu_i)^2}{\sigma_i^2}$ follows the $\chi^2$ pdf with $n$ dof.
Example: goodness-of-fit test variable, especially in conjunction with the method of least squares.

Cauchy (Breit-Wigner) distribution
The Breit-Wigner pdf for the continuous r.v. $x$ is defined by
$f(x; \Gamma, x_0) = \frac{1}{\pi} \frac{\Gamma/2}{(\Gamma/2)^2 + (x - x_0)^2}$
($\Gamma = 2$, $x_0 = 0$ gives the Cauchy pdf). $E[x]$ is not well defined and $V[x] \to \infty$.
$x_0$ = mode (most probable value); $\Gamma$ = full width at half maximum.
Example: the mass of a resonance particle, e.g. $\rho$, $K^*$, $\phi^0$, ...; $\Gamma$ = decay rate (inverse of the mean lifetime).

Landau distribution
For a charged particle with $\beta = v/c$ traversing a layer of matter of thickness $d$, the energy loss $\Delta$ follows the Landau pdf $f(\Delta; \beta, d)$ (an integral representation; see the references below).
L. Landau, J. Phys. USSR 8 (1944) 201; see also W. Allison and J. Cobb, Ann. Rev. Nucl. Part. Sci. 30 (1980) 253.
[Figure: sketch of a particle with velocity $\beta$ traversing a layer of thickness $d$ and losing energy $\Delta$.]

Landau distribution (2)
Long 'Landau tail' → all moments are infinite.
The mode (most probable value) is sensitive to $\beta$ → useful for particle identification.

Beta distribution
Often used to represent the pdf of a continuous r.v. that is nonzero only between finite limits:
$f(x; \alpha, \beta) = \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}\,x^{\alpha-1}(1-x)^{\beta-1}$, $0 \le x \le 1$,
with $E[x] = \frac{\alpha}{\alpha+\beta}$ and $V[x] = \frac{\alpha\beta}{(\alpha+\beta)^2(\alpha+\beta+1)}$.

Gamma distribution
Often used to represent the pdf of a continuous r.v. that is nonzero only on $[0, \infty)$:
$f(x; \alpha, \beta) = \frac{1}{\Gamma(\alpha)\beta^{\alpha}}\,x^{\alpha-1} e^{-x/\beta}$, $x \ge 0$,
with $E[x] = \alpha\beta$ and $V[x] = \alpha\beta^2$.
Also, e.g., the sum of $n$ exponential r.v.s, or the time until the $n$th event in a Poisson process, follows a Gamma distribution.

Student's t distribution
$f(x; \nu) = \frac{\Gamma\!\left(\frac{\nu+1}{2}\right)}{\sqrt{\nu\pi}\,\Gamma\!\left(\frac{\nu}{2}\right)}\left(1 + \frac{x^2}{\nu}\right)^{-\frac{\nu+1}{2}}$,
where $\nu$ = number of degrees of freedom (not necessarily an integer).
$\nu = 1$ gives the Cauchy distribution; $\nu \to \infty$ gives the Gaussian.

Student's t distribution (2)
If $x$ is Gaussian with $\mu = 0$, $\sigma^2 = 1$, and $z$ follows $\chi^2$ with $n$ degrees of freedom, then $t = x/(z/n)^{1/2}$ follows Student's t with $\nu = n$.
This arises in problems where one forms the ratio of a sample mean to the sample standard deviation of Gaussian r.v.s.
The Student's t provides a bell-shaped pdf with adjustable tails, ranging from those of a Gaussian, which fall off very quickly ($\nu \to \infty$, but in fact already very Gauss-like for $\nu$ around two dozen), to the very long-tailed Cauchy ($\nu = 1$).
Developed in 1908 by William Gosset, who worked under the pseudonym "Student" for the Guinness Brewery.

The Monte Carlo method
What it is: a numerical technique for calculating probabilities and related quantities using sequences of random numbers.
The usual steps:
(1) Generate a sequence $r_1, r_2, \ldots, r_m$ uniform in [0, 1].
(2) Use this to produce another sequence $x_1, x_2, \ldots, x_n$ distributed according to some pdf $f(x)$ in which we are interested ($x$ can be a vector).
(3) Use the $x$ values to estimate some property of $f(x)$; e.g., the fraction of $x$ values with $a < x < b$ gives an estimate of $\int_a^b f(x)\,dx$.
→ MC calculation = integration (at least formally).
MC generated values = 'simulated data' → use for testing statistical procedures. (A sketch of step (3) is given below.)
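The following is a minimal sketch of step (3), with an assumed example (an exponential pdf and an arbitrary interval, not taken from the lecture): the fraction of simulated values falling in an interval estimates the integral of the pdf over that interval.

```python
# MC estimate of an integral via the fraction of simulated values in (a, b).
import numpy as np

rng = np.random.default_rng(42)
xi = 2.0                                     # mean of the exponential pdf
x = rng.exponential(scale=xi, size=100_000)  # step (2): simulated data

a, b = 1.0, 3.0
frac = np.mean((x > a) & (x < b))            # step (3): fraction in (a, b)
exact = np.exp(-a / xi) - np.exp(-b / xi)    # exact integral of f(x) over (a, b)
print(f"MC estimate = {frac:.4f}, exact = {exact:.4f}")
```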

Random number generators
Goal: generate uniformly distributed values in [0, 1]. Tossing a coin for, e.g., each bit of a 32-bit number is too tiring, so a 'random number generator' is a computer algorithm to generate $r_1, r_2, \ldots, r_n$.
Example: the multiplicative linear congruential generator (MLCG):
$n_{i+1} = (a\,n_i) \bmod m$,
where $n_i$ = integer, $a$ = multiplier, $m$ = modulus, $n_0$ = seed (initial value).
N.B. mod = modulus (remainder), e.g. 27 mod 5 = 2. This rule produces a sequence of numbers $n_0, n_1, \ldots$ (a sketch is given below).
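Here is a minimal sketch of an MLCG in code. The parameters used in the demonstration are the tiny textbook values from the next slide (a = 3, m = 7, n0 = 1); real generators use much larger moduli and carefully chosen multipliers.

```python
def mlcg(seed, a, m, count):
    """Yield `count` pseudo-random values r_i = n_i / m in (0, 1)."""
    n = seed
    for _ in range(count):
        n = (a * n) % m        # n_{i+1} = (a * n_i) mod m
        yield n / m

# Tiny demonstration generator: a = 3, m = 7, n0 = 1.
print(list(mlcg(seed=1, a=3, m=7, count=8)))
# The underlying integer sequence 3, 2, 6, 4, 5, 1 repeats with period 6 = m - 1.
```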

Random number generators (2)
The sequence is (unfortunately) periodic! Example (see Brandt Ch. 4): $a = 3$, $m = 7$, $n_0 = 1$ gives the sequence 3, 2, 6, 4, 5, 1, 3, 2, ... ← the sequence repeats after $m - 1 = 6$ values.
Choose $a$, $m$ to obtain a long period (maximum = $m - 1$); $m$ is usually close to the largest integer that can be represented in the computer. Only use a subset of a single period of the sequence.

Random number generators (3)
The values $r_i = n_i/m$ are in [0, 1], but are they 'random'? Choose $a$, $m$ so that the $r_i$ pass various tests of randomness: uniform distribution in [0, 1], all values independent (no correlations between pairs), etc. Suitable values of $a$ and $m$ are suggested by L'Ecuyer, Commun. ACM 31 (1988) 742.
Far better algorithms are available, e.g. TRandom3 (based on the Mersenne Twister), with period $2^{19937} - 1$.
See F. James, Comp. Phys. Comm. 60 (1990) 111; Brandt Ch. 4.

The transformation method
Given $r_1, r_2, \ldots, r_n$ uniform in [0, 1], find $x_1, x_2, \ldots, x_n$ that follow $f(x)$ by finding a suitable transformation $x(r)$.
Require $P(x \le x(r)) = P(r' \le r) = r$, i.e. $F(x(r)) = r$.
That is, set $F(x(r)) = \int_{-\infty}^{x(r)} f(x')\,dx' = r$ and solve for $x(r)$.

Example of the transformation method
Exponential pdf: $f(x; \xi) = \frac{1}{\xi} e^{-x/\xi}$ for $x \ge 0$.
Set $F(x) = \int_0^{x(r)} \frac{1}{\xi} e^{-x'/\xi}\,dx' = 1 - e^{-x(r)/\xi} = r$ and solve for $x(r)$:
→ $x(r) = -\xi \ln(1 - r)$. ($x(r) = -\xi \ln r$ works too, since $1 - r$ is also uniform in [0, 1].) A code sketch follows below.
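The sketch below implements exactly this transformation for an assumed value of the parameter $\xi$, and checks the sample mean and variance against the expected $\xi$ and $\xi^2$.

```python
# Transformation (inverse-cdf) method for the exponential pdf.
import numpy as np

rng = np.random.default_rng(0)
xi = 2.0                                  # assumed mean of the exponential pdf
r = rng.uniform(0.0, 1.0, size=100_000)   # step (1): uniform values

x = -xi * np.log(1.0 - r)                 # x(r) = -xi * ln(1 - r)

print("sample mean:", x.mean(), "(expect ~", xi, ")")
print("sample var :", x.var(),  "(expect ~", xi**2, ")")
```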

The acceptance-rejection method
Enclose the pdf in a box: $x_{\min} \le x \le x_{\max}$, $0 \le f(x) \le f_{\max}$.
(1) Generate a random number $x$ uniform in $[x_{\min}, x_{\max}]$, i.e. $x = x_{\min} + r_1(x_{\max} - x_{\min})$, where $r_1$ is uniform in [0, 1].
(2) Generate a second, independent random number $u$ uniformly distributed between 0 and $f_{\max}$, i.e. $u = r_2 f_{\max}$.
(3) If $u < f(x)$, accept $x$; if not, reject $x$ and repeat.
(A code sketch is given below.)
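A minimal sketch of the three steps above; the particular pdf used here, $f(x) = \tfrac{3}{8}(1 + x^2)$ on $[-1, 1]$, is an assumed example chosen only because it is bounded on a finite interval.

```python
import numpy as np

def accept_reject(f, x_min, x_max, f_max, n, rng):
    """Return n values distributed according to f, using accept-reject."""
    out = []
    while len(out) < n:
        x = x_min + rng.uniform() * (x_max - x_min)  # step (1)
        u = rng.uniform() * f_max                    # step (2)
        if u < f(x):                                 # step (3): accept or repeat
            out.append(x)
    return np.array(out)

# Example pdf: f(x) = (3/8)(1 + x^2) on [-1, 1], normalized, with f_max = 3/4.
f = lambda x: 0.375 * (1.0 + x * x)
rng = np.random.default_rng(7)
sample = accept_reject(f, -1.0, 1.0, 0.75, 10_000, rng)
print("sample mean (expect ~0 by symmetry):", sample.mean())
```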

Example with the acceptance-rejection method
[Figure: points (x, u) scattered uniformly in the enclosing box.] If the dot falls below the curve, use the x value in the histogram.

Parameter estimation
The parameters of a pdf are constants that characterize its shape, e.g. $f(x; \theta)$, where $x$ is the r.v. and $\theta$ is the parameter.
Suppose we have a sample of observed values: $\vec{x} = (x_1, \ldots, x_n)$.
We want to find some function of the data to estimate the parameter(s): $\hat{\theta}(\vec{x})$ ← the estimator is written with a hat.
Sometimes we say 'estimator' for the function of $x_1, \ldots, x_n$, and 'estimate' for the value of the estimator obtained with a particular data set.

Properties of estimators
If we were to repeat the entire measurement, the estimates from each repetition would follow a pdf.
[Figure: sampling pdfs of three estimators, labelled 'best', 'large variance', and 'biased'.]
We want small (or zero) bias (systematic error): the average of repeated measurements should tend to the true value. And we want a small variance (statistical error). Small bias and small variance are in general conflicting criteria.

An estimator for the mean (expectation value)
Parameter: $\mu = E[x]$.
Estimator: $\hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} x_i$ (the 'sample mean').
We find: $E[\hat{\mu}] = \mu$ (unbiased) and $V[\hat{\mu}] = \sigma^2/n$.

An estimator for the variance
Parameter: $\sigma^2 = V[x]$.
Estimator: $\widehat{\sigma^2} = s^2 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i - \hat{\mu})^2$ (the 'sample variance'; the factor of $n-1$ makes it unbiased).
We find: $E[s^2] = \sigma^2$ and $V[s^2] = \frac{1}{n}\left(\mu_4 - \frac{n-3}{n-1}\sigma^4\right)$, where $\mu_4 = E[(x-\mu)^4]$ is the 4th central moment.
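A quick check (not from the lecture, using assumed Gaussian data) that the sample mean and the $1/(n-1)$ sample variance are unbiased: average them over many repeated pseudo-experiments.

```python
import numpy as np

rng = np.random.default_rng(3)
mu_true, sigma_true, n = 5.0, 2.0, 10

means, variances = [], []
for _ in range(20_000):
    x = rng.normal(mu_true, sigma_true, size=n)
    means.append(x.mean())
    variances.append(x.var(ddof=1))    # ddof=1 gives the 1/(n-1) estimator s^2

print("average of mu-hat:", np.mean(means), "(true mu =", mu_true, ")")
print("average of s^2   :", np.mean(variances), "(true sigma^2 =", sigma_true**2, ")")
```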

The likelihood function
Suppose the outcome of an experiment is $x_1, \ldots, x_n$, which is modeled as a sample from a joint pdf with parameter(s) $\theta$: $f(x_1, \ldots, x_n; \theta)$.
Now evaluate this with the data sample obtained and regard it as a function of the parameter(s). This is the likelihood function:
$L(\theta) = f(x_1, \ldots, x_n; \theta)$ (with the $x_i$ constant).
If the $x_i$ are independent observations of $x \sim f(x; \theta)$, then $L(\theta) = \prod_{i=1}^{n} f(x_i; \theta)$.

Maximum likelihood estimators
If the hypothesized $\theta$ is close to the true value, then we expect a high probability of getting data like that which we actually found. So we define the maximum likelihood (ML) estimator(s) to be the parameter value(s) for which the likelihood is maximum.
ML estimators are not guaranteed to have any 'optimal' properties (but in practice they are very good).

ML example: parameter of an exponential pdf
Consider the exponential pdf $f(x; \xi) = \frac{1}{\xi} e^{-x/\xi}$, and suppose we have independent data $x_1, \ldots, x_n$.
The likelihood function is $L(\xi) = \prod_{i=1}^{n} \frac{1}{\xi} e^{-x_i/\xi}$.
The value of $\xi$ for which $L(\xi)$ is maximum also gives the maximum value of its logarithm (the log-likelihood function):
$\ln L(\xi) = \sum_{i=1}^{n}\left(-\ln\xi - \frac{x_i}{\xi}\right)$.

ML example: parameter of an exponential pdf (2)
Find the maximum by setting $\frac{\partial \ln L(\xi)}{\partial \xi} = 0$, which gives $\hat{\xi} = \frac{1}{n}\sum_{i=1}^{n} x_i$, i.e. the sample mean.
Monte Carlo test: generate 50 values using $\xi = 1$ and compute the ML estimate $\hat{\xi}$ from their sample mean. (A code sketch of the same exercise appears below.)
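The sketch below repeats this Monte Carlo test; the seed is arbitrary, so the numerical value of the estimate will differ from the one shown on the original slide.

```python
# Monte Carlo test of the ML estimator for the exponential parameter.
import numpy as np

rng = np.random.default_rng(2024)
xi_true, n = 1.0, 50

x = rng.exponential(scale=xi_true, size=n)   # 50 values generated with xi = 1
xi_hat = x.mean()                            # ML estimator = sample mean
print(f"ML estimate xi-hat = {xi_hat:.3f}")
```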

Variance of estimators: Monte Carlo method
Having estimated our parameter, we now need to report its 'statistical error', i.e., how widely distributed the estimates would be if we were to repeat the entire measurement many times.
One way to do this is to simulate the entire experiment many times with a Monte Carlo program (using the ML estimate as the true parameter value in the MC). For the exponential example, the sample standard deviation of the resulting estimates gives the statistical error on $\hat{\xi}$.
Note that the distribution of the estimates is roughly Gaussian; this is (almost) always true for ML in the large-sample limit. (A code sketch follows below.)
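A minimal sketch of the "repeat the experiment many times" idea, with assumed numbers of pseudo-experiments and events; for the exponential case the result can also be compared with the analytic expectation $\xi/\sqrt{n}$.

```python
import numpy as np

rng = np.random.default_rng(5)
xi_hat, n = 1.0, 50              # plug in the ML estimate as the MC truth

estimates = np.array([
    rng.exponential(scale=xi_hat, size=n).mean()   # xi-hat of one pseudo-experiment
    for _ in range(10_000)
])

print("std. dev. of estimates        :", estimates.std(ddof=1))
print("analytic expectation xi/sqrt(n):", xi_hat / np.sqrt(n))
```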

Variance of estimators from the information inequality
The information inequality (RCF) sets a lower bound on the variance of any estimator (not only ML):
$V[\hat{\theta}] \ge \frac{\left(1 + \frac{\partial b}{\partial\theta}\right)^2}{E\!\left[-\frac{\partial^2 \ln L}{\partial\theta^2}\right]}$, where $b = E[\hat{\theta}] - \theta$ is the bias.
Often the bias $b$ is small, and equality either holds exactly or is a good approximation (e.g. in the large data sample limit). Then
$V[\hat{\theta}] \approx \frac{1}{E\!\left[-\frac{\partial^2 \ln L}{\partial\theta^2}\right]}$.
Estimate this using the 2nd derivative of $\ln L$ at its maximum:
$\widehat{V[\hat{\theta}]} = \left(-\frac{\partial^2 \ln L}{\partial\theta^2}\right)^{-1}\Bigg|_{\theta=\hat{\theta}}$.

Variance of estimators: graphical method
Expand $\ln L(\theta)$ about its maximum:
$\ln L(\theta) = \ln L(\hat{\theta}) + \left[\frac{\partial\ln L}{\partial\theta}\right]_{\theta=\hat{\theta}}(\theta-\hat{\theta}) + \frac{1}{2}\left[\frac{\partial^2\ln L}{\partial\theta^2}\right]_{\theta=\hat{\theta}}(\theta-\hat{\theta})^2 + \ldots$
The first term is $\ln L_{\max}$, the second term is zero, and for the third term we use the information inequality (assuming equality):
$\ln L(\theta) \approx \ln L_{\max} - \frac{(\theta-\hat{\theta})^2}{2\hat{\sigma}_{\hat{\theta}}^2}$, i.e. $\ln L(\hat{\theta} \pm \hat{\sigma}_{\hat{\theta}}) \approx \ln L_{\max} - \tfrac{1}{2}$.
→ To get $\hat{\sigma}_{\hat{\theta}}$, change $\theta$ away from $\hat{\theta}$ until $\ln L$ decreases by 1/2. (A code sketch follows below.)
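Here is a minimal sketch of the graphical ($\Delta\ln L = 1/2$) method applied to the exponential example; the data are generated here, so the numbers will not match the lecture's, and the brute-force scan is only for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)
x = rng.exponential(scale=1.0, size=50)

def lnL(xi):
    # ln L(xi) = sum_i ( -ln xi - x_i / xi )
    return np.sum(-np.log(xi) - x / xi)

xi_hat = x.mean()
lnL_max = lnL(xi_hat)

# Scan xi and find where ln L has dropped by 0.5 on each side of the maximum.
scan = np.linspace(0.5 * xi_hat, 2.0 * xi_hat, 20_001)
vals = np.array([lnL(xi) for xi in scan])
inside = scan[vals >= lnL_max - 0.5]
lo, hi = inside.min(), inside.max()

print(f"xi-hat = {xi_hat:.3f}, Delta lnL = 1/2 interval: [{lo:.3f}, {hi:.3f}]")
print(f"compare with xi-hat/sqrt(n) = {xi_hat/np.sqrt(50):.3f}")
```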

Example of variance by graphical method
ML example with the exponential pdf: the $\ln L(\xi)$ curve is not quite parabolic, since the sample size is finite ($n = 50$).

The method of least squares
Suppose we measure $N$ values $y_1, \ldots, y_N$, assumed to be independent Gaussian r.v.s with $E[y_i] = \lambda(x_i; \theta)$.
Assume known values of the control variable $x_1, \ldots, x_N$ and known variances $V[y_i] = \sigma_i^2$.
The likelihood function is
$L(\theta) = \prod_{i=1}^{N} \frac{1}{\sqrt{2\pi\sigma_i^2}} \exp\!\left[-\frac{(y_i - \lambda(x_i;\theta))^2}{2\sigma_i^2}\right]$.
We want to estimate $\theta$, i.e., fit the curve $\lambda(x; \theta)$ to the data points.

The method of least squares (2)
The log-likelihood function is therefore
$\ln L(\theta) = -\frac{1}{2}\sum_{i=1}^{N}\frac{(y_i - \lambda(x_i;\theta))^2}{\sigma_i^2} + \text{terms not depending on } \theta$.
So maximizing the likelihood is equivalent to minimizing
$\chi^2(\theta) = \sum_{i=1}^{N}\frac{(y_i - \lambda(x_i;\theta))^2}{\sigma_i^2}$.
The minimum of this quantity defines the least squares estimator $\hat{\theta}$. Often $\chi^2$ is minimized numerically (e.g. with the program MINUIT). (A code sketch appears below.)
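The sketch below performs a weighted least-squares (chi-square) fit of a straight line using scipy rather than MINUIT; the data points and uncertainties are invented for illustration only.

```python
import numpy as np
from scipy.optimize import curve_fit

def line(x, theta0, theta1):
    # lambda(x; theta) = theta0 + theta1 * x
    return theta0 + theta1 * x

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.7, 2.3, 3.1, 3.6, 4.4])     # measured values y_i (invented)
sigma = np.full_like(y, 0.2)                # known standard deviations sigma_i

theta_hat, cov = curve_fit(line, x, y, sigma=sigma, absolute_sigma=True)
chi2_min = np.sum(((y - line(x, *theta_hat)) / sigma) ** 2)

print("LS estimates :", theta_hat)
print("covariance   :", cov)
print("chi2 at min  :", chi2_min)
```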

Example of least squares fit
Fit a polynomial of order $p$: $\lambda(x; \theta_0, \ldots, \theta_p) = \sum_{k=0}^{p} \theta_k x^k$.
[Figure: data points with fitted polynomials of various orders.]

Variance of LS estimators
In most cases of interest we obtain the variance in a manner similar to ML. E.g. for Gaussian data we have $\chi^2(\theta) = -2\ln L(\theta) + \text{const}$, and so
$\widehat{V[\hat{\theta}]} = 2\left(\frac{\partial^2\chi^2}{\partial\theta^2}\right)^{-1}\Bigg|_{\theta=\hat{\theta}}$,
or, for the graphical method, we take the values of $\theta$ where $\chi^2(\theta) = \chi^2_{\min} + 1.0$.

Goodness-of-fit with least squares
The value of $\chi^2$ at its minimum is a measure of the level of agreement between the data and the fitted curve:
$\chi^2_{\min} = \sum_{i=1}^{N}\frac{(y_i - \lambda(x_i;\hat{\theta}))^2}{\sigma_i^2}$.
It can therefore be employed as a goodness-of-fit statistic to test the hypothesized functional form $\lambda(x; \theta)$.
One can show that if the hypothesis is correct, then the statistic $t = \chi^2_{\min}$ follows the chi-square pdf $f(t; n_d)$, where the number of degrees of freedom is
$n_d$ = number of data points minus number of fitted parameters.

Goodness-of-fit with least squares (2)
The chi-square pdf has an expectation value equal to the number of degrees of freedom, so if $\chi^2_{\min} \approx n_d$ the fit is 'good'.
More generally, find the p-value:
$p = \int_{\chi^2_{\min}}^{\infty} f(t; n_d)\,dt$.
This is the probability of obtaining a $\chi^2_{\min}$ as high as the one we got, or higher, if the hypothesis is correct.
E.g. for the previous example, compare the p-values obtained for the 1st-order polynomial (line) and for the 0th-order polynomial (horizontal line). (A code sketch of the p-value calculation is given below.)
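The p-value integral above is simply the chi-square survival function; the values of $\chi^2_{\min}$ and $n_d$ in the sketch below are placeholders, not the ones from the lecture's polynomial-fit example.

```python
from scipy.stats import chi2

chi2_min = 4.0    # hypothetical minimized chi-square value
n_d = 3           # hypothetical dof: number of data points - number of parameters

p_value = chi2.sf(chi2_min, df=n_d)   # sf = 1 - cdf = integral from chi2_min to infinity
print(f"p-value = {p_value:.3f}")
```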

Summary
We have quickly reviewed a large amount of material: probability; distributions and their properties; the Monte Carlo method; parameter estimation (ML, LS).
For a slower-paced treatment see, e.g., the slides from the University of London course.
Next: statistical tests and multivariate methods.