Bayesian Statistics at work: The Troublesome Extraction of the angle α


Bayesian Statistics at work: The Troublesome Extraction of the angle α
Stéphane T’JAMPENS, LAPP (CNRS/IN2P3 & Université de Savoie)
J. Charles, A. Höcker, H. Lacker, F.R. Le Diberder, S. T’Jampens, hep-ph/0607246

Bayesian Statistics in 1 slide

The Bayesian approach is based on the use of inverse probability (the “posterior”): probability about the model (degree of belief), given the data.

Bayes’ rule: P(model|data) ∝ Likelihood(data; model) × Prior(model)

Cox, Principles of Statistical Inference (2006): “it treats information derived from data (‘likelihood’) as on exactly equal footing with probabilities derived from vague and unspecified sources (‘prior’). The assumption that all aspects of uncertainties are directly comparable is often unacceptable.”

“Nothing guarantees that my uncertainty assessment is any good for you - I’m just expressing an opinion (degree of belief). To convince you that it’s a good uncertainty assessment, I need to show that the statistical model I created makes good predictions in situations where we know what the truth is, and the process of calibrating predictions against reality is inherently frequentist.” (e.g., MC simulations)
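Bayes’ rule can be sketched numerically in a few lines. The following toy example (not from the slides; the efficiency measurement and numbers are purely illustrative) evaluates posterior ∝ likelihood × prior on a grid for a binomial efficiency with a flat prior:

```python
import numpy as np

# Toy illustration of Bayes' rule: posterior ∝ likelihood × prior,
# evaluated on a grid for a hypothetical binomial efficiency measurement.
theta = np.linspace(0.0, 1.0, 1001)            # model parameter (efficiency)
k, n = 7, 20                                   # illustrative data: 7 successes in 20 trials

likelihood = theta**k * (1.0 - theta)**(n - k)   # binomial likelihood (up to a constant)
prior = np.ones_like(theta)                      # flat prior: a choice, not "ignorance"

posterior = likelihood * prior
posterior /= posterior.sum() * (theta[1] - theta[0])   # normalize (Riemann sum)

print(theta[np.argmax(posterior)])   # posterior mode = k/n = 0.35 with a flat prior
```

With a flat prior the posterior mode coincides with the maximum-likelihood estimate; changing the prior moves it, which is the point of the Cox quotes above.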

Uniform prior: model of ignorance?

A central problem: specifying a prior distribution for a parameter about which nothing is known → a flat prior.

Problems:
- Not re-parametrization invariant (metric dependent): uniform in θ is not uniform in z = cos θ.
- Favors large values too much: the prior probability for the range 0.1 to 1 is 10 times less than for 1 to 10.
- Flat priors in several dimensions may produce clearly unacceptable answers.

Cox, Principles of Statistical Inference (2006): In simple problems, appropriate* flat priors yield essentially the same answer as non-Bayesian sampling theory. However, in other situations, particularly those involving more than two parameters, ignorance priors lead to different and entirely unacceptable answers.

* uniform prior for a scalar location parameter, Jeffreys’ prior for a scalar scale parameter.
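The metric dependence is easy to demonstrate with a short Monte Carlo (a sketch, assuming NumPy; the bin choices are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# A flat prior is metric dependent: uniform in theta is NOT uniform in z = cos(theta).
theta = rng.uniform(0.0, np.pi, size=200_000)   # "ignorance" prior on the angle
z = np.cos(theta)                               # the same ignorance, in another metric

# Equal-width z bins are not equally populated: z piles up near the endpoints.
center = np.mean(np.abs(z) < 0.1)        # fraction in -0.1 < z < 0.1
edge = np.mean((z > 0.8) & (z <= 1.0))   # fraction in 0.8 < z <= 1.0

print(center, edge)   # the edge bin is roughly 3x more populated than the central one
```

So a "know nothing" statement about θ is an informative statement about cos θ, and vice versa.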

Uniform Prior in Multidimensional Parameter Space

Hypersphere in a 6D space: one knows nothing about the individual Cartesian coordinates x, y, z, … What do we know about the radius r = √(x² + y² + …)? One has achieved the remarkable feat of learning something about the radius of the hypersphere, whereas one knew nothing about the Cartesian coordinates, and without making any experiment.
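This can be checked with a short simulation (a sketch, assuming NumPy and taking the cube [-1, 1]⁶ as the "ignorance" box for the six coordinates):

```python
import numpy as np

rng = np.random.default_rng(1)

# "Know nothing" (uniform) priors on six Cartesian coordinates in [-1, 1]
# induce a sharply peaked prior on the radius r = sqrt(x1^2 + ... + x6^2).
x = rng.uniform(-1.0, 1.0, size=(500_000, 6))
r = np.sqrt((x**2).sum(axis=1))

print(r.mean(), r.std())   # r concentrates near sqrt(2), well away from 0
print(np.mean(r < 0.5))    # almost no prior mass at small radius
```

Each coordinate is individually flat, yet the induced prior on r is strongly peaked: the flat multidimensional prior is highly informative about the radius.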

Isospin Analysis: B→hh
J. Charles et al., hep-ph/0607246; Gronau/London (1990)
MA: Modulus & Argument parametrization; RI: Real & Imaginary parametrization. [Figure: improper posterior.]

Isospin Analysis: removing information from B⁰→π⁰π⁰

No model-independent constraint on α can be inferred in this case. The information extracted on α is introduced by the priors (where else could it come from?).

Conclusion

PHYSTAT Conferences: http://www.phystat.org

- Statistics is not a science, it is mathematics (Nature will not decide for us). You will not learn it in physics books: go to the professional literature!
- Many attempts have been made to define an “ignorance” prior that “lets the data speak for themselves,” but none is convincing. Priors are informative.
- Quite generally, a prior that gives reasonable results for a single parameter will have unappealing features if applied independently to many parameters. In a multiparameter space, credible Bayesian intervals generally under-cover.
- If the problem has some invariance properties, then the prior should have the corresponding structure. Even so, the specification of priors is fraught with pitfalls (especially in high dimensions).
- Examine the consequences of your assumptions (metric, priors, etc.) and check for robustness: vary your assumptions.
- Exploring the frequentist properties of the result should be strongly encouraged.
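The last recommendation can be acted on directly. A toy coverage check (a sketch, assuming NumPy/SciPy; the true value, sample size, and interval level are illustrative, not from the talk) for the flat-prior credible interval of a binomial efficiency:

```python
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(2)

# Frequentist check of a Bayesian result: coverage of the 68% central
# credible interval (flat prior) for a binomial efficiency, at one true value.
theta_true, n, n_toys = 0.2, 10, 20_000
k = rng.binomial(n, theta_true, size=n_toys)    # toy experiments

# Flat prior + binomial likelihood -> Beta(k + 1, n - k + 1) posterior.
lo = beta.ppf(0.16, k + 1, n - k + 1)
hi = beta.ppf(0.84, k + 1, n - k + 1)

coverage = np.mean((lo <= theta_true) & (theta_true <= hi))
print(coverage)   # compare with the nominal 68%
```

Repeating the scan over true values (and over dimensions, for multiparameter problems) is exactly the kind of calibration against reality the talk advocates.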

BACKUP SLIDES

P(model|data) Likelihood(data,model)  Prior(model) Digression: Statistics D.R. Cox, Principles of Statistical Inference, CUP (2006) W.T. Eadie et al., Statistical Methods in Experimental Physics, NHP (1971) www.phystat.org Statistics tries answering a wide variety of questions  two main different! frameworks: Frequentist: probability about the data (randomness of measurements), given the model P(data|model) Hypothesis testing: given a model, assess the consistency of the data with a particular parameter value  1-CL curve (by varying the parameter value) [only repeatable events (Sampling Theory)] Bayesian: probability about the model (degree of belief), given the data P(model|data) Likelihood(data,model)  Prior(model)

D.R. Cox – PHYSTAT 05

Subjective/Objective

Cox/Hinkley, Theoretical Statistics: “It is important not to be misled by somewhat emotive words like subjective and objective. There are appreciable personal elements entering into all phases of scientific investigations. So far as statistical analysis is concerned, formulation of a model and of a question for analysis are two crucial elements in which judgment, personal experience, etc., play an important role. Yet they are open to rational discussion. [...] Given the model and a question about it, it is, however, reasonable to expect an objective assessment of the contribution of the data, and to a large extent this is provided by the frequentist approach.”