1 A Bayesian statistical method for particle identification in shower counters
IX International Workshop on Advanced Computing and Analysis Techniques in Physics Research, December 1-5, 2003
N. Takashimizu¹, A. Kimura², A. Shibata³ and T. Sasaki³
¹ Shimane University, ² Ritsumeikan University, ³ High Energy Accelerator Research Organization

2 Introduction
We made an attempt to identify particles using a Bayesian statistical method. Particle identification is possible by extracting the pattern of the shower, because the energy distribution differs with the incident particle type and energy. By using the Bayesian method in addition to existing particle identification methods, an improvement in experimental precision is expected.

3 Bayes' Theorem
Bayes' theorem is a simple formula which gives the probability of a hypothesis H from an observation A. The conditional probability of H given A is

P(H|A) = P(A|H) P(H) / P(A)

– P(A|H): the probability of A given H
– P(H): the prior probability of H, before the observation
– P(A): the overall probability of A, whether H is true or not

Bayes' theorem tells a learning system how to update its parameters after observing A.
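
As a minimal numeric sketch of this update (the probabilities below are invented for illustration, not taken from the slides):

```python
# Minimal numeric illustration of Bayes' theorem (invented numbers).
# H: the shower was caused by an electron.
# A: a large fraction of the energy is deposited in the first blocks.
p_H = 0.5              # prior P(H)
p_A_given_H = 0.8      # P(A|H)
p_A_given_notH = 0.2   # P(A|not H)

# Total probability of A, whether H is true or not.
p_A = p_A_given_H * p_H + p_A_given_notH * (1 - p_H)

# Bayes' theorem: updated belief in H after observing A.
p_H_given_A = p_A_given_H * p_H / p_A
print(p_H_given_A)  # 0.8
```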

4 Bayesian Estimation
Bayesian estimation is a statistical method based on Bayes' theorem.
– Unknown parameters are treated as random variables and given probability distributions, instead of being estimated as single values.
– Information about the parameters before any observation is represented as a prior distribution p(θ).
– The prior distribution is generally not sharp, because our knowledge about the parameters is insufficient before observation.

5 Bayesian Estimation
When we make observations, the posterior distribution is calculated from both the data-generation model and the prior distribution:

P(θ|x) ∝ m(x|θ) P(θ)

The predictive distribution of a future observation, based on the observed data x = (x_1, x_2, …, x_n), is the expectation of the model over the posterior distribution:

P(x_{n+1}|x) = ∫ m(x_{n+1}|θ) P(θ|x) dθ
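
A minimal sketch of this update for a single block, assuming a known spread σ and using a grid approximation for the posterior (both simplifications are mine, not from the slides):

```python
import numpy as np

# Grid-approximation sketch of the posterior and predictive for one block.
sigma = 0.1                                    # known spread of the deposit (GeV)
theta = np.linspace(0.0, 2.0, 401)             # candidate values of the mean
prior = np.full_like(theta, 1.0 / theta.size)  # uniform prior p(theta)

x = np.array([0.52, 0.48, 0.55])               # observed deposits x1..xn (toy data)

def gauss(v, mu, s):
    return np.exp(-0.5 * ((v - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

# Posterior P(theta|x) ~ prod_i m(x_i|theta) * p(theta), normalized on the grid.
likelihood = np.prod(gauss(x[:, None], theta[None, :], sigma), axis=0)
posterior = likelihood * prior
posterior /= posterior.sum()

# Predictive P(x_{n+1}|x): expectation of the model over the posterior.
x_next = 0.50
predictive = np.sum(gauss(x_next, theta, sigma) * posterior)
print(predictive)
```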

6 Applying to the Shower
We now apply Bayesian estimation to the electromagnetic shower:
– P(θ): prior distribution of the parameters
– m(x|θ): model of the energy deposit in the shower, characterized by mean μ and variance σ²
– P(θ|x): conditional (posterior) distribution of θ, given N observed events
– P(x_{n+1}|x): prediction of the next event

7 Shower Modeling
Divide the calorimeter into 16 blocks perpendicular to the incident direction. The model distribution of the electromagnetic shower is expressed in terms of the energy deposits in the blocks, e_1, e_2, …, e_Nb (Nb = 16).
[Figure: the blocks e_1 … e_Nb arranged along the z axis of the calorimeter]
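
A hedged sketch of how the block energies e_1 … e_Nb might be built from simulated hits, assuming the blocks are slices along the incident (z) axis; the hit format (z position in cm, deposited energy) is an assumption for illustration:

```python
import numpy as np

# Sum simulated energy deposits into Nb = 16 blocks along the z axis.
# `hits` is assumed to be an (n_hits, 2) array of (z position in cm, energy).
def shower_features(hits, depth_cm=20.0, n_blocks=16):
    z, e = hits[:, 0], hits[:, 1]
    idx = np.clip((z / depth_cm * n_blocks).astype(int), 0, n_blocks - 1)
    return np.bincount(idx, weights=e, minlength=n_blocks)  # e_1 ... e_Nb
```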

8 Model Distribution
If the shape of the shower follows a multivariate normal distribution N(θ, Σ), the model is

m(x|θ) = N(x; θ, Σ)

When the shower is caused by a particle α with incident energy E_0, the model above becomes

m(x|α, E_0) = N(x; θ_{α,E0}, Σ_{α,E0})

To simplify the calculation, we assume there is no correlation among the energy deposits in the blocks, so Σ is diagonal and the density factorizes into a product of univariate normals, one per block.
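
With the no-correlation assumption, the model density can be evaluated block by block; a minimal sketch, with μ and σ per block assumed given for a particular α and E_0:

```python
import numpy as np

# Model density for one shower x = (e_1, ..., e_Nb): with no correlation
# among blocks, the multivariate normal N(theta, Sigma) factorizes into a
# product of univariate normals, one per block.
def log_model(x, mu, sigma):
    """log m(x | theta) with theta = (mu, sigma), arrays of length Nb."""
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                  - (x - mu)**2 / (2 * sigma**2))
```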

9 Model Distribution
After N observations, the model becomes the joint probability density

m(x_1, …, x_N | θ) = ∏_{i=1}^{N} m(x_i | θ)

10 Posterior Distribution
We assume the prior distribution is uniform, P(θ) = const. The posterior distribution, when observing n showers caused by a particle α with incident energy E_0, is then given in terms of the model and the prior:

P(θ | x, α, E_0) ∝ P(θ) ∏_{i=1}^{n} m(x_i | θ)

11 Predictive Distribution
Finally, the next shower can be predicted on the condition that the n observed showers, the particle type, and the incident energy are known:

P(x_{n+1} | x, α, E_0) = ∫ m(x_{n+1} | θ) P(θ | x, α, E_0) dθ
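
A sketch of this predictive density per condition. For tractability it treats the per-block variance as known (estimated from the learning sample), in which case the uniform prior on the mean gives a Gaussian predictive with variance σ²(1 + 1/n) per block; the slides' exact treatment of the variance parameter may differ:

```python
import numpy as np

# Predictive density of the next shower given n learned showers, per block:
#   posterior:  mu | x     ~ N(xbar, sigma^2 / n)
#   predictive: x_{n+1}| x ~ N(xbar, sigma^2 * (1 + 1/n))
def log_predictive(x_next, train):
    """train: (n, Nb) array of learned showers; x_next: (Nb,) new shower."""
    n = train.shape[0]
    xbar = train.mean(axis=0)
    sigma2 = train.var(axis=0, ddof=1)   # per-block variance estimate (n >= 2)
    var = sigma2 * (1.0 + 1.0 / n)       # predictive variance per block
    return np.sum(-0.5 * np.log(2 * np.pi * var)
                  - (x_next - xbar)**2 / (2 * var))
```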

12 Particle Identification
Given a new shower, the conditional probability of its occurrence is obtained from the predictive distribution. Selecting the most probable condition, that is, the parameter set (α, E_0) that maximizes the predictive probability, enables particle identification.
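
Using the log_predictive sketch above, the identification step reduces to an argmax over the learned conditions (names and data layout are illustrative, not from the slides):

```python
# Score the new shower under every learned condition (alpha, E0) with the
# predictive density and pick the most probable one.
def identify(x_next, training_sets):
    """training_sets: dict mapping (particle, E0) -> (n, Nb) array."""
    scores = {cond: log_predictive(x_next, data)
              for cond, data in training_sets.items()}
    return max(scores, key=scores.get)

# e.g. identify(shower, {("e-", 0.5): e_data_05, ("pi-", 0.5): pi_data_05})
```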

13 Bayesian Learning for Simulation Data
Monte Carlo simulation (Geant4)
– Calorimeter configuration
  Material: lead glass, composed of Pb (66.0%), O (19.9%), Si (12.7%), K (0.8%), Na (0.4%), As (0.2%)
  Density: 5.2 g/cm³
  Size: 20 cm
  Structure: a total of 20 × 20 × 20 lead-glass cubes of 1 cm each

14 Bayesian Learning for Simulation Data
– Incident direction: (0, 0, 1)
– Incident position: (10, 10, 0)
– Data for learning: α = (e⁻, π⁻), E_0 = (0.5, 1.0, 2.0, 3.0) GeV

15 Energy Distribution
[Figure: energy deposit distributions of the simulated showers]

16 Result
[Table: identification results for each true condition, (e⁻, π⁻) at E_0 = 0.5, 1.0, 2.0 and 3.0 GeV, scored against the data used for learning in the same eight categories]

17 Summary
We attempted to identify particles by modeling the shower profile based on Bayesian statistics, and to develop the possibilities of the Bayesian approach. Without any other information, e.g. particle charges given by tracking detectors, we obtained a high percentage of correct identification for e⁻ and π⁻.
Future plan: improvement of the model and the prior distribution.