
1 Statistical Learning Basics
Jens Zimmermann (zimmerm@mppmu.mpg.de)
Max-Planck-Institut für Physik, München / Forschungszentrum Jülich GmbH
ACAT 2005, Zeuthen
Outline: XEUS and MAGIC; Example: γ-hadron separation; Basic concepts and notions; Classical methods; Statistical learning methods; Statistical learning theory; Conclusion

2 XEUS and MAGIC
XEUS: X-ray Evolving Universe Spectroscopy Mission; X-rays, 0.1 – 10 keV; to be launched into space ~2015.
MAGIC: Major Atmospheric Gamma Imaging Cherenkov Telescope; gamma-rays, 10 – 1000 GeV; built on La Palma in 2003.
Science targets: AGN, SNR, GRB, first galaxies, metal synthesis, IGM.
Detector signatures: charge distribution (XEUS); ellipse of Cherenkov photons (MAGIC).

3 Example: γ-hadron separation
The image orientation angle α discriminates: photons cluster at small α, hadrons are uniform in α.
"Hillas" parameters: length, width, size, ...
A photon excess appears at small α, quantified by the significance of the excess and the number of excess events (plots: before / after selection).

4 Example: γ-hadron separation
Choose (preprocessed) inputs: length, width, size, ...
Classification: photon vs. hadron.
Offline analysis: train with simulated photons; compare to the classical "supercuts" method.
Neural network based on linear separation (details to be discussed).

5 Training of Statistical Learning Methods
Statistical learning method: from N examples, infer a rule.
Important: generalisation vs. overtraining.
Is the data noise-free but separable only by a complicated boundary? Or easily separable but noisy?
Polynomial analogy: too high a degree results in interpolation of the noise, but too low a degree means a bad approximation.
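The degree trade-off above can be made concrete with a small numerical sketch: fit noisy samples of a smooth function with polynomials of increasing degree and compare the error on held-out points. The data here is synthetic, chosen only to illustrate the point, not taken from the talk.

```python
# Generalisation vs. overtraining, illustrated with polynomial fits:
# low degree underfits, very high degree interpolates the noise.
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0, 6, 20)
y_train = np.sin(x_train) + rng.normal(0, 0.2, x_train.size)  # noisy samples
x_val = np.linspace(0.1, 5.9, 50)
y_val = np.sin(x_val)  # noise-free target for validation

def val_error(degree):
    """Mean squared error on the validation points for a given polynomial degree."""
    coeffs = np.polyfit(x_train, y_train, degree)
    return float(np.mean((np.polyval(coeffs, x_val) - y_val) ** 2))

for d in (1, 5, 15):
    print(d, val_error(d))
```

Comparing the printed errors shows the pattern the slide describes: the degree-1 line approximates badly, a moderate degree generalises well, and a very high degree starts to follow the noise.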

6 Classification vs. Regression
Classification: pileup vs. single photon (XEUS photon recognition); gamma vs. muon vs. hadron event (MAGIC). The method output lies between 0 and 1, with photons and pileups populating opposite ends (training and validation histograms shown).
Regression: reconstruction of the incident position with subpixel resolution (XEUS, x [µm], residual Δ = x_out − x_true); reconstruction of the primary photon energy (MAGIC, E [GeV], residual Δ = E_out − E_true).

7 Inputs and Preprocessing
A reasonable selection of inputs steers the search in function space:
as many as necessary, as few as possible;
highest possible analysis level;
make use of symmetries (reflection, rotation).
Measure the importance of inputs via correlation and relevance. (Figure: pixel patterns A, B, C, D related by symmetry, for XEUS and MAGIC.)

8 Motivation and Training Data
Lack of time: "online" application, usually a trigger. Very fast hardware implementations of statistical learning methods exist (down to a few 100 ns). Training is done with the offline analysis. Example: the neural network trigger at the H1 experiment.

9 Motivation and Training Data
Lack of knowledge: "offline" application. Either there is no theoretical description of the data, the theoretical prediction does not match the data, or the theory is too complicated to construct an algorithm. Statistical learning methods can increase performance; prior knowledge is incorporated by preprocessing. Training uses Monte Carlo simulation (careful!) or a modified experiment (mesh experiment).

10 Classical Methods
Classification: "cuts". Two univariate cuts vs. one multivariate cut. XEUS: select patterns which can be generated by single photons.
Regression: "fit". MAGIC: estimate the energy of the primary photon by minimising the relative error.
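A minimal sketch of the classical "cuts" approach: two univariate cuts on Hillas-like variables select photon candidates. The variable names and threshold values here are illustrative assumptions, not MAGIC's actual supercuts.

```python
# Classical cut-based classification: an event is accepted as a photon
# candidate only if every univariate cut is satisfied.
def passes_cuts(length, width):
    """Hypothetical univariate cuts on two Hillas-like parameters."""
    return length < 0.30 and width < 0.12

events = [
    {"length": 0.21, "width": 0.09},   # photon-like event
    {"length": 0.45, "width": 0.20},   # hadron-like event
]
selected = [e for e in events if passes_cuts(e["length"], e["width"])]
print(len(selected))
```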

11 Statistical Learning Methods
Decision trees: C4.5, CAL5, CART
Local density estimators: naïve Bayes, "maximum likelihood", k-nearest neighbours
Linear separation: neural network, support vector machine, linear discriminant analysis
Meta-learning strategies: bagging, boosting, random subspace

12 Some Events
Toy example: conference talks characterised by two variables, # formulas and # slides (both axes running 0 to 60), each labelled as experimentalist or theorist. Data points (# formulas, # slides): (42, 21), (28, 8), (71, 19), (64, 31), (29, 36), (15, 34), (48, 44), (56, 51), (25, 55), (12, 16).

13 Decision Trees
The toy events are split by successive cuts:
all events: #formulas < 20 → exp; #formulas > 60 → th; rest (20 < #formulas < 60) → subset
subset: #slides > 40 → exp; #slides < 40 → th
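The tree on this slide can be written out directly as nested rules, with the thresholds taken from the slide:

```python
# The slide's decision tree as code: classify a talk as experimentalist
# ("exp") or theorist ("th") from its number of formulas and slides.
def classify(n_formulas, n_slides):
    if n_formulas < 20:
        return "exp"
    if n_formulas > 60:
        return "th"
    # 20 <= n_formulas <= 60: decide on the second variable
    return "exp" if n_slides > 40 else "th"

print(classify(10, 30), classify(70, 30), classify(40, 50), classify(40, 20))
# → exp th exp th
```

In a real decision-tree learner (C4.5, CART) these cut variables and thresholds are chosen automatically, e.g. by maximising an information gain or purity criterion at each node.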

14 Local Density Estimators
Search for similar events that are already classified and count the members of the two classes in the neighbourhood.
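A bare-bones k-nearest-neighbours classifier implements exactly this idea: find the k closest labelled events and take a majority vote. The 2D events and their labels below are made up for illustration (low-formula talks labelled "exp", high-formula ones "th").

```python
# k-nearest neighbours: classify by majority vote among the k closest
# already-classified events.
from collections import Counter

def knn_classify(x, training, k=3):
    """training: list of ((f1, f2), label); returns majority label of the k nearest."""
    dist = lambda a, b: (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    nearest = sorted(training, key=lambda t: dist(t[0], x))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [((12, 16), "exp"), ((15, 34), "exp"), ((25, 55), "exp"),
         ((71, 19), "th"), ((64, 31), "th"), ((56, 51), "th")]
print(knn_classify((18, 30), train))  # → exp (close to the low-formula cluster)
```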

15 Methods Based on Linear Separation
Divide the input space into regions separated by one or more hyperplanes. Note that extrapolation is done: the hyperplane also classifies regions far away from any training data.
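Linear separation in its simplest form is the perceptron: it learns one hyperplane w·x + b = 0 dividing the plane. The tiny separable toy set below is an assumption for illustration; the last line shows the extrapolation property the slide warns about, since the learned plane also classifies a point far outside the training region.

```python
# Perceptron learning rule: nudge the hyperplane towards each
# misclassified training point until all points are on the correct side.
def train_perceptron(data, epochs=100, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), label in data:                      # label is +1 or -1
            if label * (w[0] * x1 + w[1] * x2 + b) <= 0:  # misclassified
                w[0] += lr * label * x1
                w[1] += lr * label * x2
                b += lr * label
    return w, b

data = [((1, 1), -1), ((2, 1), -1), ((4, 5), 1), ((5, 4), 1)]
w, b = train_perceptron(data)
predict = lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else -1
print([predict(x1, x2) for (x1, x2), _ in data])  # → [-1, -1, 1, 1]
print(predict(100, 100))  # extrapolation far outside the training data
```

A neural network generalises this to several such hyperplanes combined through nonlinearities; a support vector machine picks the separating hyperplane with maximal margin.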

16 Meta-Learning Strategies
Train several classifiers (classifier 1, 2, 3, ..., n) on the training data and combine their different classifications into one final decision.
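The combination step can be sketched with a majority vote, the simplest way to merge n classifier outputs into one decision. The three hand-written rules below are hypothetical stand-ins for independently trained classifiers; bagging and boosting differ in how the individual classifiers are trained, not in this voting step.

```python
# Ensemble by majority vote: each classifier votes, the most common
# label wins.
from collections import Counter

classifiers = [
    lambda f, s: "exp" if f < 30 else "th",   # stand-in classifier 1: formulas only
    lambda f, s: "exp" if s > 40 else "th",   # stand-in classifier 2: slides only
    lambda f, s: "exp" if f < s else "th",    # stand-in classifier 3: ratio rule
]

def ensemble(f, s):
    votes = Counter(c(f, s) for c in classifiers)
    return votes.most_common(1)[0][0]

print(ensemble(10, 50))  # → exp (all three rules agree)
print(ensemble(70, 20))  # → th
```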

17 Statistical Learning Theory
The error on the training set (measured by the loss function, here the number of misclassifications) is distinguished from the true error.
PAC learning (probably approximately correct): assume a finite hypothesis space H, a training set of size n, and a target function y contained in H. "Probably" refers to a confidence level, "approximately" to an error tolerance.
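The slide's own formula did not survive transcription; in its standard textbook form (assumed here), the PAC guarantee for a finite hypothesis space reads:

```latex
% PAC bound, standard form: with probability at least 1 - \delta
% ("probably"), every hypothesis consistent with the training set has
% true error at most \epsilon ("approximately correct"), provided the
% training set size n satisfies
n \;\ge\; \frac{1}{\epsilon}\left(\ln|H| + \ln\frac{1}{\delta}\right)
```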

18 Statistical Learning Theory
VC framework (Vapnik–Chervonenkis): the VC-dimension of linear separation in two dimensions is three, because three points (in general position) can be shattered but four points cannot. The VC-dimension enters a bound for the true error, the "generalisation error".
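The bound itself was lost in transcription; the usual Vapnik form (assumed to be what the slide showed) bounds the true risk by the empirical risk plus a capacity term that grows with the VC-dimension h:

```latex
% VC generalisation bound, standard form: with probability at least
% 1 - \eta, for VC-dimension h and n training examples,
R(f) \;\le\; R_{\mathrm{emp}}(f)
  + \sqrt{\frac{h\left(\ln\frac{2n}{h} + 1\right) - \ln\frac{\eta}{4}}{n}}
```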

19 Conclusion
Many applications for statistical learning methods in high-energy physics and astrophysics: classification or regression, online or offline. Many different methods built on three basic ideas: decision trees, local density estimators, and linear separation. Rich theory.

20 Next Talk
Very promising results with statistical learning methods. But: can they be trusted? Can they be controlled? Can one calculate uncertainties?

