Statistical Analysis Tools for Particle Physics
IDPASC School of Flavour Physics, Valencia, 2-7 May 2013
Glen Cowan
Physics Department, Royal Holloway, University of London
g.cowan@rhul.ac.uk
www.pp.rhul.ac.uk/~cowan
Outline
Multivariate methods for HEP:
- Event selection as a statistical test
- Neyman-Pearson lemma and likelihood ratio test
- Some multivariate classifiers: linear discriminants, neural networks, boosted decision trees, support vector machines
A simulated SUSY event in ATLAS
[Event display of a proton-proton collision: high-pT muons, high-pT jets of hadrons, and missing transverse energy.]
Background events
This event from Standard Model ttbar production also has high-pT jets and muons, and some missing transverse energy → it can easily mimic a SUSY event.
Event selection as a statistical test
For each event we measure a set of numbers x = (x_1, ..., x_n):
x_1 = jet pT
x_2 = missing energy
x_3 = particle i.d. measure
...
The vector x follows some n-dimensional joint probability density f(x|H), which depends on the type of event produced, i.e., on which hypothesis is true: H_0, H_1, ... Often these are simply "signal" and "background".
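By the Neyman-Pearson lemma (listed in the outline), the optimal test statistic for separating two simple hypotheses is the likelihood ratio t(x) = f(x|H1) / f(x|H0). A minimal numerical sketch, assuming for illustration that both densities are one-dimensional Gaussians (the means and widths here are invented):

```python
import numpy as np
from scipy.stats import norm

# Hypothetical 1-D densities for background (H0) and signal (H1).
f_H0 = norm(loc=0.0, scale=1.0)   # background density f(x|H0)
f_H1 = norm(loc=2.0, scale=1.0)   # signal density f(x|H1)

def likelihood_ratio(x):
    """Neyman-Pearson optimal test statistic t(x) = f(x|H1) / f(x|H0)."""
    return f_H1.pdf(x) / f_H0.pdf(x)

# Events near the signal mean give a large ratio, events near the
# background mean a small one; selecting events with t(x) > t_cut gives
# the highest signal efficiency for any fixed background efficiency.
print(likelihood_ratio(2.0) > 1.0)   # region favouring H1
print(likelihood_ratio(0.0) < 1.0)   # region favouring H0
```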
Finding an optimal decision boundary
In particle physics one usually starts by making simple "cuts":
x_i < c_i
x_j < c_j
Maybe later try some other type of decision boundary, e.g. linear or nonlinear.
[Figure: scatter plots showing regions assigned to H0 and H1 for cut-based, linear, and nonlinear boundaries.]
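Rectangular cuts of this kind can be sketched as a simple boolean selection. The variable names and thresholds below are invented for illustration (and the cuts are taken from below, as one would for a SUSY-like signature, rather than the x_i < c_i form written above):

```python
import numpy as np

def pass_cuts(jet_pt, missing_et, pt_cut=50.0, met_cut=30.0):
    """Rectangular cut selection: keep events whose variables pass
    per-variable thresholds (hypothetical values, in GeV)."""
    return (jet_pt > pt_cut) & (missing_et > met_cut)

# Toy events: columns are (jet pT, missing ET) in GeV.
events = np.array([[120.0, 80.0],   # SUSY-like: passes both cuts
                   [30.0, 10.0],    # soft event: fails both
                   [60.0, 5.0]])    # high pT but little MET: fails
selected = pass_cuts(events[:, 0], events[:, 1])
print(selected)   # [ True False False]
```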
Overtraining
If the decision boundary is too flexible, it will conform too closely to the training points → overtraining. Monitor by applying the classifier to an independent test sample.
[Figure: the same decision boundary shown on the training sample and on an independent test sample.]
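The monitoring idea can be sketched with any flexible classifier; here a fully grown decision tree on synthetic data (the data set and classifier choice are illustrative, not those from the slides):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An unconstrained tree can conform arbitrarily closely to the training points.
clf = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
train_acc = clf.score(X_train, y_train)
test_acc = clf.score(X_test, y_test)

# A large gap between training and test performance signals overtraining.
print(f"train accuracy = {train_acc:.3f}, test accuracy = {test_acc:.3f}")
```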
Neural network example from LEP II
Signal: e+e− → W+W− (often four well separated hadron jets)
Background: e+e− → qqbar gg (four less well separated hadron jets)
Input variables based on jet structure, event shape, ...; none by itself gives much separation. The neural network output combines them into a single discriminating variable.
(Garrido, Juste and Martinez, ALEPH 96-144)
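A network combining several individually weak input variables can be sketched as below; the synthetic data stand in for the jet-structure and event-shape variables and are not the ALEPH inputs:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Several individually weak but jointly informative input variables.
X, y = make_classification(n_samples=2000, n_features=6, n_informative=6,
                           n_redundant=0, class_sep=0.8, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=1)
net.fit(X_tr, y_tr)

# The network output (class probability) is the single discriminating
# variable one would cut on, like the NN output in the slide.
y_score = net.predict_proba(X_te)[:, 1]
print(f"test accuracy = {net.score(X_te, y_te):.3f}")
```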
Kernel-based PDE (KDE, Parzen window)
Consider d dimensions and N training events x_1, ..., x_N. Estimate f(x) with

  fhat(x) = (1/N) sum_{i=1}^{N} K(x - x_i),

using e.g. a Gaussian kernel,

  K(u) = (2 pi h^2)^(-d/2) exp(-|u|^2 / 2h^2),

where h is the kernel bandwidth (smoothing parameter). One needs to sum N terms to evaluate the function (slow); faster algorithms only count events in the vicinity of x (k-nearest neighbour, range search).
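A direct implementation of the kernel estimate in one dimension (d = 1), summing a Gaussian kernel over the N training points; the bandwidth h is chosen by hand here:

```python
import numpy as np

def kde(x, data, h):
    """Kernel density estimate fhat(x) = (1/N) sum_i K(x - x_i),
    with 1-D Gaussian kernel K(u) = exp(-u^2 / 2h^2) / sqrt(2 pi h^2).
    Note the O(N) cost per evaluation point mentioned in the slide."""
    u = (x[:, None] - data[None, :]) / h
    K = np.exp(-0.5 * u**2) / (h * np.sqrt(2.0 * np.pi))
    return K.mean(axis=1)

rng = np.random.default_rng(0)
data = rng.normal(size=500)           # training events x_1, ..., x_N
grid = np.linspace(-5.0, 5.0, 201)
fhat = kde(grid, data, h=0.3)

# The estimate is a proper density: non-negative and normalized.
integral = fhat.sum() * (grid[1] - grid[0])
print(f"integral of fhat ~ {integral:.3f}")
```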
Particle i.d. in MiniBooNE
The detector is a 12-m diameter tank of mineral oil exposed to a beam of neutrinos and viewed by 1520 photomultiplier tubes. The search for nu_mu to nu_e oscillations required particle i.d. using information from the PMTs.
(H.J. Yang, MiniBooNE PID, DNP06)
Decision trees
Out of all the input variables, find the one for which a single cut gives the best improvement in signal purity:

  P = sum_{i in signal} w_i / (sum_{i in signal} w_i + sum_{i in background} w_i),

where w_i is the weight of the ith event. The resulting nodes are classified as either signal or background. Iterate until a stopping criterion is reached, based on e.g. purity or a minimum number of events in a node. The set of cuts defines the decision boundary.
(Example from the MiniBooNE experiment; B. Roe et al., NIM A 543 (2005) 577.)
Finding the best single cut
The level of separation within a node can, e.g., be quantified by the Gini coefficient, calculated from the (signal or background) purity p as

  G = p (1 - p).

For a cut that splits a set of events a into subsets b and c, one can quantify the improvement in separation by the change in weighted Gini coefficients:

  Delta = W_a G_a - W_b G_b - W_c G_c, where e.g. W_a = sum_{i in a} w_i.

Choose e.g. the cut that maximizes Delta; a variant of this scheme uses, instead of Gini, e.g. the misclassification rate:

  epsilon = 1 - max(p, 1 - p).
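The split criterion can be sketched directly from these quantities: purity, Gini, and the weighted improvement Delta for a candidate cut. Event weights are all set to 1 here for simplicity, and the toy data are invented:

```python
import numpy as np

def gini(w_sig, w_bkg):
    """Gini coefficient p(1 - p) from the signal purity p of a node."""
    W = w_sig + w_bkg
    if W == 0.0:
        return 0.0
    p = w_sig / W
    return p * (1.0 - p)

def split_improvement(x, is_sig, cut):
    """Delta = W_a G_a - W_b G_b - W_c G_c for splitting node a at x < cut."""
    w = np.ones_like(x)                       # unit event weights w_i
    def node(mask):
        return w[mask & is_sig].sum(), w[mask & ~is_sig].sum()
    sa, ba = node(np.ones_like(is_sig))       # parent node a (all events)
    sb, bb = node(x < cut)                    # daughter node b
    sc, bc = node(x >= cut)                   # daughter node c
    return ((sa + ba) * gini(sa, ba)
            - (sb + bb) * gini(sb, bb)
            - (sc + bc) * gini(sc, bc))

# Find the best single cut on one variable by scanning candidate cuts.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(2, 1, 500)])
is_sig = np.concatenate([np.zeros(500, bool), np.ones(500, bool)])
cuts = np.linspace(-2, 4, 61)
best = max(cuts, key=lambda c: split_improvement(x, is_sig, c))
print(f"best cut near x = {best:.2f}")   # expect near the midpoint of the means
```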
Monitoring overtraining
From the MiniBooNE example: performance is stable after a few hundred trees.
Comparison of boosting algorithms
A number of boosting algorithms are on the market; they differ in the update rule for the event weights.
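As one concrete example of a weight-update rule, the discrete AdaBoost step can be sketched as follows, using decision stumps as weak learners; the data are synthetic and the loop length is arbitrary:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, random_state=0)
y = 2 * y - 1                      # labels in {-1, +1}
w = np.full(len(y), 1.0 / len(y))  # start with uniform event weights

F = np.zeros(len(y))               # boosted classifier score
for _ in range(50):
    stump = DecisionTreeClassifier(max_depth=1)
    stump.fit(X, y, sample_weight=w)
    pred = stump.predict(X)
    err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)  # weighted error
    alpha = 0.5 * np.log((1.0 - err) / err)              # stump's vote weight
    # AdaBoost update: boost the weights of misclassified events.
    w *= np.exp(-alpha * y * pred)
    w /= w.sum()
    F += alpha * pred

train_acc = np.mean(np.sign(F) == y)
print(f"training accuracy after boosting = {train_acc:.3f}")
```

Other algorithms (e.g. epsilon-Boost, LogitBoost) keep this structure but change the `alpha` and weight-update formulas.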
Support Vector Machines
Map the input variables into a high-dimensional feature space: x → phi(x). Maximize the distance between the separating hyperplanes (the margin), subject to constraints allowing for some misclassification. The final classifier depends only on scalar products of phi(x), so one only needs the kernel

  K(x, x') = phi(x) · phi(x').

(See Bishop, Pattern Recognition and Machine Learning, Ch. 7.)
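The kernel trick can be sketched with a radial basis function kernel, for which the feature space is infinite-dimensional but K(x, x') is trivial to evaluate. Here scikit-learn's SVC serves as one real implementation; the XOR-style data set is invented:

```python
import numpy as np
from sklearn.svm import SVC

# XOR-like data: not linearly separable in the input space.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)

# RBF kernel K(x, x') = exp(-gamma |x - x'|^2): the scalar products
# phi(x).phi(x') are evaluated without ever constructing phi(x).
clf = SVC(kernel="rbf", gamma=1.0, C=1.0).fit(X, y)
print(f"RBF-kernel training accuracy = {clf.score(X, y):.3f}")

# A linear kernel (a hyperplane in the input space) cannot separate XOR.
lin = SVC(kernel="linear").fit(X, y)
print(f"linear-kernel training accuracy = {lin.score(X, y):.3f}")
```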
Summary on multivariate methods
Particle physics has used several multivariate methods for many years:
- linear (Fisher) discriminant
- neural networks
- naive Bayes
and has in the last several years started to use a few more:
- k-nearest neighbour
- boosted decision trees
- support vector machines
The emphasis is often on controlling systematic uncertainties between the modelled training data and Nature, to avoid false discovery. Although many classifier outputs are "black boxes", a discovery at 5σ significance with a sophisticated (opaque) method will win the competition if backed up by, say, 4σ evidence from a cut-based method.
Quotes I like
"If you believe in something you don't understand, you suffer, ..." – Stevie Wonder
"Keep it simple. As simple as possible. Not any simpler." – A. Einstein
Extra slides
Two distinct event selection problems
In some cases, the event types in question are both known to exist. Example: separation of different particle types (electron vs. muon). Use the selected sample for further study.
In other cases, the null hypothesis H0 means "Standard Model" events, and the alternative H1 means "events of a type whose existence is not yet established" (establishing it is the goal of the analysis). There are many subtle issues here, mainly related to the heavy burden of proof required to establish the presence of a new phenomenon. Typically one requires the p-value of the background-only hypothesis to be below ~10^-7 (a 5 sigma effect) to claim discovery of "New Physics".
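The 5 sigma convention corresponds to a one-sided Gaussian tail probability, which can be checked directly:

```python
from scipy.stats import norm

# One-sided p-value for a Z-sigma effect: p = P(X > Z), X ~ N(0, 1).
for Z in (3, 4, 5):
    print(f"{Z} sigma -> p = {norm.sf(Z):.2e}")
# 5 sigma corresponds to p ~ 2.9e-7, the usual discovery threshold.
```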
Using classifier output for discovery
[Figure: distributions of the classifier output y for signal and background: f(y), normalized to unity, and N(y), normalized to the expected numbers of events, with a possible excess above the cut y_cut defining the search region.]
Discovery = the number of events found in the search region is incompatible with the background-only hypothesis. The p-value of the background-only hypothesis can depend crucially on the distribution f(y|b) in the search region.
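For a counting experiment in the search region y > y_cut, the p-value of the background-only hypothesis can be sketched with Poisson statistics: with b expected background events and n_obs observed, p = P(n >= n_obs | b). The numbers below are invented:

```python
from scipy.stats import norm, poisson

b = 3.2       # expected background in the search region (hypothetical)
n_obs = 12    # observed events (hypothetical)

# p-value of the background-only hypothesis: probability of a background
# fluctuation at least as large as the observed count.
p = poisson.sf(n_obs - 1, b)     # P(n >= n_obs) for Poisson mean b
Z = norm.isf(p)                  # equivalent Gaussian significance
print(f"p = {p:.2e}, Z = {Z:.2f} sigma")
```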
Single top quark production (CDF/D0)
The top quark was discovered in pairs, but the SM also predicts single top production. Use many inputs based on jet properties, particle i.d., ... Pair-produced tops are now a background process.
[Figure: classifier output distribution, with the single-top signal shown in blue + green.]
Different classifiers for single top
Also Naive Bayes and various approximations to the likelihood ratio, ... The final combined result is statistically significant (at the > 5σ level), but the classifier outputs are not easy to understand.