Information and Statistics in Nuclear Experiment and Theory - Introduction. D. G. Ireland, ISNET-3, ECT* Trento, 16 November 2015.
What are we doing here this week?
Key Questions to be addressed
- How can we estimate statistical uncertainties of calculated quantities?
- How can we assess the systematic errors arising from physical approximations?
- How can model-based extrapolations be validated and verified?
- How can we improve the predictive power of theoretical models?
- When is the application of statistical methods justified, and can they give robust results?
- What experimental data are crucial for better constraining current nuclear models?
- How can the uniqueness and usefulness of an observable be assessed, i.e. its information content with respect to current theoretical models?
- How can statistical tools of nuclear theory help in planning future experiments and experimental programmes?
- How can the predictive power of different theoretical models be compared quantitatively?
Methods to be discussed
- Statistical methods and methods of statistical learning: parameter estimation, covariance analysis, robust techniques and least-squares alternatives, regression diagnostics, outlier detection
- Information theory
- Bayesian approaches and consistent inclusion of a priori expectations
- Uncertainty quantification and Monte Carlo error propagation (a minimal sketch follows this list)
- Computational techniques
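As a concrete illustration of the first and fourth items, here is a minimal sketch of least-squares parameter estimation with covariance analysis and Monte Carlo error propagation. The two-parameter linear model, the synthetic data, and the extrapolation point are illustrative assumptions, not material from the talk.

```python
# Minimal sketch: least-squares parameter estimation, covariance analysis,
# and Monte Carlo propagation of parameter uncertainties to a derived quantity.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def model(x, a, b):
    """Simple two-parameter model standing in for a theory calculation."""
    return a * x + b

# Synthetic "measurements" with known Gaussian uncertainties (assumed values)
x = np.linspace(0.0, 10.0, 20)
sigma = 0.5 * np.ones_like(x)
y = model(x, 1.3, 0.7) + rng.normal(0.0, sigma)

# Least-squares fit; pcov is the parameter covariance matrix
popt, pcov = curve_fit(model, x, y, sigma=sigma, absolute_sigma=True)
perr = np.sqrt(np.diag(pcov))
print("best-fit parameters:", popt, "+/-", perr)

# Monte Carlo error propagation: sample parameters from the fitted covariance
# and look at the spread of a derived (extrapolated) quantity.
samples = rng.multivariate_normal(popt, pcov, size=10_000)
extrapolation = model(15.0, samples[:, 0], samples[:, 1])
print("extrapolated value at x = 15: %.2f +/- %.2f"
      % (extrapolation.mean(), extrapolation.std()))
```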
Our Task [diagram]: Nature (QCD) manifests itself in reactions; reactions are accessible to measurements; measurements inspire theories; and theories help us understand Nature.
Measurement
“When you can measure what you are speaking about, and express it in numbers, you know something about it; when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced to the stage of science.”
― William Thomson, 1st Baron Kelvin
Scattering Experiments: a “high-p_T event”
But... there is no such thing as a complete measurement!
Recall: data points summarise PDFs.
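To make the point concrete, here is a minimal sketch of how a quoted data point (mean ± standard deviation) is a two-number summary of an underlying probability density. The Gaussian shape, the central value, and the sample size are illustrative assumptions.

```python
# Minimal sketch: a reported data point as a summary of an underlying PDF.
import numpy as np

rng = np.random.default_rng(1)

# Imagine many repeated measurements of the same observable (assumed Gaussian)
measurements = rng.normal(loc=2.47, scale=0.12, size=5000)

# The "data point" quoted in a paper is a two-number summary of this PDF
mean = measurements.mean()
std = measurements.std(ddof=1)
print(f"quoted data point: {mean:.3f} +/- {std:.3f}")

# The full PDF carries more information than the summary, e.g. its shape
hist, edges = np.histogram(measurements, bins=40, density=True)
```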
Uncertainty
“Uncertainty is an uncomfortable position. But certainty is an absurd one.” ― Voltaire
How we update our knowledge
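In the spirit of the Bayesian approaches listed earlier, here is a minimal sketch of updating knowledge via Bayes' theorem on a one-dimensional parameter grid: posterior ∝ likelihood × prior. The flat prior, the Gaussian likelihood, and the measured value are illustrative assumptions.

```python
# Minimal sketch of Bayesian updating on a parameter grid.
import numpy as np

theta = np.linspace(0.0, 2.0, 401)       # candidate parameter values
prior = np.ones_like(theta)              # flat a priori expectation
prior /= np.trapz(prior, theta)

# One measurement y with Gaussian uncertainty sigma, for a model y = theta
y, sigma = 1.2, 0.15
likelihood = np.exp(-0.5 * ((y - theta) / sigma) ** 2)

posterior = likelihood * prior
posterior /= np.trapz(posterior, theta)  # normalise

print("posterior mean:", np.trapz(theta * posterior, theta))
```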
Information Gain [diagram]: the models consistent with the data form a subset of all models of reality.
Noisy-Channel Communication [diagram]: input symbols {X} pass through a channel to give measured symbols {Y}.
Experimental Measurement [diagram]: input symbols {A_x} pass through the experiment to give measured symbols {A_y}.
Quantifying Information
Information is quantified with the Shannon entropy. For N discrete outcomes,
H(X) = -Σ_x p(x) ln p(x),
with maximum entropy ln N (uniform distribution) and minimum entropy 0.
Conditional entropy of X given Y (the average uncertainty in x remaining when y is known):
H(X|Y) = -Σ_{x,y} p(x,y) ln p(x|y).
Mutual information between X and Y (the average reduction in uncertainty about x when y is known):
I(X;Y) = H(X) - H(X|Y).
[C. E. Shannon]
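A minimal numerical sketch of these three quantities for a discrete joint distribution p(x, y); the 2×2 joint distribution below (a slightly noisy channel) is an illustrative assumption.

```python
# Minimal sketch: Shannon entropy, conditional entropy, and mutual information
# for a discrete joint distribution p(x, y), in nats.
import numpy as np

p_xy = np.array([[0.40, 0.10],    # rows: x, columns: y (assumed values)
                 [0.10, 0.40]])

p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)

def entropy(p):
    """Shannon entropy in nats, ignoring zero-probability outcomes."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

H_x = entropy(p_x)
H_xy = entropy(p_xy.ravel())           # joint entropy H(X,Y)
H_x_given_y = H_xy - entropy(p_y)      # H(X|Y) = H(X,Y) - H(Y)
I_xy = H_x - H_x_given_y               # mutual information I(X;Y)

print(f"H(X)   = {H_x:.4f} nats (max for 2 outcomes is ln 2 = {np.log(2):.4f})")
print(f"H(X|Y) = {H_x_given_y:.4f} nats")
print(f"I(X;Y) = {I_xy:.4f} nats")
```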
Example: the Monty Hall Problem
Start: maximum entropy = ln 3.
After a door is opened: entropy = ln 3 - (2/3) ln 2.
So the information gained is (2/3) ln 2.
[Checking this answer reveals the solution to the original problem!]
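The numbers on the slide can be checked directly: before any door is opened the prize is equally likely behind the three doors; after the host opens a losing door, the prize is behind the contestant's door with probability 1/3 and behind the other closed door with probability 2/3. A short sketch:

```python
# Minimal sketch: checking the Monty Hall entropy numbers from the slide.
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

H_before = entropy([1/3, 1/3, 1/3])   # = ln 3
H_after = entropy([1/3, 2/3])         # = ln 3 - (2/3) ln 2
gain = H_before - H_after             # = (2/3) ln 2

print(f"H_before = {H_before:.4f}  (ln 3 = {np.log(3):.4f})")
print(f"H_after  = {H_after:.4f}  (ln 3 - 2/3 ln 2 = {np.log(3) - 2/3*np.log(2):.4f})")
print(f"gain     = {gain:.4f}  (2/3 ln 2 = {2/3*np.log(2):.4f})")
# The asymmetric posterior (1/3 vs 2/3) is exactly why switching doors wins.
```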
Required Accuracy
Take care if you have only partial data...
Thanks to IoP Publishing...
Thank you for participating.