POISSON TALES FISH TALES

FISH TALES: If you give a man a fish he will eat for a day. If you teach a man to fish he will eat for a lifetime.

POISSON TALES: If you give a man Cash he will analyze for a day. If you teach a man Poisson, he will analyze for a lifetime.

It is all about Probabilities!
- The Poisson Likelihood
- Probability Calculus – Bayes' Theorem
- Priors – the gamma distribution
- Source intensity and background marginalization
- Hardness Ratios (BEHR)

Deriving the Poisson Likelihood

Divide the exposure into N sub-intervals, each with a small probability of containing a count, and take the limit N, t → ∞ with the expected number of counts λ held fixed; the binomial distribution of the observed counts S converges to the Poisson likelihood

p(S|λ) = λ^S e^(-λ) / S!
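The limit can be checked numerically. A minimal Python sketch (function names and the numbers are mine, for illustration only) comparing the binomial probability of S counts in N sub-intervals with its Poisson limit:

```python
import math

def binomial_pmf(k, n, p):
    # probability of k counts when each of n sub-intervals fires with probability p
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    # Poisson probability of k counts with expected counts lam
    return lam**k * math.exp(-lam) / math.factorial(k)

lam, k = 3.0, 2
for n in (10, 100, 10000):
    # hold the expected counts lam = n*p fixed while n grows
    print(n, binomial_pmf(k, n, lam / n))
print("Poisson limit:", poisson_pmf(k, lam))
```

As n grows the binomial values approach the Poisson value, which is the content of the derivation on this slide.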

Probability Calculus

A or B :: p(A+B) = p(A) + p(B) - p(AB)
A and B :: p(AB) = p(A|B) p(B) = p(B|A) p(A)
Bayes' Theorem :: p(B|A) = p(A|B) p(B) / p(A)

p(Model|Data) = p(Model) p(Data|Model) / p(Data)
posterior distribution = prior distribution × likelihood / normalization

marginalization :: p(a|D) = ∫ p(ab|D) db
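A two-model toy example (the prior and likelihood numbers are invented for illustration) makes the normalization term concrete:

```python
# p(Model|Data) = p(Data|Model) p(Model) / p(Data), with p(Data) the normalization
priors = {"M1": 0.5, "M2": 0.5}          # p(Model)
likelihoods = {"M1": 0.8, "M2": 0.2}     # p(Data|Model)

# p(Data) is the sum over models: it makes the posterior sum to 1
evidence = sum(likelihoods[m] * priors[m] for m in priors)
posterior = {m: likelihoods[m] * priors[m] / evidence for m in priors}
print({m: round(v, 3) for m, v in posterior.items()})  # {'M1': 0.8, 'M2': 0.2}
```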

Priors

Priors
- Incorporate known information
- Forced acknowledgement of bias
- Non-informative priors: flat, range, least informative (Jeffreys), gamma

Priors: the gamma distribution

γ(λ; α, β) = β^α λ^(α-1) e^(-βλ) / Γ(α)
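One reason the gamma distribution is a natural prior here: it is conjugate to the Poisson likelihood, so the posterior is again a gamma with updated parameters. A minimal sketch (the flat-prior choice alpha = 1, beta = 0 and the counts are my illustration, not a recommendation from the talk):

```python
def gamma_poisson_update(alpha, beta, counts):
    # gamma(alpha, beta) prior + Poisson-distributed counts
    # -> gamma(alpha + sum(counts), beta + number of observations) posterior
    return alpha + sum(counts), beta + len(counts)

alpha0, beta0 = 1.0, 0.0   # improper flat prior on the intensity (illustrative)
a, b = gamma_poisson_update(alpha0, beta0, [3, 5, 4])
print(a, b, a / b)         # posterior mean of the intensity is a/b ≈ 4.33
```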

Panning for Gold: Source and Background
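The background marginalization can be sketched on a grid: observe counts in a source region (source plus background) and in a background-only region, then integrate the background intensity out of the joint posterior. This is a crude numerical stand-in, with made-up counts and flat priors, not the analytic treatment the talk describes:

```python
import math

def poisson(k, lam):
    return lam**k * math.exp(-lam) / math.factorial(k)

n_src, n_bkg = 12, 4   # counts in source region and background-only region (made up)
r = 1.0                # relative exposure/area of the background region (assumed)

step = 0.1
grid = [step * i for i in range(1, 301)]   # intensities 0.1 .. 30
post = []
for s in grid:
    # p(s|data) ∝ sum_b p(n_src|s+b) p(n_bkg|r*b), flat priors on s and b
    post.append(sum(poisson(n_src, s + b) * poisson(n_bkg, r * b) for b in grid))
norm = sum(post)
mean_s = sum(s * p for s, p in zip(grid, post)) / norm
print("posterior mean source intensity:", round(mean_s, 2))
```

Subtracting a point estimate of the background would instead discard the uncertainty that the marginalization keeps.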

Hardness Ratios
- Simple, robust, intuitive summary
- Proxy for spectral fitting
- Useful for large samples
- Most needed for low counts
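For reference, the classical hardness ratio that a Bayesian treatment such as BEHR improves on (the function name and counts are mine). At low counts this naive estimator is biased and its errors are non-Gaussian, which is exactly the regime the slide flags as "most needed":

```python
def hardness_ratio(hard, soft):
    # classical HR = (H - S) / (H + S), bounded in [-1, 1]
    return (hard - soft) / (hard + soft)

print(hardness_ratio(30, 10))   # 0.5: more counts in the hard band
```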

BEHR http://hea-www.harvard.edu/AstroStat/BEHR/