Introduction to Bayesian inference
POORFISH workshop, Helsinki, 22.–26.5.2006
Samu Mäntyniemi
Fisheries and Environmental Management group (FEM)
Department of Biological and Environmental Sciences, University of Helsinki

Aims of Bayesian inference
- Provide a measure of uncertainty about the unobserved state of Nature.
- Make quantitative inference about the unobserved state of Nature, based on observed facts, e.g.:
  - stock size | observed catches
  - disease status of a patient | blood test result
  - age structure | catch samples
- Provide a transparent way of updating already accumulated knowledge based on new observations, integrating information from different sources.

Probability as a measure of uncertainty
Probability measures a personal degree of belief in propositions. Exemplary propositions:
- Finland will win the Eurovision song contest in 2007.
- Sweden is the world champion of ice hockey in 2006.
- It rains in Helsinki today.
- It rained in Helsinki yesterday.
- In the beginning of 2005 the biomass of the Baltic herring stock was 100 kg.
- The Finnish fleet will catch more salmon in 2010 than it did in 2006, if the TAC for 2007 and beyond is ...
P(prop. is true | my information) = 1 : "I am sure that ..."
0 < P(prop. is true | my information) < 1 : "I am uncertain whether ..."

Probability calculus as information processor
How does my knowledge about the state of Nature (N) change in the light of new evidence (X)? Three steps are required:
1. Specify what is already known about N before the new information is obtained, P(N): the prior distribution.
2. Specify what is known about the different types of new information (X) before obtaining it, under alternative hypotheses about N, P(X|N): the conditional distribution of the evidence.
3. Record the new information and combine it with the old knowledge, P(N|X): the posterior distribution of N.
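Since the exercise at the end of these slides suggests implementing the examples in R, here is a minimal sketch, in R, of the three steps for a discrete state of Nature. The function name and the two-state example values at the end are hypothetical and only for illustration, not from the slides.

```r
# Minimal sketch of the three steps for a discrete state of Nature N.
# 'prior' holds P(N) and 'likelihood' holds P(X = observed | N),
# each evaluated over the same grid of candidate states.
bayes_update <- function(prior, likelihood) {
  unnormalised <- likelihood * prior      # P(X | N) * P(N)
  unnormalised / sum(unnormalised)        # P(N | X), normalised to sum to 1
}

# Hypothetical two-state illustration (values not from the slides):
bayes_update(prior = c(0.5, 0.5), likelihood = c(0.8, 0.2))
```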

Introductory example
You have been to a tropical country where malaria occurs. Later you take a malaria test that is claimed to have 95% sensitivity (it gives a positive result if you have the disease) and 98% specificity (it gives a negative result if you don't have the disease). Your test result turns out to be positive: should you worry?

Prior probabilities
The probability of catching malaria is (say) 0.1% if preventive drugs are used. Our probabilities:
- P(test+ | malaria+) = 95% : probability of a positive test, if one has malaria
- P(test- | malaria-) = 98%
- P(test+ | malaria-) = 2%
- P(malaria+) = 0.1% : probability of having malaria prior to knowing the test result

Probabilities
[Probability tree: Malaria+ (0.1%) and Malaria- (99.9%), each branching to Test+ / Test- with 95% / 5% and 2% / 98%, respectively.]
- P(malaria+ & test+) = 0.1% × 95% = 0.095%
- P(malaria- & test+) = 99.9% × 2% = 1.998%
Probability of a positive test: 0.095% + 1.998% = 2.093%

Probability calculus
Probability of malaria given the positive test result = "malaria positives" / "all positives" = 0.095% / 2.093% ≈ 4.5%.
Formally:
P(malaria+ | test+) = P(test+ | malaria+) P(malaria+) / P(test+)
This is Bayes' theorem!
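The same calculation takes only a few lines of R (one of the tools suggested in the exercise at the end); the numbers are those given on the slides.

```r
p_malaria <- 0.001   # prior: P(malaria+)
sens      <- 0.95    # sensitivity: P(test+ | malaria+)
spec      <- 0.98    # specificity: P(test- | malaria-)

# Total probability of a positive test (sum over both branches of the tree)
p_test_pos <- sens * p_malaria + (1 - spec) * (1 - p_malaria)   # = 0.02093

# Bayes' theorem
p_malaria_given_pos <- sens * p_malaria / p_test_pos            # about 0.045, i.e. 4.5%
```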

Inference
Taking into account that the prior probability of catching malaria is so low, it is not likely that you have malaria even though your malaria test is positive.

Example: population size
1. Specify information about the population size prior to seeing the data, P(N). This can be based on the population size in previous years and on information about population parameters from other studies.
2. Specify the probability of observing the data set given each possible population size, P(data | N). These probabilities are based on knowledge about the sampling process. Once the data have been observed, they form the likelihood function for the population size.
3. Once the data set has been observed, compute the probability of each population size given the observed data set, P(N | data).

posterior ∝ likelihood × prior
P(N | data) ∝ P(data | N) P(N)
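As an illustration of the population size example, the R sketch below evaluates this proportionality over a grid of candidate values of N. The slides do not fix an observation model, so the binomial sampling model, the capture probability q, the Poisson-shaped prior, and the observed count y are all hypothetical choices made only for illustration.

```r
# Grid of candidate population sizes
N_grid <- 0:5000

# 1. Prior P(N): assumed here to be Poisson-shaped around a previous estimate
prior <- dpois(N_grid, lambda = 1000)
prior <- prior / sum(prior)

# 2. P(data | N): hypothetical sampling model in which each of the N fish is
#    caught independently with known probability q, and y fish were caught
q <- 0.1
y <- 80
likelihood <- dbinom(y, size = N_grid, prob = q)

# 3. Posterior: P(N | data) is proportional to P(data | N) * P(N)
posterior <- likelihood * prior
posterior <- posterior / sum(posterior)

sum(N_grid * posterior)   # posterior mean of N
```

The normalisation step divides by the sum over all candidate values of N, which plays the same role as dividing by the total probability of a positive test in the malaria example.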

Try it yourself!
Implement the malaria example using MS Excel or R. In order to understand how the information processing works, try the following:
1. Assume that the malaria test was repeated and found to be negative. Calculate the probability that you have malaria based on all the information you have (1 pos. & 1 neg. & prior).
2. Then assume that the test was repeated two more times and found to be positive both times. Now calculate the probability that you have malaria based on the four sequential test results (1 pos. & 1 neg. & 1 pos. & 1 pos. & prior). What if you had the results in a different order (like 3 pos. & 1 neg. & prior)?
3. Alter the prior probability of having malaria and see how it affects the resulting inference in the cases with different numbers of repeated tests.
4. Alter the sensitivity and specificity of the test and examine the effects.
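One possible R starting point for this exercise (a sketch only, using the sensitivity, specificity, and prior from the slides) is to wrap a single Bayesian update in a function and apply it to the test results one at a time:

```r
# Update P(malaria+) with one test result; defaults are the slide values.
update_once <- function(p_malaria, test_positive, sens = 0.95, spec = 0.98) {
  if (test_positive) {
    p_data_mal   <- sens        # P(test+ | malaria+)
    p_data_nomal <- 1 - spec    # P(test+ | malaria-)
  } else {
    p_data_mal   <- 1 - sens    # P(test- | malaria+)
    p_data_nomal <- spec        # P(test- | malaria-)
  }
  p_data <- p_data_mal * p_malaria + p_data_nomal * (1 - p_malaria)
  p_data_mal * p_malaria / p_data
}

p <- 0.001                                    # prior P(malaria+)
for (result in c(TRUE, FALSE, TRUE, TRUE)) {  # 1 pos & 1 neg & 1 pos & 1 pos
  p <- update_once(p, result)
}
p   # posterior after all four tests; reorder the results to explore step 2
```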