
Welcome to Amsterdam!

Bayesian Modeling for Cognitive Science: A WinBUGS Workshop

Contributors: Michael Lee, Dora Matzke, Ruud Wetzels, EJ Wagenmakers

Assistants: Don van Ravenzwaaij, Gilles Dutilh, Helen Steingröver

Why We Like Bayesian Modeling  It is fun.  It is cool.  It is easy.  It is principled.  It is superior.  It is useful.  It is flexible.

Our Goals This Week Are…  For you to experience some of the possibilities that WinBUGS has to offer.  For you to get some hands-on training by trying out some programs.  For you to work at your own pace.  For you to get answers to questions when you get stuck.

Our Goals This Week Are NOT…  For you to become a Bayesian graphical modeling expert in one week.  For you to gain deep insight into the statistical foundations of Bayesian inference.  For you to get frustrated when the programs do not work or you do not understand the materials (please ask questions).

Logistics  You should now have the course book, information on how to get wireless access, and a USB stick. The stick contains a pdf of the book and the computer programs.

Logistics  Brief plenary lectures are at 09:30 and 14:00.  All plenary lectures are in this room.  All practicals are in the computer rooms on the next floor.  Coffee and tea are available in the small opposite the computer rooms.

What is Bayesian Inference? Why be Bayesian?

What is Bayesian Inference?

“Common sense expressed in numbers”

What is Bayesian Inference? “The only statistical procedure that is coherent, meaning that it avoids statements that are internally inconsistent.”

What is Bayesian Inference? “The only good statistics”

Outline  Bayes in a Nutshell  The Bayesian Revolution  This Course

Bayesian Inference in a Nutshell  In Bayesian inference, uncertainty or degree of belief is quantified by probability.  Prior beliefs are updated by means of the data to yield posterior beliefs.
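In symbols, this updating is just Bayes' rule, with p(θ) the prior, p(D | θ) the likelihood, and p(θ | D) the posterior:

    \[ p(\theta \mid D) = \frac{p(D \mid \theta)\, p(\theta)}{p(D)}, \qquad \text{posterior} \propto \text{likelihood} \times \text{prior} \]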

Bayesian Parameter Estimation: Example  We prepare for you a series of 10 factual questions of equal difficulty.  You answer 9 out of 10 questions correctly.  What is your latent probability θ of answering any one question correctly?

Bayesian Parameter Estimation: Example  We start with a prior distribution for θ. This reflect all we know about θ prior to the experiment. Here we make a standard choice and assume that all values of θ are equally likely a priori.

Bayesian Parameter Estimation: Example  We then update the prior distribution by means of the data (technically, the likelihood) to arrive at a posterior distribution.  The posterior distribution is a compromise between what we knew before the experiment and what we have learned from the experiment. The posterior distribution reflects all that we know about θ.

Mode = 0.90; 95% confidence interval: (0.59, 0.98).
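A minimal R sketch of this analytical result (everything follows from the Beta(10, 2) posterior above):

    k <- 9; n <- 10
    a <- 1 + k          # posterior is Beta(10, 2)
    b <- 1 + (n - k)
    (a - 1) / (a + b - 2)            # posterior mode: 0.90
    qbeta(c(0.025, 0.975), a, b)     # 95% interval: roughly (0.59, 0.98)
    curve(dbeta(x, a, b), from = 0, to = 1,
          xlab = expression(theta), ylab = "Posterior density")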

Outline  Bayes in a Nutshell  The Bayesian Revolution  This Course

The Bayesian Revolution  Until about 1990, Bayesian statistics could only be applied to a select subset of very simple models.  Only recently, Bayesian statistics has undergone a transformation; With current numerical techniques, Bayesian models are “limited only by the user’s imagination.”

The Bayesian Revolution in Statistics

Why Bayes is Now Popular Markov chain Monte Carlo!

Markov Chain Monte Carlo  Instead of calculating the posterior analytically, numerical techniques such as MCMC approximate the posterior by drawing samples from it.  Consider again our earlier example…

Mode = 0.90; 95% confidence interval: (0.59, 0.98). With 9000 samples, the MCMC approximation is almost identical to the analytical result.

Want to Know More About MCMC?

MCMC  With MCMC, the models you can build and estimate are said to be “limited only by the user’s imagination”.  But how do you get MCMC to work? Option 1: write the code it yourself. Option 2: use WinBUGS!

Outline  Bayes in a Nutshell  The Bayesian Revolution  This Course

Bayesian Cognitive Modeling: A Practical Course  …is a course book under development, used at several universities.  …is still regularly updated.  …will eventually be published by Cambridge University Press.  …greatly benefits from your suggestions for improvement! [e.g., typos, awkward sentences, new exercises, new applications, etc.]

Bayesian Cognitive Modeling: A Practical Course  …requires you to run computer code. Do not mindlessly copy-paste the code, but study it first, and try to discover why it does its job.  …did not print very well (i.e., the quality of some of the pictures is below par). You will receive a better version tomorrow!

WinBUGS: Bayesian inference Using Gibbs Sampling. You want to have this installed (plus the registration key).

WinBUGS  Knows many probability distributions (likelihoods);  Allows you to specify a model;  Allows you to specify priors;  Will then automatically run the MCMC sampling routines and produce output.

WinBUGS knows many statistical distributions (e.g., the binomial distribution, the Gaussian distribution, the Poisson distribution). These distributions form the elementary building blocks from which you may construct infinitely many models.
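As an illustration, the question-answering example from before needs only a few lines of model code. This is a sketch of the standard rate model; the variable names are our choice:

    model{
      theta ~ dbeta(1, 1)   # uniform prior on the rate theta
      k ~ dbin(theta, n)    # binomial likelihood: k correct out of n
    }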

WinBUGS & R  WinBUGS produces MCMC samples.  We want to analyze the output in a nice program, such as R or Matlab.  This can be accomplished using the R package “R2WinBUGS”, or the Matlab function “matbugs”.

R: “Here are the data and a bunch of commands” WinBUGS: “OK, I did what you wanted, here are the samples you asked for”

Matlab: “Here are the data and a bunch of commands” WinBUGS: “OK, I did what you wanted, here are the samples you asked for”
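In R, that conversation amounts to one call to bugs() from the R2WinBUGS package. A sketch, assuming the model above is saved as model.txt and WinBUGS sits in its default location:

    library(R2WinBUGS)

    data  <- list(k = 9, n = 10)
    inits <- function() list(theta = runif(1))  # random starting values

    samples <- bugs(data, inits, parameters.to.save = "theta",
                    model.file = "model.txt", n.chains = 3, n.iter = 3000,
                    bugs.directory = "C:/Program Files/WinBUGS14/")
    print(samples)   # posterior summaries for theta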

Getting Started  Work through some of the exercises of the book.  Most of you will want to get started with the chapter “getting started”.  For those of you who have worked with the book before, you can start wherever you want. Note that most early chapters have been restructured (and new content was added).

Running the R programs  The R scripts have extension .R. You can use “File” -> “Open Script” to read them.  You can run these scripts by copying and pasting them into the R console.
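Alternatively, a script can be run in one go with source(); the directory and file name below are only placeholders:

    setwd("E:/BayesCourse")   # e.g., your USB stick; this path is an assumption
    source("SomeScript.R")    # executes the whole script at once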

Saving Your Work  If you want to save your work, please do this on the USB stick!

WARNING  The first chapters are mostly about simple statistical models. This lays the groundwork for the later chapters on more complicated cognitive modeling.  The idea is that you have to walk before you can run.

Questions?  Feel free to ask questions when you are stuck.  Answers to the exercises for the first few chapters can be found at the end of the book!!

“Inside every non-Bayesian, there is a Bayesian struggling to get out.” (Dennis Lindley)