
JAGS

Learning Objectives
- Be able to represent ecological systems as a network of knowns and unknowns linked by deterministic and stochastic relationships.
- Understand the basis for factoring simple Bayesian models using the laws of probability.
- Be able to derive conditional probabilities from networks (DAGs).
- Use conditional probabilities as the basis for writing JAGS code.

Exercise: write a Bayesian model for the light limitation of tree growth, where γ = maximum growth rate at high light, c = minimum light requirement, and α = slope of the curve at low light.
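One way to sketch this model (assuming the saturating light-limitation curve used in the JAGS code later in these slides, with light level L_i and observed growth y_i) is:

```latex
% Deterministic (scientific) model and likelihood:
%   maximum growth gamma as L -> infinity; mu = 0 at L = c;
%   slope alpha at L = c.
\mu_i = \frac{\gamma\,(L_i - c)}{\gamma/\alpha + L_i - c}, \qquad
y_i \sim \operatorname{normal}(\mu_i,\; \sigma^2)

% Posterior proportional to likelihood times priors:
[\gamma, \alpha, c, \sigma^2 \mid \mathbf{y}] \;\propto\;
\prod_{i=1}^{n} [\,y_i \mid \mu_i, \sigma^2\,]\,
[\gamma]\,[\alpha]\,[c]\,[\sigma^2]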

Bayesian Networks The heads of the arrows identify variables on the left-hand side of the conditioning in the joint distribution; the tails identify variables on the right-hand side. Anything without an arrow leading to it is either known as data or "known" as a prior with numeric arguments. If the prior is uninformative, these quantities are "known unknowns". Bayesian networks are also called directed acyclic graphs (DAGs) and Bayesian belief networks.
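As an illustration of this factoring rule (using a hypothetical three-node network, not a figure from the slides): if arrows run from a and b into z, and from z into y, then each node conditions on its parents, and parentless nodes receive priors:

```latex
\Pr(y, z, a, b) \;=\; \Pr(y \mid z)\,\Pr(z \mid a, b)\,\Pr(a)\,\Pr(b)
```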

(Slide figure: the DAG annotated with its process model and parameter model components.)

BUGS [GUI interface]: Bayesian inference Using Gibbs Sampling
- WinBUGS (Windows)
- OpenBUGS (cross-platform)
- GeoBUGS
- R interfaces: R2WinBUGS, R2OpenBUGS, others…

JAGS [command line]: Just Another Gibbs Sampler
- JAGS (cross-platform)
- R interfaces: rjags, R2jags, others…

Both the BUGS and JAGS model languages are declarative; the R code that drives them is imperative.
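A minimal sketch of what "declarative" means here (a toy model, not one from these slides): in the BUGS/JAGS language, statements declare relationships among nodes rather than execute in sequence, so the model below is valid even though the prior for mu appears after its use in the likelihood:

```
model{
  y ~ dnorm(mu, tau)        # likelihood uses mu and tau...
  mu ~ dnorm(0, 0.001)      # ...which are declared afterwards
  tau ~ dgamma(0.001, 0.001) # statement order does not matter
}
```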

From R (imperative):
1. Specify initial conditions
2. Specify data
3. Specify model
4. Specify run conditions
5. Call JAGS
6. Manipulate JAGS output (rjags, coda, others)

From JAGS (declarative):
1. Choose algorithm
2. Generate chains
3. Send chain output back to R

tree.data <- read.csv("Hemlock-light-data.csv")

# Specify initial conditions; must be a function returning a list.
mod.inits <- function(){
  list(a=rnorm(1,40,2), b=runif(1,1.5,2.5), c=runif(1,-5,5), tau=0.001)
}

# Specify data; must be a list.
data = list(
  n = nrow(tree.data),
  x = as.numeric(tree.data$Light),
  y = as.numeric(tree.data$Observed.growth.rate)
)

# JAGS model
model{
  for (i in 1:n) {
    # prediction / scientific model: saturating light-limitation curve
    # a = max. growth rate, b = slope at low light, c = min. light requirement
    mu[i] <- a*(x[i]-c)/((a/b)+x[i]-c)
    y[i] ~ dnorm(mu[i], tau)  # likelihood
  }
  # Prior distributions
  a ~ dgamma(0.001, 0.001)
  b ~ dgamma(0.001, 0.001)
  c ~ dunif(-10, 10)
  tau ~ dgamma(0.001, 0.001)
  sigma <- 1/sqrt(tau)
} # end of model

# The JAGS model can be embedded in R code or saved as a text file and called from R.

# Set run conditions: number of iterations for adaptation and sampling, number of chains, etc.
n.adapt = 500
n.update = 1000
n.iter = 3000

# Call to JAGS
jm = jags.model("jags_light_example.R", data=data, inits=mod.inits, n.chains=3, n.adapt=n.adapt)

# Burn in the chains (we normally throw these iterations out)
update(jm, n.iter=n.update)

# Generate coda and jags objects for the parameters to monitor and the deviance.
# (Monitoring "deviance" requires load.module("dic") before jags.model.)
zm <- coda.samples(jm, variable.names=c("a","b","c","mu","deviance"), n.iter=n.iter, n.thin=10)
zj <- jags.samples(jm, variable.names=c("a","b","c","mu","deviance"), n.iter=n.iter, n.thin=10)

# Plot parameters
plot(zm, ask = dev.interactive())
xyplot(zm, ask = dev.interactive())
densityplot(zm, ask = dev.interactive())

# Plot predictions
b = summary(zj$mu, quantile, c(.025, .5, .975))$stat
plot(tree.data$Light, tree.data$Observed.growth.rate, xlab="Light",
     ylab="Growth Rate", pch=16, col="blue")
lines(tree.data$Light, b[2,])
lines(tree.data$Light, b[1,], lty="dashed")
lines(tree.data$Light, b[3,], lty="dashed")

# Convergence diagnostics
rejectionRate(zm) # zero for conjugate (Gibbs) sampling
gelman.diag(zm)   # variance within vs. between chains; stable = 1, want r < 1.2
heidel.diag(zm)   # stationarity test; requires convergence
raftery.diag(zm)  # how many iterations you need for convergence

The gang: Kiona Ogle, Mevin Hooten, Tom Hobbs