Bayesian Analysis for Extreme Events Pao-Shin Chu and Xin Zhao Department of Meteorology School of Ocean & Earth Science & Technology University of Hawaii-Manoa

Why Bayesian inference? A rigorous way to make probability statements about the parameters of interest. An ability to update these statements as new information is received. Recognition that parameters change over time rather than being forever fixed.

An efficient way to provide a coherent and rational framework for reducing uncertainties by incorporating diverse information sources (e.g., subjective beliefs, historical records, model simulations). Examples: annual rates of US hurricanes (Elsner and Bossak, 2002); uncertainty modeling and learning from data (Berliner, 2003).

Some applications of Bayesian analysis for climate research
Change-point analysis for extreme events (e.g., tropical cyclones, heavy rainfall, summer heat waves). Why change-point analysis?
Tropical cyclone prediction (Chu and Zhao, 2007, J. Climate; Lu, Chu, and Chen, 2010, Weather & Forecasting, accepted)
Clustering of typhoon tracks in the WNP (Chu et al., 2010, "Regional typhoon activity as revealed by track patterns and climate change," in Hurricanes and Climate Change, Elsner et al., Eds., Springer, in press)

Other Examples Predicting climate variations (e.g., ENSO) Quantifying uncertainties in projections of future climate change

Change-point analysis for tropical cyclones Given the Poisson intensity parameter λ (i.e., the mean seasonal TC rate), the probability mass function (PMF) of h tropical cyclones occurring in T years is

P(h | λ, T) = e^{-λT} (λT)^h / h!,  where h = 0, 1, 2, … and λ > 0, T > 0.

The λ is regarded as a random variable, not a constant.
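As a quick numerical illustration of this Poisson PMF, the sketch below evaluates it for a hypothetical seasonal rate and record length (the values λ = 0.6/yr and T = 10 yr are assumed, not from the text):

```python
import math

def poisson_pmf(h, lam, T):
    """P(h cyclones in T years) for a seasonal rate lam, via the log form
    exp(-lam*T + h*log(lam*T) - log(h!)) to avoid overflow for large h."""
    m = lam * T
    return math.exp(-m + h * math.log(m) - math.lgamma(h + 1))

# hypothetical: 0.6 major hurricanes per year over a 10-year record
pmf = [poisson_pmf(h, 0.6, 10) for h in range(100)]
```

Summing the PMF over a wide range of h recovers 1, a basic sanity check on the formula.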

The gamma density is known as the conjugate prior (and posterior) for λ. A convenient functional choice for λ is therefore a gamma distribution

f(λ | h′, T′) = T′^{h′} λ^{h′-1} e^{-T′λ} / Γ(h′),  where λ > 0, h′ > 0, T′ > 0,

and h′ and T′ are the prior parameters.
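Conjugacy means the posterior for λ after observing h cyclones in T years is again gamma, with parameters h′ + h and T′ + T. A minimal sketch verifying this numerically (the prior parameters and the observed count below are hypothetical):

```python
import math

def gamma_pdf(lam, shape, rate):
    """Gamma(shape, rate) density, written via lgamma for numerical stability."""
    return math.exp(shape * math.log(rate) + (shape - 1) * math.log(lam)
                    - rate * lam - math.lgamma(shape))

# hypothetical prior Gamma(h'=2, T'=1); observe h=9 cyclones in T=3 years
h_p, T_p, h, T = 2.0, 1.0, 9, 3.0
dx = 0.001
grid = [i * dx for i in range(1, 20000)]

# brute force: Poisson likelihood x gamma prior, renormalized on the grid
unnorm = [math.exp(-l * T) * (l * T) ** h / math.factorial(h) * gamma_pdf(l, h_p, T_p)
          for l in grid]
Z = sum(unnorm) * dx
numeric = [u / Z for u in unnorm]

# conjugacy: the posterior should be Gamma(h' + h, T' + T)
closed = [gamma_pdf(l, h_p + h, T_p + T) for l in grid]
```

The grid-based posterior and the closed-form gamma agree pointwise, which is exactly what the conjugate update claims.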

The probability of h tropical cyclones in T years, when the Poisson intensity λ is codified as a gamma density with prior parameters T′ and h′, is a negative binomial distribution (Epstein, 1985):

P(h | T, h′, T′) = [Γ(h′ + h) / (Γ(h′) h!)] · [T′/(T′ + T)]^{h′} · [T/(T′ + T)]^h.
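This predictive distribution can be checked by directly integrating the Poisson PMF against the gamma prior on λ; the sketch below does so on a grid (the prior parameters and record length are assumed values, not from the text):

```python
import math

def neg_binom_pmf(h, h_p, T_p, T):
    """Closed-form negative binomial predictive with prior parameters h', T'."""
    return math.exp(math.lgamma(h_p + h) - math.lgamma(h_p) - math.lgamma(h + 1)
                    + h_p * math.log(T_p / (T_p + T)) + h * math.log(T / (T_p + T)))

def mixture_pmf(h, h_p, T_p, T, dx=0.001, upper=30.0):
    """Brute force: integrate Poisson(h | lam*T) against the Gamma(h', T') prior."""
    total, lam = 0.0, dx
    while lam < upper:
        poisson = math.exp(-lam * T + h * math.log(lam * T) - math.lgamma(h + 1))
        prior = math.exp(h_p * math.log(T_p) + (h_p - 1) * math.log(lam)
                         - T_p * lam - math.lgamma(h_p))
        total += poisson * prior * dx
        lam += dx
    return total

h_p, T_p, T = 2.2, 1.5, 4.0   # hypothetical prior parameters and record length
closed = [neg_binom_pmf(h, h_p, T_p, T) for h in range(6)]
numeric = [mixture_pmf(h, h_p, T_p, T) for h in range(6)]
```

The two columns agree, confirming that mixing the Poisson over a gamma-distributed rate yields the stated negative binomial.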

A hierarchical Bayesian tropical cyclone model adapted from Elsner and Jagger (2004)

Hypothesis model for change-point analysis (consider three hypotheses): H0 (no change point), H1 (one change point), H2 (two change points)

Markov chain Monte Carlo (MCMC) approach Standard Monte Carlo methods produce a set of independent simulated values according to some probability distribution. MCMC methods instead produce chains in which each simulated value is mildly dependent on the preceding value. The basic principle is that once the chain has run sufficiently long, it will find its way to the desired posterior distribution. One of the most widely used MCMC algorithms is the Gibbs sampler. The idea is that if each of the coefficients to be estimated can be expressed as conditioned on all of the others, then by cycling through these conditional statements we can eventually reach the true joint distribution of interest.

Gibbs sampler: let θ = [θ1, θ2, …, θp]. We generate a value from the conditional distribution of one component of θ given the values of all the other components; that is, we successively draw from the conditional posterior densities P(θk | h, θ1, …, θk−1, θk+1, …, θp) for k from 1 to p.
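The cycling scheme above can be sketched for a one-change-point Poisson count model with gamma priors, where every full conditional is available in closed form (the rates and the gamma rate distribution conditionals are a sketch: the synthetic counts, the prior values h′ = 0.5, T′ = 0.01, and the change at year 25 are all hypothetical):

```python
import numpy as np

rng = np.random.default_rng(42)

# hypothetical annual counts: rate 2/yr for 25 years, then 6/yr for 25 years
n, true_tau = 50, 25
counts = np.concatenate([rng.poisson(2.0, true_tau), rng.poisson(6.0, n - true_tau)])

h_prior, T_prior = 0.5, 0.01   # vague Gamma(h', T') prior (assumed values)

def gibbs(counts, iters=2000, burn=500):
    n = len(counts)
    csum = np.cumsum(counts)        # csum[t-1] = total count of the first t years
    total = csum[-1]
    t_idx = np.arange(1, n)         # candidate change points 1 .. n-1
    tau = n // 2
    out = []
    for it in range(iters):
        # conjugate gamma updates for the rate in each epoch (numpy uses scale = 1/rate)
        lam1 = rng.gamma(h_prior + csum[tau - 1], 1.0 / (T_prior + tau))
        lam2 = rng.gamma(h_prior + total - csum[tau - 1], 1.0 / (T_prior + n - tau))
        # discrete full conditional for the change point, computed on the log scale
        logp = (csum[:-1] * np.log(lam1) - lam1 * t_idx
                + (total - csum[:-1]) * np.log(lam2) - lam2 * (n - t_idx))
        p = np.exp(logp - logp.max())
        tau = rng.choice(t_idx, p=p / p.sum())
        if it >= burn:
            out.append((lam1, lam2, tau))
    return np.array(out)

samples = gibbs(counts)
```

Each sweep draws the two rates from their gamma conditionals and then the change point from its discrete conditional, exactly the "cycle through the conditionals" recipe described above; the post-burn-in samples approximate the joint posterior.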

Bayesian inference under each hypothesis

With the priors specified, we can apply the Gibbs sampler to draw samples from the posterior distribution of the model parameters under each respective hypothesis.

Hypothesis analysis: under a uniform prior over the hypothesis space, the posterior probability of each hypothesis is proportional to its marginal likelihood.
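For the Poisson-gamma model the marginal likelihood of each segment is available in closed form, so a uniform-prior comparison of hypotheses can be sketched directly; the counts and prior values below are hypothetical, and only H0 (no change) versus H1 (one change point) are compared:

```python
import math

def log_marginal(S, n, h_p=0.5, T_p=0.01):
    """Log marginal likelihood of a segment with total count S over n years,
    rate integrated out against a Gamma(h', T') prior. The 1/h! terms are
    dropped: they are identical under every segmentation, so they cancel."""
    return (h_p * math.log(T_p) - math.lgamma(h_p)
            + math.lgamma(h_p + S) - (h_p + S) * math.log(T_p + n))

counts = [1, 2, 1, 0, 2, 1, 2, 1, 5, 7, 6, 5, 8, 6, 7, 5]  # hypothetical series
n = len(counts)

# H0: a single rate for all years
logm_H0 = log_marginal(sum(counts), n)

# H1: one change point, uniform over its location (log-sum-exp for stability)
terms = [log_marginal(sum(counts[:t]), t) + log_marginal(sum(counts[t:]), n - t)
         for t in range(1, n)]
m = max(terms)
logm_H1 = m + math.log(sum(math.exp(x - m) for x in terms)) - math.log(n - 1)

# posterior probabilities under a uniform prior on {H0, H1}
b = max(logm_H0, logm_H1)
Z = math.exp(logm_H0 - b) + math.exp(logm_H1 - b)
pH0, pH1 = math.exp(logm_H0 - b) / Z, math.exp(logm_H1 - b) / Z
```

Because the series shifts sharply halfway through, H1 receives essentially all the posterior mass; extending the comparison to H2 just adds a second change-point sum.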

Annual major hurricane count series for the eastern North Pacific (ENP): comparing the posterior probabilities P(H0|h), P(H1|h), and P(H2|h) selects H2, with change points at 1982 and 1999, giving 3 epochs.

Why RJMCMC? Because the parameter spaces of different hypotheses are typically different from each other, a simulation has to be run independently for each candidate hypothesis. If the hypothesis space has large dimension, this MCMC approach is not efficient (Green, 1995).

Reversible jump sampling for moving between spaces of differing dimensions: a trans-dimensional Markov chain simulation in which the dimension of the parameter space can change from one iteration to the next. Useful for model or hypothesis selection problems.

4 different gamma models (4 epochs with 3 change points)

4 different gamma models (3 change points)

Prior specification

Extreme rainfall events in Hawaii

Summary
Why Bayesian analysis
Applications for climate research (extreme events and climate change)
Change-point analysis
Mathematical model of rare event count series
Hypothesis model
Bayesian inference under each hypothesis
Major hurricane series in the eastern North Pacific
Recent advance: RJMCMC