Reversible Jump Details
Brian S. Yandell
University of Wisconsin-Madison
www.stat.wisc.edu/~yandell/statgen
Stat 877, April 2008



reversible jump idea
- expand idea of MCMC to compare models
- adjust for parameters in different models
  – augment smaller model with innovations
  – constraints on larger model
- calculus "change of variables" is key
  – add or drop parameter(s)
  – carefully compute the Jacobian
- consider stepwise regression
  – Mallick (1995) & Green (1995)
  – efficient calculation with Householder decomposition

model selection in regression
- known regressors (e.g. markers)
  – models with 1 or 2 regressors
- jump between models
  – centering regressors simplifies calculations

April 2008Stat 877 © Brian S. Yandell4 slope estimate for 1 regressor recall least squares estimate of slope note relation of slope to correlation

April 2008Stat 877 © Brian S. Yandell5 2 correlated regressors slopes adjusted for other regressors

Gibbs Sampler for Model 1
- mean
- slope
- variance
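The slide lists the three full conditionals (mean, slope, variance), whose formulas were images lost in transcription. A minimal runnable sketch of such a sampler for y = mu + b1*x1 + noise, using illustrative priors (flat on the mean and slope, 1/s2 on the variance) that may differ from the lecture's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data for model 1; all names and settings are illustrative.
n = 100
x1 = rng.normal(size=n)
x1 -= x1.mean()                      # center the regressor, as the slides suggest
y = 2.0 + 1.5 * x1 + rng.normal(scale=0.5, size=n)

def gibbs_model1(y, x1, n_iter=2000):
    """Gibbs sampler for (mean, slope, variance) with flat priors on (mu, b1)
    and an improper 1/s2 prior on the variance (a common textbook choice)."""
    n = len(y)
    sxx = np.sum(x1 ** 2)            # x1 is centered, so sum(x1) = 0
    mu, b1, s2 = y.mean(), 0.0, 1.0
    draws = np.empty((n_iter, 3))
    for it in range(n_iter):
        # mean | slope, variance
        mu = rng.normal(np.mean(y - b1 * x1), np.sqrt(s2 / n))
        # slope | mean, variance
        b1 = rng.normal(np.sum(x1 * (y - mu)) / sxx, np.sqrt(s2 / sxx))
        # variance | mean, slope: inverse gamma, drawn via a scaled chi-square
        resid = y - mu - b1 * x1
        s2 = np.sum(resid ** 2) / rng.chisquare(n)
        draws[it] = mu, b1, s2
    return draws

draws = gibbs_model1(y, x1)
print(draws[500:].mean(axis=0))      # roughly (2.0, 1.5, 0.25) for this data
```

Each conditional is conjugate, so every sweep draws directly from a normal or a scaled inverse chi-square with no rejection step.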

Gibbs Sampler for Model 2
- mean
- slopes
- variance
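The model-2 full conditionals were also images. For centered regressors and a flat prior (an assumption; the lecture's priors may differ), each slope's conditional adjusts the response for the other slope:

```latex
\beta_1 \mid \beta_2,\mu,\sigma^2
  \;\sim\; N\!\left(
    \frac{\sum_i x_{i1}\,(y_i-\mu-\beta_2 x_{i2})}{\sum_i x_{i1}^{2}},
    \;\frac{\sigma^2}{\sum_i x_{i1}^{2}}
  \right)
```

with the symmetric expression for \beta_2; the mean and variance updates are as in model 1 with both slopes subtracted from y.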

updates from 2 -> 1
- drop 2nd regressor
- adjust other regressor
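The update formulas were images. One natural reconstruction, consistent with the omitted-variable identity (this particular form is an assumption, not read from the slide): dropping x_2 folds its contribution into the remaining slope through the regression of x_2 on x_1,

```latex
\beta_1^{*} = \beta_1 + c\,\beta_2,
\qquad
c = \frac{\sum_i x_{i1} x_{i2}}{\sum_i x_{i1}^{2}}
```

so the reduced model's slope absorbs the part of x_2's effect that x_1 can mimic.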

updates from 1 -> 2
- add 2nd slope, adjusting for collinearity
- adjust other slope & variance

model selection in regression
- known regressors (e.g. markers)
  – models with 1 or 2 regressors
- jump between models
  – augment with new innovation z
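A minimal runnable sketch of a reversible-jump step between the two regression models. Everything here is illustrative rather than the lecture's construction: the error variance is taken as known, both slopes get N(0, tau^2) priors, and the innovation z is proposed from the b2 prior, so the proposal density cancels the prior and the Jacobian is 1, leaving only a likelihood ratio in the acceptance probability.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data with two centered regressors; names and settings are illustrative.
n = 60
x1 = rng.normal(size=n); x1 -= x1.mean()
x2 = rng.normal(size=n); x2 -= x2.mean()
sigma, tau = 0.5, 1.0                    # known error sd; slope prior sd
y = 1.0 * x1 + 0.8 * x2 + rng.normal(scale=sigma, size=n)

def loglik(b1, b2):
    r = y - b1 * x1 - b2 * x2
    return -0.5 * np.sum(r ** 2) / sigma ** 2

def rj_step(model, b1, b2):
    """Jump between model 1 (slope b1 only) and model 2 (both slopes).
    z is proposed from the b2 prior N(0, tau^2), so proposal and prior cancel
    and the Jacobian of (b1, z) -> (b1, b2) is 1."""
    if model == 1:                        # birth move: 1 -> 2, set b2 = z
        z = rng.normal(scale=tau)
        if np.log(rng.uniform()) < loglik(b1, z) - loglik(b1, 0.0):
            return 2, b1, z
        return 1, b1, 0.0
    # death move: 2 -> 1, drop the 2nd slope
    if np.log(rng.uniform()) < loglik(b1, 0.0) - loglik(b1, b2):
        return 1, b1, 0.0
    return 2, b1, b2

def mh_slope(b, other, update_first):
    """Random-walk Metropolis update of one slope within the current model."""
    cand = b + rng.normal(scale=0.2)
    if update_first:
        ll_new, ll_old = loglik(cand, other), loglik(b, other)
    else:
        ll_new, ll_old = loglik(other, cand), loglik(other, b)
    log_a = ll_new - ll_old - 0.5 * (cand ** 2 - b ** 2) / tau ** 2
    return cand if np.log(rng.uniform()) < log_a else b

# Alternate jump moves with within-model slope updates; count model-2 sweeps.
model, b1, b2, visits2 = 1, 0.0, 0.0, 0
for _ in range(5000):
    model, b1, b2 = rj_step(model, b1, b2)
    b1 = mh_slope(b1, b2 if model == 2 else 0.0, True)
    if model == 2:
        b2 = mh_slope(b2, b1, False)
        visits2 += 1
print(f"fraction of sweeps in model 2: {visits2 / 5000:.3f}")
```

Because the data are generated with a real second slope, the chain should settle in model 2; the fraction of sweeps spent in each model estimates its posterior probability.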

change of variables
- change variables from model 1 to model 2
- calculus issues for integration
  – need to formally account for change of variables
  – infinitesimal steps in integration (db)
  – involves partial derivatives (next page)
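In symbols, moving from the model-1 parameters (\beta_1, z) to the model-2 parameters (\beta_1', \beta_2') invokes the multivariate change-of-variables rule (a generic statement of the rule the slide refers to):

```latex
\int f(\beta_1',\beta_2')\,d\beta_1'\,d\beta_2'
  = \int f\!\big(\beta_1'(\beta_1,z),\,\beta_2'(\beta_1,z)\big)\,
    \left|\frac{\partial(\beta_1',\beta_2')}{\partial(\beta_1,z)}\right|
    d\beta_1\,dz
```

The determinant factor is the Jacobian discussed on the next slide; omitting it gives the wrong acceptance probability.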

Jacobian & the calculus
- Jacobian sorts out change of variables
  – careful: easy to mess up here!
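As a worked example (the map itself is an assumption, chosen to match the collinearity adjustment in the earlier 1 -> 2 slide): if the birth move sets \beta_2' = z and \beta_1' = \beta_1 - c\,z with c = \sum_i x_{i1}x_{i2} / \sum_i x_{i1}^2, then

```latex
\left|\frac{\partial(\beta_1',\beta_2')}{\partial(\beta_1,z)}\right|
  = \left|\det\begin{pmatrix} 1 & -c \\ 0 & 1 \end{pmatrix}\right| = 1
```

so for this linear, triangular map the change of variables contributes no extra factor; less convenient maps do not simplify this way.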

geometry of reversible jump
[figure lost in transcription; axes a1 and a2]

QT additive reversible jump
[figure lost in transcription; axes a1 and a2]

credible set for additive
- 90% & 95% sets based on normal
- regression line corresponds to slope of updates
[figure lost in transcription; axes a1 and a2]

multivariate updating of effects
- more computations when m > 2
- avoid matrix inverse
  – Cholesky decomposition of matrix
- simultaneous updates
  – effects at all loci
- accept new locus based on
  – sampled new genotypes at locus
  – sampled new effects at all loci
- also long-range position updates
[figure lost in transcription: effects before and after updates]
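The "avoid matrix inverse" point can be sketched as a simultaneous draw of all m effects from their joint normal full conditional using a Cholesky factor and triangular solves instead of an explicit inverse. The normal prior and all settings below are illustrative, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_effects(X, y, sigma2, tau2, rng):
    """Draw all effects at once from beta | y ~ N(A^{-1} X'y / sigma2, A^{-1})
    with A = X'X/sigma2 + I/tau2, i.e. independent N(0, tau2) effect priors.
    Uses a Cholesky factor of A rather than forming A^{-1}."""
    m = X.shape[1]
    A = X.T @ X / sigma2 + np.eye(m) / tau2
    L = np.linalg.cholesky(A)                       # A = L L'
    # posterior mean: solve A mean = X'y / sigma2 via two triangular solves
    w = np.linalg.solve(L, X.T @ y / sigma2)
    mean = np.linalg.solve(L.T, w)
    # draw: mean + L'^{-1} z has covariance (L L')^{-1} = A^{-1}
    z = rng.normal(size=m)
    return mean + np.linalg.solve(L.T, z)

# quick demo with m = 5 regressors standing in for effects at 5 loci
n, m = 200, 5
X = rng.normal(size=(n, m))
beta = np.arange(1, m + 1, dtype=float)
y = X @ beta + rng.normal(scale=0.5, size=n)
draws = np.array([sample_effects(X, y, 0.25, 100.0, rng) for _ in range(200)])
print(np.round(draws.mean(axis=0), 1))
```

Updating all effects jointly this way sidesteps the slow mixing that one-at-a-time updates suffer when loci are correlated.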

References
- Satagopan, Yandell (1996); Heath (1997); Sillanpää, Arjas (1998); Stephens, Fisch (1998)
- Green (1995); Richardson, Green (1997); Green (2003, 2004)