Statistical Genomics Zhiwu Zhang Washington State University Lecture 27: Bayesian theorem.


Administration
- Homework 6 (last) posted, due Friday, April 29, 3:10 PM
- Final exam: May 3, 120 minutes (3:10-5:10 PM), 50
- Evaluation due May 6 (7 out of 19 received)

Outline  Concept development for genomic selection  Bayesian theorem  Bayesian transformation  Bayesian likelihood  Bayesian alphabet for genomic selection

All SNPs have the same distribution
y = x1g1 + x2g2 + … + xpgp + e
rrBLUP: b ~ N(0, I σg²)
gBLUP: u ~ N(0, K σa²)

Selection of priors
Distributions of gi:
- Flat prior → LSE, solve likelihood solely
- Identical normal, N(0, I σg²) → RR, solve REML by EMMA

More realistic: each SNP effect has its own distribution
y = x1g1 + x2g2 + … + xpgp + e
g1 ~ N(0, I σg1²), g2 ~ N(0, I σg2²), …, gp ~ N(0, I σgp²)
Out of control and overfitting?

Need help from Thomas Bayes
"An Essay towards solving a Problem in the Doctrine of Chances", read to the Royal Society in 1763, after Bayes' death, by Richard Price.

An example from middle school
A school has 60% boys and 40% girls. All boys wear pants; half the girls wear pants and half wear skirts. What is the probability of meeting a student wearing pants?
P(Pants) = 60% × 100% + 40% × 50% = 80%

Probability
P(Pants) = P(Boy) P(Pants | Boy) + P(Girl) P(Pants | Girl) = 60% × 100% + 40% × 50% = 80%
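The law of total probability on this slide can be checked with a short R sketch (the variable names are illustrative, not from the slides):

```r
# Law of total probability: P(Pants) = P(Boy)P(Pants|Boy) + P(Girl)P(Pants|Girl)
p_boy  <- 0.6                 # P(Boy)
p_girl <- 0.4                 # P(Girl)
p_pants_given_boy  <- 1.0     # all boys wear pants
p_pants_given_girl <- 0.5     # half the girls wear pants
p_pants <- p_boy * p_pants_given_boy + p_girl * p_pants_given_girl
p_pants                       # 0.8
```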

Inverse question
A school has 60% boys and 40% girls. All boys wear pants; half the girls wear pants and half wear skirts. You meet a student wearing pants. What is the probability that the student is a boy?
P(Boy | Pants) = 60% × 100% / (60% × 100% + 40% × 50%) = 75%

P(Boy | Pants) = P(Pants | Boy) P(Boy) / P(Pants)
= P(Pants | Boy) P(Boy) / [P(Pants | Boy) P(Boy) + P(Pants | Girl) P(Girl)]
= 60% × 100% / (60% × 100% + 40% × 50%) = 75%

Bayesian theorem
P(Boy | Pants) P(Pants) = P(Pants | Boy) P(Boy)
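The inversion can also be verified numerically; a minimal R sketch (variable names are illustrative, not from the slides):

```r
# Bayes' rule: P(Boy | Pants) = P(Pants | Boy) P(Boy) / P(Pants)
p_boy  <- 0.6
p_girl <- 0.4
p_pants_given_boy  <- 1.0
p_pants_given_girl <- 0.5
p_pants <- p_pants_given_boy * p_boy + p_pants_given_girl * p_girl
p_boy_given_pants <- p_pants_given_boy * p_boy / p_pants
p_boy_given_pants             # 0.75
```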

Bayesian transformation
P(Boy | Pants) ∝ P(Pants | Boy) P(Boy)
y (data), θ (parameters):
P(θ | y) ∝ P(y | θ) P(θ)
P(y | θ): likelihood of data given parameters
P(θ): distribution of parameters (prior)
P(θ | y): posterior distribution of θ given y

Bayesian for hard problem
A public school contains 60% males and 40% females. What is the probability of drawing four males? -- Probability (0.6⁴ ≈ 13%)
Four males were drawn from a public school. What are the gender proportions? -- Inverse probability (?)

Prior knowledge
Four males were drawn from a public school. What are the gender proportions? -- Inverse probability (?)
[Prior over the gender distribution, ranging from 100% male to 100% female: the extremes are unlikely or rejected; proportions near the middle are likely; when unsure, a wide prior is the safe choice]

Transform hard problem to easy one
P(G | y): probability of unknown given data (hard to solve)
P(y | G): probability of observed given unknown (easy to solve)
P(G): prior knowledge of unknown (freedom)

P(y|G)
p = seq(0, 1, .01)             # candidate proportions of males
n = 4                          # number of students drawn
k = n                          # all four were male
pyp = dbinom(k, n, p)          # likelihood P(y | G) for each candidate p
theMax = pyp == max(pyp)
pMax = p[theMax]               # maximum likelihood estimate (p = 1)
plot(p, pyp, type="b", main=paste("Data=", pMax, sep=""))

P(G)
ps = p*10 - 5                  # rescale p from [0, 1] to [-5, 5]
pd = dnorm(ps)                 # normal prior, centered at p = 0.5
theMax = pd == max(pd)
pMax = p[theMax]               # prior mode (p = 0.5)
plot(p, pd, type="b", main=paste("Prior=", pMax, sep=""))

P(y|G) P(G)
ppy = pd*pyp                   # posterior ∝ likelihood × prior
theMax = ppy == max(ppy)
pMax = p[theMax]               # posterior mode
plot(p, ppy, type="b", main=paste("Optimum=", pMax, sep=""))

Depends on what you believe

What if all ten draws are males?

Control of unknown parameters
y = x1g1 + x2g2 + … + xpgp + e
g1 ~ N(0, I σg1²), g2 ~ N(0, I σg2²), …, gp ~ N(0, I σgp²)
Prior distribution

Selection of priors
Prior distributions of gi: flat, identical normal (RR), or others (Bayes)

One choice is the inverse chi-square
y = x1g1 + x2g2 + … + xpgp + e
g1 ~ N(0, I σg1²), g2 ~ N(0, I σg2²), …, gp ~ N(0, I σgp²)
σgi² ~ χ⁻²(v, S), with hyperparameters v and S
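Drawing per-SNP variances from this prior can be sketched in R, using the fact that if X ~ χ²(v) then vS/X follows a scaled inverse chi-square with degrees of freedom v and scale S (the values of v, S, and m below are illustrative, not from the slides):

```r
# Scaled inverse chi-square draws for per-SNP variances
set.seed(99164)
v <- 4.2                       # degrees of freedom (illustrative)
S <- 0.02                      # scale (illustrative)
m <- 5                         # number of SNPs
sigma2 <- v * S / rchisq(m, df = v)
sigma2                         # m positive variance draws
```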

Bayesian likelihood
P(gi, σgi², σe², v, S | y) ∝ P(y | gi, σgi², σe², v, S) P(gi, σgi², σe², v, S)

Variation of assumption
Bayes A: σgi² > 0 for all i, σgi² ~ χ⁻²(v, S)
Bayes B: σgi² ~ χ⁻²(v, S) with probability 1-π; σgi² = 0 with probability π
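The Bayes B mixture can be sketched as a prior draw in R: a spike at zero with probability π and a scaled inverse chi-square slab otherwise (π, v, S, and m are illustrative values, not from the slides):

```r
# Spike-and-slab variance prior in the style of Bayes B
set.seed(2016)
pi0 <- 0.95                    # probability a SNP has zero variance (illustrative)
v <- 4.2; S <- 0.02            # slab hyperparameters (illustrative)
m <- 1000                      # number of SNPs
nonzero <- rbinom(m, 1, 1 - pi0)            # which SNPs get a nonzero variance
sigma2 <- nonzero * (v * S / rchisq(m, df = v))
mean(sigma2 == 0)              # roughly pi0
```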

Bayes alphabet

Method             | Marker effect | Genomic effect variance       | Residual variance | Unknown parameters
Bayes A            | All SNPs      | χ⁻²(v, S)                     | χ⁻²(0, -2)        |
Bayes B            | P(1-π)        | χ⁻²(v, S)                     | χ⁻²(0, -2)        |
Bayes Cπ           | P(1-π)        | χ⁻²(v, S')                    | χ⁻²(0, -2)        | π
Bayes Dπ           | P(1-π)        | χ⁻²(v, S)                     | χ⁻²(0, -2)        | S, π
Bayesian LASSO     | P(1-π)        | Double exponential effects    |                   | λ, t
BayesMulti, BayesR | P(1-π)        | Multiple normal distributions |                   | γ

Highlight
- Concept development for genomic selection
- Bayesian theorem
- Bayesian transformation
- Bayesian likelihood
- Bayesian alphabet for genomic selection