Lecture 26: Bayesian theory


Statistical Genomics, Lecture 26: Bayesian theory. Zhiwu Zhang, Washington State University.

Outline
- Concept development for genomic selection
- Bayesian theorem
- Bayesian transformation
- Bayesian likelihood
- Bayesian alphabet for genomic selection

All SNPs have the same distribution
Model: y = x1g1 + x2g2 + … + xpgp + e
rrBLUP: gi ~ N(0, I σg²)
gBLUP: u ~ N(0, K σa²)

Selection of priors for the distributions of gi:
- Flat prior: least squares estimation (LSE), which maximizes the likelihood solely
- Identical normal prior: ridge regression (RR), with variance components solved by REML via EMMA

More realistic: give each SNP its own variance
y = x1g1 + x2g2 + … + xpgp + e, with g1 ~ N(0, I σg1²), g2 ~ N(0, I σg2²), …, gp ~ N(0, I σgp²)
But does this get out of control and overfit?

We need help from Thomas Bayes, whose "An Essay towards solving a Problem in the Doctrine of Chances" was read to the Royal Society in 1763, after Bayes' death, by Richard Price.

An example from middle school
A school has 60% boys and 40% girls. All boys wear pants; half the girls wear pants and half wear skirts. What is the probability of meeting a student wearing pants?
P(Pants) = 60% × 100% + 40% × 50% = 80%

Total probability
P(Pants) = P(Boy) P(Pants | Boy) + P(Girl) P(Pants | Girl) = 60% × 100% + 40% × 50% = 80%
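The slide's arithmetic can be checked directly. A minimal Python sketch (the variable names are mine, not from the slides):

```python
# Total probability of pants: weight each gender's pants rate by its share
p_boy, p_girl = 0.60, 0.40
p_pants_given_boy, p_pants_given_girl = 1.00, 0.50

p_pants = p_boy * p_pants_given_boy + p_girl * p_pants_given_girl
print(p_pants)  # ≈ 0.8
```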

Inverse question
Same school: 60% boys and 40% girls; all boys wear pants; half the girls wear pants and half wear skirts. You meet a student wearing pants. What is the probability that the student is a boy?
P(Boy | Pants) = (60% × 100%) / (60% × 100% + 40% × 50%) = 75%

P(Boy | Pants) = P(Pants | Boy) P(Boy) / [P(Pants | Boy) P(Boy) + P(Girl) P(Pants | Girl)]
             = P(Pants | Boy) P(Boy) / P(Pants)
             = (60% × 100%) / 80% = 75%
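The inversion itself is one division. A minimal sketch of the slide's computation (variable names are mine):

```python
# Bayes' rule: invert P(Pants | Boy) into P(Boy | Pants)
p_boy, p_girl = 0.60, 0.40
p_pants_given_boy, p_pants_given_girl = 1.00, 0.50

# Denominator: total probability of pants
p_pants = p_boy * p_pants_given_boy + p_girl * p_pants_given_girl

p_boy_given_pants = p_pants_given_boy * p_boy / p_pants
print(p_boy_given_pants)  # ≈ 0.75
```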

Bayesian theorem
P(Boy | Pants) P(Pants) = P(Pants | Boy) P(Boy)
In general, for parameters θ and data y: P(θ | y) P(y) = P(y | θ) P(θ), where P(y) is a constant.

Bayesian transformation
P(θ | y) ∝ P(y | θ) P(θ)
The posterior distribution of θ given y is proportional to the likelihood of the data given the parameters, P(y | θ), times the prior distribution of the parameters, P(θ); just as P(Boy | Pants) ∝ P(Pants | Boy) P(Boy).

Bayesian for a hard problem
A public school contains 60% males and 40% females. What is the probability of drawing four males? -- Probability: 0.6⁴ = 12.96%
Four males were drawn from a public school. What is the male proportion? -- Inverse probability: ?
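The forward (easy) direction is just independent draws; a one-line check:

```python
# Forward probability: four independent draws, each male with probability 0.6
p_male = 0.6
p_four_males = p_male ** 4
print(round(p_four_males, 4))  # ≈ 0.1296, i.e. 12.96%
```

The inverse direction, estimating the proportion from the draws, is what the rest of the lecture sets up.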

Prior knowledge
Four males were drawn from a public school. What is the male proportion? -- Inverse probability: ?
We have prior knowledge about the gender distribution, ranging from 100% female to 100% male: proportions near one half are likely (a balanced public school is a safe assumption), extreme proportions are unlikely, 100% male or 100% female can be rejected, and in between we are unsure.

Transform the hard problem into an easy one
P(G | y) ∝ P(y | G) P(G)
P(G | y): probability of the unknown given the data (hard to solve)
P(y | G): probability of the observed data given the unknown (easy to solve)
P(G): prior knowledge of the unknown (our freedom)

P(y | G): probability of having 4 males given the male proportion
p=seq(0, 1, .01)   # grid of candidate male proportions
n=4                # number of draws
k=n                # all draws are male
pyp=dbinom(k,n,p)  # binomial likelihood at each grid point
theMax=pyp==max(pyp)
pMax=p[theMax]     # proportion maximizing the likelihood
plot(p,pyp,type="b",main=paste("Data=", pMax,sep=""))

P(G): prior probability of the male proportion
ps=p*10-5          # rescale p in [0,1] to [-5,5]
pd=dnorm(ps)       # standard normal prior, centered at p=0.5
theMax=pd==max(pd)
pMax=p[theMax]     # prior mode
plot(p,pd,type="b",main=paste("Prior=", pMax,sep=""))

P(G | y) ∝ P(y | G) P(G): probability of the male proportion given 4 males drawn
ppy=pd*pyp         # unnormalized posterior: prior times likelihood
theMax=ppy==max(ppy)
pMax=p[theMax]     # posterior mode
plot(p,ppy,type="b",main=paste("Optimum=", pMax,sep=""))
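The slides do this in R; the same grid computation can be cross-checked in Python with only the standard library (variable names mirror the R code):

```python
import math

# Grid of candidate male proportions, mirroring seq(0, 1, .01)
p = [i / 100 for i in range(101)]
n = k = 4  # four draws, all male

# Likelihood P(y|G): binomial probability of k males in n draws
pyp = [math.comb(n, k) * q**k * (1 - q)**(n - k) for q in p]

# Prior P(G): standard normal density on the rescaled grid 10p - 5,
# i.e. centered at p = 0.5 (Male = Female)
pd = [math.exp(-(10 * q - 5)**2 / 2) / math.sqrt(2 * math.pi) for q in p]

# Unnormalized posterior P(G|y) ∝ P(y|G) P(G), and its mode
ppy = [lik * pri for lik, pri in zip(pyp, pd)]
p_max = p[ppy.index(max(ppy))]
print(p_max)  # 0.57
```

The posterior mode lands between the likelihood peak (p = 1) and the prior mode (p = 0.5): the data pull the estimate up, the prior pulls it back.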

It depends on what you believe: the posterior differs under a Male = Female prior versus a "more male" prior.

If all ten draws are males: under the "more male" prior the posterior indicates much more male, vs. 57% under the Male = Female prior.
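With more data, the likelihood overwhelms the prior. The exact priors behind the slide's figure are not specified; assuming the same standard-normal prior on 10p - 5 from the R code, the ten-male posterior mode can be sketched as:

```python
import math

# Same grid posterior as before, now with ten draws, all male
p = [i / 100 for i in range(101)]
n = k = 10

pyp = [q**k for q in p]                           # likelihood: all n draws male (p^10)
pd = [math.exp(-(10 * q - 5)**2 / 2) for q in p]  # prior density (constant factor dropped)
ppy = [lik * pri for lik, pri in zip(pyp, pd)]    # posterior ∝ likelihood × prior
p_max = p[ppy.index(max(ppy))]
print(p_max)  # 0.65
```

Under this assumed prior the mode moves further from 0.5 than in the four-male case, illustrating how stronger evidence shifts the posterior away from the prior.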

Bayesian likelihood
P(θ | y) ∝ P(y | θ) P(θ)
Posterior distribution of θ given y ∝ likelihood of the data given the parameters × prior distribution of the parameters, as in P(Boy | Pants) ∝ P(Pants | Boy) P(Boy).

Highlight
- Concept development for genomic selection
- Bayesian theorem
- Bayesian transformation
- Bayesian likelihood
- Bayesian alphabet for genomic selection