The Uniform Prior and the Laplace Correction
Supplemental material; not on the exam.
Bayesian Inference
We start with P(θ), the prior distribution over the values of θ, and P(x_1, …, x_n | θ), the likelihood of the examples given a known value of θ.
Given examples x_1, …, x_n, we can compute the posterior distribution over θ:
  P(\theta \mid x_1, \ldots, x_n) = \frac{P(x_1, \ldots, x_n \mid \theta)\, P(\theta)}{P(x_1, \ldots, x_n)}
where the marginal likelihood is
  P(x_1, \ldots, x_n) = \int P(x_1, \ldots, x_n \mid \theta)\, P(\theta)\, d\theta
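As an illustration (not part of the original slides), the following Python sketch carries out this update numerically: it discretizes θ on a grid, multiplies the likelihood by the prior, approximates the marginal likelihood by a Riemann sum, and normalizes to obtain the posterior. The function name, the grid resolution, and the choice of Python are assumptions made for the example.

    def posterior_on_grid(prior, likelihood, num_points=1000):
        # Discretize theta over (0, 1) with a midpoint grid.
        d_theta = 1.0 / num_points
        grid = [(i + 0.5) * d_theta for i in range(num_points)]
        # Unnormalized posterior: likelihood times prior at each grid point.
        unnormalized = [likelihood(t) * prior(t) for t in grid]
        # Marginal likelihood P(x_1, ..., x_n), approximated by a Riemann sum.
        marginal = sum(u * d_theta for u in unnormalized)
        # Normalize to obtain P(theta | x_1, ..., x_n) on the grid.
        posterior = [u / marginal for u in unnormalized]
        return grid, posterior, marginal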
Binomial Distribution: Laplace Estimate
In this case the unknown parameter is θ = P(H).
Simplest prior (the uniform prior): P(θ) = 1 for 0 < θ < 1.
Likelihood, where h is the number of heads in the sequence:
  P(x_1, \ldots, x_n \mid \theta) = \theta^{h} (1 - \theta)^{n-h}
Marginal likelihood:
  P(x_1, \ldots, x_n) = \int_0^1 \theta^{h} (1 - \theta)^{n-h}\, d\theta
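As a concrete check (again not from the slides), here is the coin-tossing instance for a hypothetical sequence of n = 10 tosses with h = 7 heads: the marginal likelihood under the uniform prior is the integral above, approximated by a midpoint Riemann sum. The values of n and h and the number of grid points are arbitrary choices.

    n, h = 10, 7        # hypothetical example: 10 tosses, 7 of them heads

    def likelihood(theta):
        # P(x_1, ..., x_n | theta) = theta^h * (1 - theta)^(n - h)
        return theta ** h * (1 - theta) ** (n - h)

    # Marginal likelihood under the uniform prior P(theta) = 1 on (0, 1).
    num_points = 100_000
    marginal = sum(likelihood((i + 0.5) / num_points) / num_points
                   for i in range(num_points))
    print(marginal)     # about 0.000758 for this example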
Marginal Likelihood
Using integration by parts (the boundary term vanishes when h < n), we have:
  \int_0^1 \theta^{h} (1 - \theta)^{n-h}\, d\theta
    = \left[\frac{\theta^{h+1}}{h+1} (1 - \theta)^{n-h}\right]_0^1 + \frac{n-h}{h+1} \int_0^1 \theta^{h+1} (1 - \theta)^{n-h-1}\, d\theta
    = \frac{n-h}{h+1} \int_0^1 \theta^{h+1} (1 - \theta)^{n-h-1}\, d\theta
Multiplying both sides by \binom{n}{h}, we have
  \binom{n}{h} \int_0^1 \theta^{h} (1 - \theta)^{n-h}\, d\theta = \binom{n}{h+1} \int_0^1 \theta^{h+1} (1 - \theta)^{n-h-1}\, d\theta
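A quick numerical sanity check of this identity (an illustration, not from the slides), using hypothetical values n = 10 and h = 4 with h < n; the small integration helper and the grid size are arbitrary choices.

    import math

    def integral(f, num=200_000):
        # Midpoint Riemann-sum approximation of the integral of f over (0, 1).
        return sum(f((i + 0.5) / num) / num for i in range(num))

    n, h = 10, 4        # hypothetical values with h < n
    lhs = math.comb(n, h) * integral(lambda t: t ** h * (1 - t) ** (n - h))
    rhs = math.comb(n, h + 1) * integral(lambda t: t ** (h + 1) * (1 - t) ** (n - h - 1))
    print(lhs, rhs)     # the two sides agree up to discretization error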
Marginal Likelihood (cont.)
The recursion terminates when h = n:
  \binom{n}{n} \int_0^1 \theta^{n}\, d\theta = \frac{1}{n+1}
Thus, applying the recursion repeatedly,
  \binom{n}{h} \int_0^1 \theta^{h} (1 - \theta)^{n-h}\, d\theta = \frac{1}{n+1},
so the marginal likelihood is P(x_1, …, x_n) = \frac{1}{(n+1)\binom{n}{h}}.
We conclude that the posterior is
  P(\theta \mid x_1, \ldots, x_n) = (n+1) \binom{n}{h}\, \theta^{h} (1 - \theta)^{n-h}
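The closed form and the resulting posterior can likewise be checked numerically (an illustration, not from the slides); n = 10 and h = 7 are hypothetical values, and the integration helper is restated so the snippet stands alone.

    import math

    def integral(f, num=200_000):
        # Midpoint Riemann-sum approximation of the integral of f over (0, 1).
        return sum(f((i + 0.5) / num) / num for i in range(num))

    n, h = 10, 7        # hypothetical toss count and number of heads
    # Marginal likelihood: closed form vs. numerical integration.
    print(1 / ((n + 1) * math.comb(n, h)))
    print(integral(lambda t: t ** h * (1 - t) ** (n - h)))

    def posterior(theta):
        # P(theta | x_1, ..., x_n) = (n + 1) * C(n, h) * theta^h * (1 - theta)^(n - h)
        return (n + 1) * math.comb(n, h) * theta ** h * (1 - theta) ** (n - h)

    print(integral(posterior))   # about 1.0, so the posterior is a proper density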
Bayesian Prediction
How do we predict using the posterior?
We can think of this as computing the probability of the next element in the sequence:
  P(x_{n+1} \mid x_1, \ldots, x_n) = \int P(x_{n+1} \mid \theta, x_1, \ldots, x_n)\, P(\theta \mid x_1, \ldots, x_n)\, d\theta = \int P(x_{n+1} \mid \theta)\, P(\theta \mid x_1, \ldots, x_n)\, d\theta
Assumption: if we know θ, the probability of X_{n+1} is independent of X_1, …, X_n.
Bayesian Prediction (cont.)
Thus, we conclude that
  P(X_{n+1} = H \mid x_1, \ldots, x_n) = \int_0^1 \theta\, P(\theta \mid x_1, \ldots, x_n)\, d\theta = (n+1)\binom{n}{h} \int_0^1 \theta^{h+1} (1 - \theta)^{n-h}\, d\theta = \frac{(n+1)\binom{n}{h}}{(n+2)\binom{n+1}{h+1}} = \frac{h+1}{n+2}
This is the Laplace correction: after observing h heads in n tosses, predict heads with probability (h+1)/(n+2).
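Finally, the prediction rule can be verified by integrating θ against the posterior numerically (an illustration, not from the slides); with the hypothetical values n = 10 and h = 7, both computations give about 8/12 ≈ 0.667.

    import math

    def integral(f, num=200_000):
        # Midpoint Riemann-sum approximation of the integral of f over (0, 1).
        return sum(f((i + 0.5) / num) / num for i in range(num))

    n, h = 10, 7        # hypothetical toss count and number of heads

    def posterior(theta):
        # Posterior under the uniform prior, from the previous slide.
        return (n + 1) * math.comb(n, h) * theta ** h * (1 - theta) ** (n - h)

    predictive = integral(lambda t: t * posterior(t))    # P(X_{n+1} = H | x_1, ..., x_n)
    print(predictive, (h + 1) / (n + 2))                 # both about 0.667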