Course on Bayesian Methods in Environmental Valuation


Course on Bayesian Methods in Environmental Valuation
Basics (continued): Models for proportions and means
Francisco José Vázquez Polo [www.personales.ulpgc.es/fjvpolo.dmc]
Miguel Ángel Negrín Hernández [www.personales.ulpgc.es/mnegrin.dmc]
{fjvpolo or mnegrin}@dmc.ulpgc.es

Binomial and Beta distributions
Problem: suppose that θ represents a proportion (a percentage) and we are interested in estimating it. Examples:
- Probability of obtaining a head when we toss a coin
- Probability of using public transport
- Probability of paying for the entry to a natural park

Binomial and Beta distributions
Binomial distribution: X has a binomial distribution with parameters θ and n if its probability function is
P(X = x | θ, n) = C(n, x) θ^x (1 − θ)^(n − x), x = 0, 1, …, n.
Moments: E[X] = nθ, Var[X] = nθ(1 − θ).

Prior: Beta distribution
θ has a beta distribution with parameters α and β, θ ~ Beta(α, β), if its density function is
π(θ | α, β) = [Γ(α + β) / (Γ(α) Γ(β))] θ^(α − 1) (1 − θ)^(β − 1), 0 < θ < 1.
Moments: E[θ] = α / (α + β), Var[θ] = αβ / [(α + β)² (α + β + 1)].

Prior: Beta distribution
Advantages of the Beta distribution:
- Its support is the unit interval, 0 to 1, the natural range of a proportion
- The beta distribution is a conjugate family for the binomial distribution
- It is very flexible (it can take many different shapes depending on α and β)

Prior: Beta distribution
(Figure: beta density shapes for different values of α and β)

Prior: Beta distribution
- Elicitation: choose α and β to reflect the available prior information (e.g., by fixing a prior mean and a prior "sample size" or variance)
- Non-informative priors: Beta(1, 1) (uniform) and Beta(0.5, 0.5) (Jeffreys prior)
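A minimal sketch of elicitation by moment matching; the helper name and the mean / prior-sample-size parameterisation are illustrative, not part of the course material:

def beta_from_mean_and_size(m, n0):
    """Return (alpha, beta) so the Beta prior has mean m and 'prior sample size' alpha + beta = n0."""
    return m * n0, (1.0 - m) * n0

# Example: prior mean 0.6 carrying the weight of 10 prior observations -> Beta(6, 4)
alpha0, beta0 = beta_from_mean_and_size(0.6, 10)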

Beta-Binomial Model
1. Model: given θ, the observations X1, …, Xn are mutually independent with Bernoulli B(x | θ, 1) probability function
p(x | θ) = θ^x (1 − θ)^(1 − x), x ∈ {0, 1}.
The joint density of X1, …, Xn given θ is
p(x1, …, xn | θ) = θ^s (1 − θ)^(n − s), where s = Σ xi.

Beta-Binomial Model
The conjugate prior distribution for θ is the beta distribution Beta(α0, β0), with density proportional to θ^(α0 − 1) (1 − θ)^(β0 − 1). The posterior distribution of θ given x = (x1, …, xn) is again a beta distribution,
θ | x ~ Beta(α0 + s, β0 + n − s), with s = Σ xi.
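The conjugacy can be checked in one line (a sketch using the notation above):

p(θ | x) ∝ p(x | θ) π(θ) ∝ θ^s (1 − θ)^(n − s) · θ^(α0 − 1) (1 − θ)^(β0 − 1) = θ^(α0 + s − 1) (1 − θ)^(β0 + n − s − 1),

which is the kernel of a Beta(α0 + s, β0 + n − s) density.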

Updating parameters
Prior: θ ~ Beta(α0, β0)
Posterior: θ | x ~ Beta(α1, β1), with α1 = α0 + s and β1 = β0 + n − s

Posterior: Beta distribution
Posterior moments:
E[θ | x] = α1 / (α1 + β1)
Mode[θ | x] = (α1 − 1) / (α1 + β1 − 2) (for α1, β1 > 1)
Var[θ | x] = α1 β1 / [(α1 + β1)² (α1 + β1 + 1)]

Binomial and Beta distributions
Example: we are studying the willingness to pay for entry to a natural park in Gran Canaria (price of 5 €). We have a sample of 20 individuals, and 14 of them are willing to pay 5 euros for the entry.
- Elicit the prior information
- Obtain the posterior distribution (mean, mode, variance)
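A minimal numerical sketch of the exercise, assuming a non-informative Beta(1, 1) prior (the prior choice is an assumption; the exercise asks you to elicit your own):

from scipy import stats

alpha0, beta0 = 1.0, 1.0                        # non-informative uniform prior (assumption)
n, s = 20, 14                                   # 14 of 20 individuals willing to pay
alpha1, beta1 = alpha0 + s, beta0 + n - s       # posterior Beta(15, 7)

posterior = stats.beta(alpha1, beta1)
post_mean = alpha1 / (alpha1 + beta1)           # ≈ 0.682
post_mode = (alpha1 - 1) / (alpha1 + beta1 - 2) # = 0.700
post_var = posterior.var()                      # ≈ 0.0094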

Poisson and Gamma distributions
Problem: suppose that λ represents the mean of a discrete variable X. This is the model used in analysing count data. Examples:
- Number of visits to a specialist
- Number of visitors to state parks
- Number of people killed in road accidents

Poisson and Gamma distributions
Poisson distribution: X has a Poisson distribution with parameter λ if its probability function is
P(X = x | λ) = e^(−λ) λ^x / x!, x = 0, 1, 2, …
Moments: E[X] = Var[X] = λ.

Prior: Gamma distribution
λ has a gamma distribution with parameters α and β, λ ~ Gamma(α, β), if its density function is
π(λ | α, β) = [β^α / Γ(α)] λ^(α − 1) e^(−βλ), λ > 0.
Moments: E[λ] = α / β, Var[λ] = α / β².

Prior: Gamma distribution
Advantages of the Gamma distribution:
- The gamma distribution is a conjugate family for the Poisson distribution
- It is very flexible

Prior: Gamma distribution
- Elicitation: choose α and β to reflect the available prior information (e.g., by matching a prior mean and variance)
- Non-informative priors: Gamma(1, 0) and Gamma(0.5, 0) (improper limits of the gamma family)
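A minimal sketch of gamma elicitation by moment matching; the helper name is illustrative:

def gamma_from_mean_and_var(m, v):
    """Return (alpha, beta) so the Gamma prior has mean m = alpha/beta and variance v = alpha/beta**2."""
    beta = m / v
    alpha = m * beta
    return alpha, beta

# Example: prior mean of 12 visits per week with variance 16 -> Gamma(9, 0.75)
alpha0, beta0 = gamma_from_mean_and_var(12.0, 16.0)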

Poisson-Gamma Model
The conjugate prior distribution for λ is the gamma distribution Gamma(α0, β0), with density proportional to λ^(α0 − 1) e^(−β0 λ). The posterior distribution of λ given x = (x1, …, xn) is again a gamma distribution,
λ | x ~ Gamma(α0 + s, β0 + n), with s = Σ xi.
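Again, the conjugacy can be checked in one line (a sketch using the notation above):

p(λ | x) ∝ [Π e^(−λ) λ^(xi) / xi!] · λ^(α0 − 1) e^(−β0 λ) ∝ λ^(α0 + s − 1) e^(−(β0 + n) λ),

which is the kernel of a Gamma(α0 + s, β0 + n) density.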

Updating parameters
Prior: λ ~ Gamma(α0, β0)
Posterior: λ | x ~ Gamma(α1, β1), with α1 = α0 + s and β1 = β0 + n

Posterior: Gamma distribution
Posterior moments:
E[λ | x] = α1 / β1
Mode[λ | x] = (α1 − 1) / β1 (for α1 > 1)
Var[λ | x] = α1 / β1²

Posterior: Gamma distribution
Example: we are studying the number of visits to a natural park during the last two months. We have data on the weekly visits: {10, 8, 35, 15, 12, 6, 9, 17}.
- Elicit the prior information
- Obtain the posterior distribution (mean, mode, variance)
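A minimal numerical sketch of the exercise, assuming the improper non-informative Gamma(0.5, 0) prior mentioned above (the prior choice is an assumption; the exercise asks you to elicit your own):

from scipy import stats

data = [10, 8, 35, 15, 12, 6, 9, 17]   # weekly visit counts
alpha0, beta0 = 0.5, 0.0               # improper non-informative prior (assumption)

alpha1 = alpha0 + sum(data)            # 112.5
beta1 = beta0 + len(data)              # 8

posterior = stats.gamma(a=alpha1, scale=1.0 / beta1)   # rate beta1 corresponds to scale 1/beta1
post_mean = alpha1 / beta1             # ≈ 14.06 visits per week
post_mode = (alpha1 - 1) / beta1       # ≈ 13.94
post_var = alpha1 / beta1**2           # ≈ 1.76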

Other conjugate analyses

Good & bad news: MCMC
Only simple (conjugate) models result in closed-form equations. More complex models require numerical methods, such as Markov chain Monte Carlo (MCMC), to compute the posterior mean, posterior standard deviations, predictions, and so on.
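As a taster of what MCMC does, here is a minimal random-walk Metropolis sketch for the beta-binomial example above. It is purely illustrative and an assumption of this write-up, not part of the course slides; that model already has an exact Beta posterior, which makes it easy to check the sampler:

import numpy as np

rng = np.random.default_rng(0)
n_obs, s = 20, 14                 # 14 successes out of 20 (the park example)
alpha0, beta0 = 1.0, 1.0          # uniform prior (assumption)

def log_post(theta):
    # Unnormalised log posterior: the Beta(alpha0 + s, beta0 + n_obs - s) kernel
    if theta <= 0.0 or theta >= 1.0:
        return -np.inf
    return (alpha0 + s - 1) * np.log(theta) + (beta0 + n_obs - s - 1) * np.log(1 - theta)

theta, draws = 0.5, []
for _ in range(20000):
    proposal = theta + 0.1 * rng.normal()                        # random-walk proposal
    if np.log(rng.uniform()) < log_post(proposal) - log_post(theta):
        theta = proposal                                         # accept
    draws.append(theta)

print(np.mean(draws[2000:]))      # close to the exact posterior mean 15/22 ≈ 0.68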