Stat 305 2009 Lab 6 Parameter Estimation.

The form of f is known, but θ is unknown.
[Figure: the density curve f(x; θ), with the observed data points marked on the x-axis and the unknown parameter θ indicated.]

How to estimate θ? Methods of finding estimators:
1) Method of moments estimation
2) Maximum likelihood estimation

Method of Moments Estimation
Write the parameter as a function of population moments: θ = g(E(X), E(X^2), E(X^3), …), where g must be known and the required moments must exist.
Replace each population moment E(X^k) by the corresponding sample moment m_k = (1/n) Σ xi^k to get the estimator θ̂ = g(m, m2, m3, …).
Note: g may not be unique, so the method of moments estimator may not be unique either.

Maximum Likelihood Estimation
The likelihood function is the joint density of the observed data, viewed as a function of θ:
L(θ) = f(x1, …, xn; θ)
= f1(x1; θ) f2(x2; θ) ⋯ fn(xn; θ)   (if the Xi are independent and fi is the density of Xi)
= f1(x1; θ) f1(x2; θ) ⋯ f1(xn; θ)   (if, in addition, all Xi have the same distribution, say f1)

Maximum Likelihood Estimation
Assume that x1, …, xn is a random sample of size n from the distribution with density f(x; θ), so that L(θ) = ∏ f(xi; θ).
The maximum likelihood estimator (MLE) is an estimator maximizing the likelihood function L(θ).
In most situations, it is easier to find the MLE by maximizing the log-likelihood function ln L(θ), which has the same maximizer.

Examples:
1) Assume that X1, …, Xn is a random sample from Ga(α, β).
Density of the Gamma distribution (rate parameterization, so that E(X) = α/β):
f(x; α, β) = [β^α / Γ(α)] x^(α−1) e^(−βx)   for x > 0, with both α and β > 0.

1) Assume that X1, …, Xn is a random sample from Ga(α, β).
(a) Method of moments estimator:
Case 1) β is unknown but α is known, say α = α0.
Since E(X) = α/β, we have β = α0 / E(X).
Replacing E(X) by the first sample moment m gives the method of moments estimator β̂ = α0 / m.

1) Assume that X1, …, Xn is a random sample from Ga(α, β).
(a) Method of moments estimator:
Case 2) α is unknown but β is known, say β = β0.
Since E(X) = α/β, we have α = β0 E(X).
The method of moments estimator of α given β = β0 is α̂ = β0 m.

1) Assume that X1, …, Xn is a random sample from Ga(α, β).
(a) Method of moments estimator:
Case 3) Both α and β are unknown.
E(X) = α/β and E(X^2) = α(α+1)/β^2, so E(X^2) − {E(X)}^2 = α/β^2.
After some simple algebra,
β = E(X) / [E(X^2) − {E(X)}^2], estimated by β̂ = m / (m2 − m^2),
α = {E(X)}^2 / [E(X^2) − {E(X)}^2], estimated by α̂ = m^2 / (m2 − m^2),
where m and m2 are the first and second sample moments.
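As a sketch (in Python rather than the lab's R, with simulated data standing in for real observations), the Case 3 estimates follow directly from the first two sample moments:

```python
import random

# Hypothetical data: simulate a Gamma sample with known truth (alpha = 2, rate beta = 3);
# in the lab, x would be the observed sample. random.gammavariate takes (shape, scale),
# where scale = 1/rate.
random.seed(0)
x = [random.gammavariate(2.0, 1.0 / 3.0) for _ in range(1000)]

n = len(x)
m = sum(x) / n                      # first sample moment
m2 = sum(xi ** 2 for xi in x) / n   # second sample moment

# Method of moments estimates (rate parameterization, E(X) = alpha/beta):
beta_hat = m / (m2 - m ** 2)
alpha_hat = m ** 2 / (m2 - m ** 2)
```

By construction the estimates satisfy α̂/β̂ = m, the sample mean, mirroring E(X) = α/β.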

1) Assume that X1, …, Xn is a random sample from Ga(α, β).
Non-uniqueness:
(a) Method of moments estimator, Case 1) β is unknown but α is known, say α = α0.
Using E(X) = α/β gives the estimator β̂ = α0 / m. But a different moment equation leads to a different estimator: for example, E(X^2) = α0(α0+1)/β^2 gives β̂ = √(α0(α0+1)/m2). The method of moments estimator is therefore not unique.

1) Assume that X1, …, Xn is a random sample from Ga(α, β).
(b) Maximum likelihood estimator:
Case 1) β is unknown but α is known, say α = α0.
Setting d/dβ ln L(β) = nα0/β − Σ xi = 0 gives β̂n = α0 / m, the same as the method of moments estimator.
Check that it is inside the parameter space of β, i.e. is it > 0? (Yes, since m > 0 for positive data.)
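As a quick numerical check (a Python sketch; α0 and the data here are made up for illustration), the closed-form MLE β̂ = α0/m beats nearby values of β on the log-likelihood:

```python
import math

alpha0 = 2.0                               # known shape parameter (hypothetical)
x = [0.4, 1.1, 0.7, 0.9, 0.5]              # hypothetical positive observations
n, s, sl = len(x), sum(x), sum(math.log(xi) for xi in x)

def log_lik(beta):
    # log-likelihood of Ga(alpha0, beta), rate parameterization
    return (n * alpha0 * math.log(beta) - n * math.lgamma(alpha0)
            + (alpha0 - 1) * sl - beta * s)

beta_hat = alpha0 / (s / n)                # closed-form MLE: alpha0 / sample mean
# beta_hat > 0, so it lies inside the parameter space of beta
```

Because ln L is strictly concave in β (its second derivative is −nα0/β² < 0), β̂ is the unique maximizer.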

1) Assume that X1, …, Xn is a random sample from Ga(α, β).
(b) Maximum likelihood estimator:
Case 2) α is unknown but β is known, say β = β0.
There is no explicit form of the solution, so we have to find it by numerical methods, e.g. R's optim():
optim(par, fn, …)
par: initial values for the parameters to be optimized over.
fn: a function to be minimized (so pass the negative log-likelihood, since optim minimizes by default).
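The slide uses R's optim(); as a rough stand-in sketch in Python (the data and search interval are made up), the negative log-likelihood can be minimized over α with a simple ternary search, which is valid here because the Gamma log-likelihood is concave in α:

```python
import math

beta0 = 3.0                                            # known rate parameter (hypothetical)
x = [0.31, 0.78, 0.52, 1.10, 0.45, 0.66, 0.94, 0.38]  # hypothetical observations
n, s, sl = len(x), sum(x), sum(math.log(xi) for xi in x)

def neg_log_lik(alpha):
    # negative log-likelihood of Ga(alpha, beta0), rate parameterization
    return -(n * alpha * math.log(beta0) - n * math.lgamma(alpha)
             + (alpha - 1) * sl - beta0 * s)

# Ternary search on a bracketing interval; neg_log_lik is convex in alpha
# (the log-likelihood's second derivative is -n * trigamma(alpha) < 0).
lo, hi = 1e-6, 50.0
for _ in range(200):
    m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
    if neg_log_lik(m1) < neg_log_lik(m2):
        hi = m2
    else:
        lo = m1
alpha_hat = (lo + hi) / 2
```

In R the analogous call would hand the same negative log-likelihood to optim() with a positive starting value for α.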

2) Assume that X1, …, Xn is a random sample from N(μ, 1).
(a) Method of moments estimator of μ: since E(X) = μ, the estimator is the sample mean x̄.
(b) Maximum likelihood estimator of μ: what is the parameter space of μ?
If μ can be any real number, then the MLE is also x̄.
However, if μ is restricted to be non-negative, is x̄ still the MLE of μ?

2) Assume that X1, …, Xn is a random sample from N(μ, 1), with μ ≥ 0.
(b) Maximum likelihood estimator of μ:
Log-likelihood function: ln L(μ) = −(n/2) ln(2π) − (1/2) Σ (xi − μ)^2.
Its derivative d/dμ ln L(μ) = Σ (xi − μ) = n(x̄ − μ) is
> 0 when μ < x̄,
< 0 when μ > x̄.

2) Assume that X1, …, Xn is a random sample from N(μ, 1), with μ ≥ 0.
(b) Maximum likelihood estimator of μ:
Since d/dμ ln L(μ) > 0 when μ < x̄ and < 0 when μ > x̄:
If x̄ ≥ 0, the log-likelihood increases up to μ = x̄ and decreases afterwards, so the maximum over μ ≥ 0 is attained at x̄.
If x̄ < 0, the log-likelihood is decreasing on the whole range μ ≥ 0, so the maximum is attained at μ = 0.

2) Assume that X1, …, Xn is a random sample from N(μ, 1), with μ ≥ 0.
(b) Maximum likelihood estimator of μ:
If x̄ ≥ 0, the MLE is μ̂ = x̄.
If x̄ < 0, the MLE is μ̂ = 0.
Equivalently, μ̂ = max(x̄, 0).
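The case split collapses to a one-liner; a minimal Python sketch (the sample data below are made up):

```python
from statistics import fmean

def mle_mu_nonneg(x):
    # MLE of mu for N(mu, 1) under the restriction mu >= 0:
    # the unrestricted MLE is the sample mean x-bar; when x-bar < 0,
    # the log-likelihood is decreasing on [0, inf), so the maximum is at 0.
    return max(fmean(x), 0.0)

print(mle_mu_nonneg([1.0, 2.0, 3.0]))    # x-bar = 2.0 is feasible, so the MLE is 2.0
print(mle_mu_nonneg([-2.0, -1.0, 0.0]))  # x-bar = -1.0 < 0, so the MLE is 0.0
```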

Exercises for students:
1) Assume that X1, …, Xn is a random sample from Bin(1, p).
a) Find the method of moments estimator of p.
b) Find the MLE of p when (i) 0 < p < 1 and (ii) p ≥ ½.
2) Assume that X1, …, Xn is a random sample from the uniform distribution over [0, θ], where θ is positive and finite.
a) Find the method of moments estimator of θ.
b) Find the MLE of θ.
c) What is the MLE of θ if [0, θ] is changed to [θ − 1/2, θ + 1/2], where θ is any finite real number?