Maximum Likelihood

Presentation transcript:

Maximum Likelihood We have studied the OLS estimator. It only applies under certain assumptions; in particular, ε ~ N(0, σ²). But what if the sampling distribution is not Normal? We can use an alternative estimator: MLE. See "Generalized Linear Models" in S-Plus.

OLS vs. MLE If the assumptions of OLS hold, OLS and MLE give exactly the same coefficient estimates, so using MLE instead of OLS is fine. MLE is called "Generalized Linear Models" in S-Plus. It is more general than "Linear Regression": it allows you to specify the distribution of the error.
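To see the equivalence concretely, here is a minimal sketch in R syntax (the lm and glm calls are close to the S-Plus interface the slide mentions); the simulated data, the true coefficients, and the variable names are illustrative assumptions, not taken from the slides.

## Simulated regression data with normal errors. Under the classical
## assumptions, OLS and the normal-error MLE give the same coefficients.
set.seed(42)
n <- 200
x <- rnorm(n)
y <- 1.5 + 0.8 * x + rnorm(n, sd = 2)     # assumed true intercept 1.5, slope 0.8

fit_ols <- lm(y ~ x)                      # ordinary least squares
fit_mle <- glm(y ~ x, family = gaussian)  # same model fitted as a GLM (MLE)

coef(fit_ols)   # identical point estimates ...
coef(fit_mle)   # ... from both fits

Either route gives the same fitted line; the GLM route generalizes to non-normal error distributions such as the Poisson model used in the example below.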

Example: Ozone Attainment A county is "out of attainment" if ozone exceeds the standard on a given day. Model the distribution of the number of days out of attainment in a given county over 20 years. Use a Poisson distribution, and estimate its parameter using maximum likelihood.

MLE Principle: choose the parameter(s) that make observing the given data most probable (or "likely"). How do we measure "likelihood"? If we know the sampling distribution, we know how "probable" or "likely" any given data are, so we can measure likelihood. The requirement is that we must know the distribution.
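As a rough illustration of measuring likelihood under a known sampling distribution, the sketch below (R syntax) evaluates the Poisson likelihood of some hypothetical yearly exceedance counts at a few candidate values of the mean parameter; the counts and the candidate values are made-up assumptions, not the data behind the slides.

## Hypothetical counts of out-of-attainment days, one per year.
counts <- c(2, 3, 1, 4, 2, 0, 3, 2, 5, 1)

## Likelihood of the data under a Poisson model with mean lambda:
## the product of the Poisson probabilities of each observed count.
likelihood <- function(lambda) prod(dpois(counts, lambda))

likelihood(1.0)   # low: lambda = 1 makes these counts fairly improbable
likelihood(2.3)   # highest of the three: 2.3 is the sample mean
likelihood(5.0)   # low again: lambda = 5 overshoots the data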

Graph of Likelihood
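The slide's figure is not reproduced in the transcript; a rough stand-in can be drawn by evaluating the likelihood from the previous sketch on a grid of candidate lambda values (same hypothetical counts, R syntax).

counts <- c(2, 3, 1, 4, 2, 0, 3, 2, 5, 1)             # hypothetical data again
likelihood <- function(lambda) prod(dpois(counts, lambda))

lambda_grid <- seq(0.1, 6, by = 0.05)
lik_values  <- sapply(lambda_grid, likelihood)

plot(lambda_grid, lik_values, type = "l",
     xlab = "lambda", ylab = "likelihood",
     main = "Poisson likelihood of the observed counts")
abline(v = mean(counts), lty = 2)   # the curve peaks at the sample mean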

Log-Likelihood Maximizing the log-likelihood is equivalent to maximizing the likelihood, because the log is a strictly increasing function; in practice the log-likelihood (a sum of log-probabilities) is easier to work with than the likelihood (a product of probabilities).
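A minimal check, again using the hypothetical counts from the earlier sketches: numerically maximizing the log-likelihood lands on the same value of lambda as the peak of the likelihood curve.

counts <- c(2, 3, 1, 4, 2, 0, 3, 2, 5, 1)                 # hypothetical data
loglik <- function(lambda) sum(dpois(counts, lambda, log = TRUE))

## Numerically maximize the log-likelihood over a plausible range.
opt <- optimize(loglik, interval = c(0.01, 20), maximum = TRUE)
opt$maximum      # numerical maximizer ...
mean(counts)     # ... matches the sample mean, the closed-form Poisson MLE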

Solution We can model the number of exceedances with a Poisson distribution, which has one parameter, estimated by maximum likelihood. The estimated parameter (λ) is 2.45.
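The value 2.45 comes from the slides' own county data, which are not included in the transcript; as a sketch of how such an estimate is computed, the Poisson MLE is simply the sample mean, and the same number falls out of the intercept-only Poisson GLM route the slides point to (R syntax, hypothetical counts).

## Hypothetical exceedance counts; the slides' data are not available here.
counts <- c(2, 3, 1, 4, 2, 0, 3, 2, 5, 1)

## Closed-form Poisson MLE: lambda-hat is the sample mean.
mean(counts)

## Same estimate via an intercept-only Poisson GLM; the default log link
## means the fitted mean is exp() of the estimated intercept.
fit <- glm(counts ~ 1, family = poisson)
exp(coef(fit))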