CIS 2033, based on Dekking et al., A Modern Introduction to Probability and Statistics (2007). Instructor: Longin Jan Latecki. C22: The Method of Least Squares

22.1 – Least Squares

Given a bivariate dataset (x_1, y_1), …, (x_n, y_n), where x_1, …, x_n are nonrandom and Y_i = α + βx_i + U_i are random variables for i = 1, 2, …, n. The random variables U_1, U_2, …, U_n have zero expectation and variance σ².

Method of least squares: choose values for α and β such that

S(\alpha, \beta) = \sum_{i=1}^{n} (y_i - \alpha - \beta x_i)^2

is minimal.
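The objective S(α, β) is just a sum of squared vertical deviations, so it is easy to evaluate directly. A minimal Python sketch; the function name and the toy dataset are my own, for illustration only:

```python
import numpy as np

def S(alpha, beta, x, y):
    # Sum of squared vertical deviations of the points (x_i, y_i)
    # from the line y = alpha + beta * x.
    return np.sum((y - alpha - beta * x) ** 2)

# Hypothetical toy data, for illustration only.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 8.1])

print(S(0.0, 1.0, x, y))  # poor fit: large sum of squares
print(S(0.0, 2.0, x, y))  # line close to the data: small sum of squares
```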

22.1 – Regression

The observed value y_i corresponding to x_i, and the value α + βx_i on the regression line y = α + βx.

22.1 – Estimation

To find the least squares estimates, we differentiate S(α, β) with respect to α and β and set the derivatives equal to 0:

\frac{\partial S}{\partial \alpha} = -2 \sum_{i=1}^{n} (y_i - \alpha - \beta x_i) = 0

\frac{\partial S}{\partial \beta} = -2 \sum_{i=1}^{n} x_i (y_i - \alpha - \beta x_i) = 0

This gives two equations in the two unknowns α and β.

22.1 – Estimation

After some algebraic rearranging, we obtain:

\hat{\beta} = \frac{n \sum_{i=1}^{n} x_i y_i - \left( \sum_{i=1}^{n} x_i \right) \left( \sum_{i=1}^{n} y_i \right)}{n \sum_{i=1}^{n} x_i^2 - \left( \sum_{i=1}^{n} x_i \right)^2}  (slope)

\hat{\alpha} = \bar{y}_n - \hat{\beta} \bar{x}_n  (intercept)
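These closed-form expressions translate directly into code. A minimal sketch; the function and toy data are my own, with numpy's polynomial fit as a cross-check:

```python
import numpy as np

def least_squares(x, y):
    # Closed-form least squares estimates for the line y = alpha + beta * x.
    n = len(x)
    beta_hat = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / \
               (n * np.sum(x ** 2) - np.sum(x) ** 2)
    alpha_hat = np.mean(y) - beta_hat * np.mean(x)
    return alpha_hat, beta_hat

# Hypothetical toy data, for illustration only.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.8, 4.1, 5.9, 8.2, 9.9])

alpha_hat, beta_hat = least_squares(x, y)
print(alpha_hat, beta_hat)
print(np.polyfit(x, y, 1))  # cross-check: numpy returns [slope, intercept]
```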

[Figure: regression line y = 0.25x − 2.35 fitted to a scatter of data points.]

22.1 – Least Squares Estimators Are Unbiased

The estimators for α and β are unbiased. For the simple linear regression model, the random variable

\hat{\sigma}^2 = \frac{1}{n - 2} \sum_{i=1}^{n} \left( Y_i - \hat{\alpha} - \hat{\beta} x_i \right)^2

is an unbiased estimator for σ².
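In code, the unbiased estimator divides the residual sum of squares by n − 2, reflecting the two estimated parameters. A sketch continuing the hypothetical example above:

```python
# Continuing the sketch above (x, y, alpha_hat, beta_hat as before).
n = len(x)
residuals = y - alpha_hat - beta_hat * x
sigma2_hat = np.sum(residuals ** 2) / (n - 2)  # divide by n - 2, not n
print(sigma2_hat)
```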

22.2 – Residuals

A way to explore whether the linear regression model is appropriate for a given bivariate dataset is to inspect a scatter plot of the so-called residuals r_i against the x_i. The ith residual r_i is defined as the vertical distance between the ith point and the estimated regression line:

r_i = y_i - \hat{\alpha} - \hat{\beta} x_i

We always have

\sum_{i=1}^{n} r_i = 0
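The zero-sum property is easy to check numerically. Continuing the sketch above:

```python
# Continuing the sketch above: the residuals add up to zero,
# up to floating-point rounding error.
r = y - alpha_hat - beta_hat * x
print(np.sum(r))  # approximately 0
```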

22.2 – Heteroscedasticity

Homoscedasticity is the assumption of equal variance of the U_i (and therefore of the Y_i). When the variance of Y_i depends on the value of x_i, we speak of heteroscedasticity. For instance, heteroscedasticity occurs when Y_i with a large expected value have a larger variance than those with small expected values. This produces a "fanning out" effect in the scatter plot.

[Figure: scatter plot whose vertical spread increases with x_i, showing the "fanning out" pattern.]

22.3 – Relation with Maximum Likelihood

What are the maximum likelihood estimates for α and β? To apply the method of least squares, no assumption is needed about the type of distribution of the U_i. When the type of distribution of the U_i is known, the maximum likelihood principle can be applied. In particular, suppose the U_i are independent with an N(0, σ²) distribution. Then Y_i has an N(α + βx_i, σ²) distribution, with probability density function

f_{Y_i}(y) = \frac{1}{\sigma \sqrt{2\pi}} \, e^{-(y - \alpha - \beta x_i)^2 / (2\sigma^2)}
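This density is straightforward to evaluate. A minimal sketch; the function name and parameter values are my own:

```python
import numpy as np

def density_Y(yv, x_i, alpha, beta, sigma):
    # N(alpha + beta * x_i, sigma^2) density of Y_i, as in the model above.
    mu = alpha + beta * x_i
    return np.exp(-(yv - mu) ** 2 / (2.0 * sigma ** 2)) / (sigma * np.sqrt(2.0 * np.pi))

# At its mean, the N(mu, 1) density equals 1/sqrt(2*pi), about 0.3989.
print(density_Y(2.0, x_i=1.0, alpha=0.5, beta=1.5, sigma=1.0))
```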

When the Y_i are independent and each Y_i has an N(α + βx_i, σ²) distribution, and the linear model is appropriate for the given bivariate dataset, the residuals r_i should look like the realization of a random sample from a normal distribution.

[Figure: example residual plot with residuals scattered symmetrically around zero, consistent with normality.]

22.3 – Maximum Likelihood

For fixed σ > 0, the loglikelihood ℓ(α, β, σ) attains its maximum when

\sum_{i=1}^{n} (y_i - \alpha - \beta x_i)^2

is minimal. Hence, when the U_i are independent random variables with an N(0, σ²) distribution, the maximum likelihood principle and the least squares method yield the same estimators for α and β. The maximum likelihood estimator for σ² is:

\hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^{n} \left( y_i - \hat{\alpha} - \hat{\beta} x_i \right)^2
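Note the denominator: the maximum likelihood estimator divides the residual sum of squares by n, while the unbiased estimator from Section 22.1 divides by n − 2. A sketch, continuing the hypothetical example above:

```python
# Continuing the sketch above (x, y, alpha_hat, beta_hat, n as before).
rss = np.sum((y - alpha_hat - beta_hat * x) ** 2)
print(rss / n)        # maximum likelihood estimate (biased)
print(rss / (n - 2))  # unbiased estimate from Section 22.1
```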