CIS 2033 based on Dekking et al

CIS 2033, based on Dekking et al., A Modern Introduction to Probability and Statistics, 2007. Slides by Kier Heilman. Instructor: Longin Jan Latecki. C22: The Method of Least Squares

22.1 – Least Squares

Consider the random variables

    Yi = α + βxi + Ui for i = 1, 2, ..., n,

where the random variables U1, U2, ..., Un have zero expectation and variance σ².

Method of least squares: choose the values of α and β for which

    S(α, β) = Σi (yi − α − βxi)²

is minimal.
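As a concrete illustration, the objective S(α, β) can be evaluated directly. This is a minimal sketch; the data points below are made up for the example and are not from the text.

```python
# Minimal sketch: evaluating the least squares objective S(alpha, beta).
# The data points (xs, ys) are made-up illustration values.

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]   # roughly y = 2x plus small noise

def S(alpha, beta):
    """Sum of squared vertical deviations from the line y = alpha + beta*x."""
    return sum((y - alpha - beta * x) ** 2 for x, y in zip(xs, ys))

good = S(0.0, 2.0)   # a line close to the data: small S
bad = S(5.0, 0.0)    # a flat line far from the data: large S
```

The least squares method picks the (α, β) minimizing S over all candidate lines, which the estimation slides carry out in closed form.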

22.1 – Regression

(Figure) The observed value yi corresponding to xi, and the value α + βxi on the regression line y = α + βx.

22.1 – Estimation

Differentiating S(α, β) with respect to α and β and setting both derivatives to zero (the calculus step) yields two simultaneous equations, the normal equations, for estimating α and β:

    Σi yi = nα + β Σi xi
    Σi xi yi = α Σi xi + β Σi xi²
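The two normal equations form a 2×2 linear system that can be solved directly. A minimal sketch, again on made-up illustration data:

```python
# Minimal sketch: solving the two normal equations for alpha and beta.
# The data points are made-up illustration values.

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(xs)

sx = sum(xs)
sy = sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))

# Normal equations:
#   n  * alpha + sx  * beta = sy
#   sx * alpha + sxx * beta = sxy
# Solve the 2x2 system with Cramer's rule.
det = n * sxx - sx * sx
alpha_hat = (sy * sxx - sx * sxy) / det
beta_hat = (n * sxy - sx * sy) / det
```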

22.1 – Estimation

After some simple algebraic rearranging, we can solve the equations for α and β:

    β̂ = (n Σi xi yi − (Σi xi)(Σi yi)) / (n Σi xi² − (Σi xi)²)   (slope)
    α̂ = ȳn − β̂ x̄n   (intercept)

where x̄n and ȳn are the averages of the xi and the yi.
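A minimal sketch of the closed-form slope and intercept, computed two equivalent ways as a sanity check (the centered "covariance over variance" form is an algebraic rearrangement of the same formula); the data points are made-up illustration values:

```python
# Minimal sketch: the closed-form slope and intercept, two equivalent ways.
# The data points are made-up illustration values.

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(xs)

# Slide's form: beta = (n*Sxy - Sx*Sy) / (n*Sxx - Sx^2)
sx = sum(xs)
sy = sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))
beta_text = (n * sxy - sx * sy) / (n * sxx - sx * sx)

# Centered ("covariance over variance") form gives the same value.
xbar = sx / n
ybar = sy / n
beta_cov = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
            / sum((x - xbar) ** 2 for x in xs))

alpha_hat = ybar - beta_text * xbar   # intercept from the slide
```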

22.1 – Least Squares Estimators are Unbiased

The least squares estimators α̂ and β̂ are unbiased estimators for α and β. For the simple linear regression model, the random variable

    σ̂² = (1 / (n − 2)) Σi (Yi − α̂ − β̂xi)²

is an unbiased estimator for σ².
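The n − 2 divisor can be checked empirically: averaging the estimator over many simulated data sets should recover σ². A minimal Monte Carlo sketch, where the true parameters and design points are assumptions chosen for the simulation:

```python
# Minimal sketch: a Monte Carlo check that dividing the residual sum of
# squares by n - 2 gives (approximately) sigma^2 on average. The true
# parameters and design points below are assumptions for the simulation.
import random

random.seed(0)
alpha_true, beta_true, sigma = 1.0, 2.0, 0.5
xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
n = len(xs)

def fit(ys):
    """Least squares estimates (alpha_hat, beta_hat) for fixed xs."""
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    return ybar - b * xbar, b

trials = 20000
total = 0.0
for _ in range(trials):
    ys = [alpha_true + beta_true * x + random.gauss(0.0, sigma) for x in xs]
    a, b = fit(ys)
    rss = sum((y - a - b * x) ** 2 for x, y in zip(xs, ys))
    total += rss / (n - 2)        # the n - 2 divisor from the slide

mean_est = total / trials         # should be close to sigma**2 = 0.25
```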

22.2 – Residuals

Residual: the vertical distance between the ith point and the estimated regression line,

    ri = yi − α̂ − β̂xi.

The sum of the residuals is zero: Σi ri = 0.
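The zero-sum property is easy to verify numerically; a minimal sketch on made-up illustration data:

```python
# Minimal sketch: residuals of the fitted line sum to zero (up to
# floating-point rounding). The data points are made-up illustration values.

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(xs)

xbar = sum(xs) / n
ybar = sum(ys) / n
beta_hat = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
            / sum((x - xbar) ** 2 for x in xs))
alpha_hat = ybar - beta_hat * xbar

residuals = [y - alpha_hat - beta_hat * x for x, y in zip(xs, ys)]
total = sum(residuals)            # analytically exactly zero
```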

22.2 – Heteroscedasticity

Homoscedasticity is the assumption that the Ui (and therefore the Yi) all have the same variance. When this assumption fails we speak of heteroscedasticity; for instance, heteroscedasticity occurs when Yi with a large expected value have a larger variance than those with small expected values.
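A small simulation makes the failure mode visible. This sketch generates errors whose standard deviation grows with the expected value of Yi; the line y = 2x and the noise levels are assumptions chosen for illustration:

```python
# Minimal sketch of heteroscedasticity: errors whose spread grows with
# E[Y]. All numbers (line y = 2x, noise levels) are assumptions.
import random

random.seed(1)
xs = [float(x) for x in range(1, 51)]
# Heteroscedastic sample: standard deviation proportional to E[Y] = 2x.
hetero = [2.0 * x + random.gauss(0.0, 0.4 * x) for x in xs]

def spread(ys, lo, hi):
    """Sample standard deviation of the deviations y - 2x on xs[lo:hi]."""
    devs = [y - 2.0 * x for x, y in zip(xs[lo:hi], ys[lo:hi])]
    m = sum(devs) / len(devs)
    return (sum((d - m) ** 2 for d in devs) / len(devs)) ** 0.5

# Deviations at large x are visibly more spread out than at small x.
ratio = spread(hetero, 25, 50) / spread(hetero, 0, 25)
```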

22.3 – Relation with Maximum Likelihood

What are the maximum likelihood estimates for α and β? To apply the method of least squares, no assumption is needed about the type of distribution of the Ui. In case the type of distribution of the Ui is known, the maximum likelihood principle can be applied instead. Consider, for instance, the classical situation where the Ui are independent with an N(0, σ²) distribution. Then Yi has an N(α + βxi, σ²) distribution, so its probability density function is

    fi(y) = (1 / (σ√(2π))) · e^(−(y − α − βxi)² / (2σ²)).

22.3 – Maximum Likelihood

For fixed σ > 0, the loglikelihood ℓ(α, β, σ) attains its maximum exactly when

    Σi (yi − α − βxi)²

is minimal. Hence, when the Ui are independent with an N(0, σ²) distribution, the maximum likelihood principle and the least squares method yield the same estimators α̂ and β̂. The maximum likelihood estimator for σ² is

    (1/n) Σi (yi − α̂ − β̂xi)²

(note the divisor n rather than the n − 2 of the unbiased estimator).
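Both claims can be checked numerically: the log-likelihood evaluated at the least squares estimates beats nearby alternatives, and the ML variance estimate is RSS / n. A minimal sketch on made-up illustration data:

```python
# Minimal sketch: with normal errors, the log-likelihood is maximal at the
# least squares estimates, and sigma^2 is estimated by RSS / n (not n - 2).
# The data points are made-up illustration values.
import math

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(xs)

xbar = sum(xs) / n
ybar = sum(ys) / n
beta_hat = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
            / sum((x - xbar) ** 2 for x in xs))
alpha_hat = ybar - beta_hat * xbar
rss = sum((y - alpha_hat - beta_hat * x) ** 2 for x, y in zip(xs, ys))
sigma2_mle = rss / n              # ML estimator for sigma^2

def loglik(a, b, s2):
    """Log-likelihood of (a, b, s2) under Yi ~ N(a + b*xi, s2)."""
    return sum(-0.5 * math.log(2 * math.pi * s2)
               - (y - a - b * x) ** 2 / (2 * s2)
               for x, y in zip(xs, ys))

best = loglik(alpha_hat, beta_hat, sigma2_mle)
worse_ab = loglik(alpha_hat + 0.5, beta_hat, sigma2_mle)   # worse line
worse_s2 = loglik(alpha_hat, beta_hat, 2 * sigma2_mle)     # worse variance
```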