Week 3, 2007. Lecture 3, Slide #1. Minimizing e²: Deriving OLS Estimators. The problem; deriving b0; deriving b1; interpreting b0 and b1.

Slide #2: Measuring Error: Residuals. Objective: to minimize the errors e. Because positive and negative errors cancel in a simple sum, and for computational reasons, we minimize ∑e² instead. [Figure: scatterplot of Y against X with the fitted line, showing two residuals, e_i and e_j.]
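The slide's equation image did not survive the transcript; the standard statement of the problem, consistent with the slide title, is:

```latex
e_i = Y_i - \hat{Y}_i = Y_i - b_0 - b_1 X_i,
\qquad
\min_{b_0,\, b_1} \sum_{i=1}^{n} e_i^2
  = \min_{b_0,\, b_1} \sum_{i=1}^{n} \left( Y_i - b_0 - b_1 X_i \right)^2
```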

Slide #3: OLS Derivation of b0. Use partial differentiation in this step:
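The equation itself is missing from the transcript; the standard first-order condition for b0, matching this step, is:

```latex
\frac{\partial \sum e_i^2}{\partial b_0}
  = \sum 2 \left( Y_i - b_0 - b_1 X_i \right)(-1)
  = -2 \sum \left( Y_i - b_0 - b_1 X_i \right) = 0
```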

Slide #4: Derivation of b0, step 2.
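Distributing the sum and solving for b0 (the standard second step, reconstructed here since the slide's equation did not survive):

```latex
\sum Y_i - n b_0 - b_1 \sum X_i = 0
\;\Longrightarrow\;
b_0 = \frac{\sum Y_i}{n} - b_1 \frac{\sum X_i}{n}
    = \bar{Y} - b_1 \bar{X}
```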

Slide #5: Derivation of b1. Step 1: Multiply out e².
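The expanded square referred to on this slide (reconstructed, term by term):

```latex
\sum e_i^2
  = \sum \left( Y_i - b_0 - b_1 X_i \right)^2
  = \sum \left( Y_i^2 - 2 b_0 Y_i - 2 b_1 X_i Y_i
        + b_0^2 + 2 b_0 b_1 X_i + b_1^2 X_i^2 \right)
```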

Slide #6: Derivation of b1. Step 2: Differentiate with respect to b1.
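Differentiating the expanded sum with respect to b1 and setting it to zero (the standard step matching this slide's title):

```latex
\frac{\partial \sum e_i^2}{\partial b_1}
  = \sum \left( -2 X_i Y_i + 2 b_0 X_i + 2 b_1 X_i^2 \right)
  = -2 \sum X_i \left( Y_i - b_0 - b_1 X_i \right) = 0
```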

Slide #7: Derivation of b1. Step 3: Substitute for b0.
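Substituting b0 = Ȳ − b1X̄ into the first-order condition for b1 (reconstructed):

```latex
\sum X_i \left( Y_i - (\bar{Y} - b_1 \bar{X}) - b_1 X_i \right) = 0
\;\Longrightarrow\;
\sum X_i \left( Y_i - \bar{Y} \right) - b_1 \sum X_i \left( X_i - \bar{X} \right) = 0
```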

Slide #8: Derivation of b1. Step 4: Simplify and isolate b1.
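Isolating b1 and rewriting in deviation form (the two forms are equal because ∑(X_i − X̄) = 0):

```latex
b_1 = \frac{\sum X_i \left( Y_i - \bar{Y} \right)}
           {\sum X_i \left( X_i - \bar{X} \right)}
    = \frac{\sum \left( X_i - \bar{X} \right)\left( Y_i - \bar{Y} \right)}
           {\sum \left( X_i - \bar{X} \right)^2}
```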

Slide #9: Calculating b0 and b1. The formulas for b1 and b0 allow you (or, preferably, your computer) to calculate the error-minimizing slope and intercept for any data set representing a bivariate, linear relationship. No other line through the same data will produce a smaller sum of squared errors (∑e²): OLS gives the best fit.
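As a minimal sketch, the two formulas can be coded directly. The data here are hypothetical (the course itself uses Stata); the function names are illustrative, not from the slides:

```python
def ols(x, y):
    """Return (b0, b1) minimizing the sum of squared errors."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # b1 = sum((Xi - Xbar)(Yi - Ybar)) / sum((Xi - Xbar)^2)
    b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
         / sum((xi - x_bar) ** 2 for xi in x)
    # b0 = Ybar - b1 * Xbar
    b0 = y_bar - b1 * x_bar
    return b0, b1

# Hypothetical data lying exactly on Y = 1 + 2X, so OLS should recover it.
b0, b1 = ols([1, 2, 3, 4], [3, 5, 7, 9])
print(b0, b1)  # 1.0 2.0
```

With real data the residuals will not be zero, but the returned line still has the smallest ∑e² of any line through that data.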

Slide #10: Interpreting b1 and b0. For each 1-unit increase in X, Y changes by b1 units. When X is zero, Y is predicted to equal b0. Note that a regression with no independent variables simply predicts the mean of Y.

Slide #11: Calculate b0 and b1 for the data set shown on the slide. Check results against Stata.