Welcome to Econ 420 Applied Regression Analysis Study Guide Week Two Ending Sunday, September 9 (Note: You must go over these slides and complete every task outlined here by the end of the day on September 8)

Last week I asked you to report your heights and weights before Sunday, September 2
–That meant by the end of the day on Saturday, September 1.
–I did not hear from 4 of the students who are registered in this class. Remember that this affects your grade.

Here is our sample data on height and weight.

Observation    Height (H or X)    Weight (W or Y)
1. Jackie
2. Philip D
3. Bryan
4. Rita
5. Shane
6. Keith
7. Kelsie
8. Di          72                 185

Assignment 1 (carries 30 points and is due before noon on Thursday, September 6)
1. Use the data set on the previous slide and the formulas on Page 8 (1-5 and 1-6) to estimate the coefficients β0^ and β1^ in the equation below:
W = β0^ + β1^ H
–Make sure to show your work.
–Do the estimated coefficients make sense to you?
–What is the meaning of the estimated coefficients?
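As a minimal sketch of the hand calculation the assignment asks for — assuming formulas 1-5 and 1-6 are the usual OLS formulas for the slope and intercept — the steps can be checked in Python. The heights and weights below are made up for illustration, not the class sample:

```python
# OLS by hand:
#   beta1_hat = sum((X - Xbar)(Y - Ybar)) / sum((X - Xbar)^2)
#   beta0_hat = Ybar - beta1_hat * Xbar
# Made-up heights (inches) and weights (pounds), NOT the class data.
heights = [62, 65, 67, 68, 70, 71, 72, 74]
weights = [120, 140, 150, 155, 160, 170, 185, 190]

n = len(heights)
x_bar = sum(heights) / n
y_bar = sum(weights) / n

beta1_hat = sum((x - x_bar) * (y - y_bar) for x, y in zip(heights, weights)) \
            / sum((x - x_bar) ** 2 for x in heights)
beta0_hat = y_bar - beta1_hat * x_bar

print(f"W_hat = {beta0_hat:.2f} + {beta1_hat:.2f} * H")
```

A positive β1^ makes sense here: taller people tend to weigh more, and the fitted line always passes through the point (Xbar, Ybar).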

Assignment 1 continued
2. Answer Question 5 on Page
3. Answer Question 8 on Page 15
Type your answers and send them to me as an attachment. Remember that I have an old version of Word (2003). If you are using a newer version of Word, you will need to save your work in the old format.

Note: The following notes are not going to take the place of the discussions covered in your textbook. First read the book, then look at the notes.

Total, Explained and Residual Sum of Squares (PP 11-13)
Remember our height/weight example.
What is the average weight of the class?
Duplicate the graph on Page 12, where Y is the weight and X is the height:
–The fitted line will be upward sloping.
–The average line (average weight) will be horizontal.

Suppose instead of using the fitted line to predict someone's weight, we use the average line.
Y is the actual weight of a person.
Y^ is the predicted weight according to the fitted line.
Ybar is the average weight in the sample.
(Y – Ybar) is how much the weight of a given individual differs from the average.
(Y^ – Ybar) is how much closer our fitted line is to the actual weight than the average is.
(Y – Y^) is our residual:
–The portion of the weight that was not predicted (explained) by our fitted line.

Remember we have 8 observations in our sample.
Some of our weights are below average and some are above average.
Look at Equation 1-8, Page 12:
–The reason we square (Y – Ybar), (Y^ – Ybar) and (Y – Y^) is that we do not want the positive differences to cancel the negative differences.
Note: the best fitted line is the one with the lowest Σ(Y – Y^)².
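The decomposition above can be verified numerically: with an OLS fit, the total sum of squares splits exactly into the explained and residual parts. This sketch reuses made-up height/weight numbers (not the class data):

```python
# Decomposition of total variation around the mean, using a fitted OLS line:
#   TSS = sum (Y - Ybar)^2      total sum of squares
#   ESS = sum (Y^ - Ybar)^2     explained sum of squares
#   RSS = sum (Y - Y^)^2        residual sum of squares
# With OLS, TSS = ESS + RSS. Data are invented for illustration.
heights = [62, 65, 67, 68, 70, 71, 72, 74]
weights = [120, 140, 150, 155, 160, 170, 185, 190]

n = len(heights)
x_bar = sum(heights) / n
y_bar = sum(weights) / n
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(heights, weights)) \
     / sum((x - x_bar) ** 2 for x in heights)
b0 = y_bar - b1 * x_bar
y_hat = [b0 + b1 * x for x in heights]

tss = sum((y - y_bar) ** 2 for y in weights)
ess = sum((yh - y_bar) ** 2 for yh in y_hat)
rss = sum((y - yh) ** 2 for y, yh in zip(weights, y_hat))

print(f"TSS={tss:.1f}  ESS={ess:.1f}  RSS={rss:.1f}  ESS+RSS={ess+rss:.1f}")
```

Any line other than the OLS line would produce a larger RSS on the same data, which is exactly what "best fitted line" means here.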

Multiple Regression Model (Chapter 2, PP 20-29)
Is height the only factor affecting weight?
–Of course not.
–What are some other factors affecting an individual's weight?
Age
Calorie intake per day
……

So a better model will be
Y = β0 + β1 X1 + β2 X2 + β3 X3 + e
–Where Y is weight and X1 through X3 are Height, Age, and Calorie intake.
We will use EViews to estimate the coefficients of a multiple regression model.

The meaning of the estimated coefficients
Our estimated equation will be
Y^ = β0^ + β1^ X1 + β2^ X2 + β3^ X3
–Bonus: Can someone tell me why I didn't put an "e" at the end of the above equation?
β1^ measures the effect of one more inch of height on weight, holding age and calorie intake constant and ignoring the effect of all other variables on weight.
Similarly, β2^ measures the effect of one more year of age on weight, holding height and calorie intake constant and ignoring the effect of all other variables on weight.
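The course uses EViews for this step; purely as an illustration of the same estimation, here is a least-squares sketch in Python. All numbers are invented, including the "true" coefficients used to generate the data:

```python
import numpy as np

# Multiple regression Y = b0 + b1*Height + b2*Age + b3*Calories + e,
# estimated by least squares. Invented data; the course itself uses EViews.
rng = np.random.default_rng(0)
n = 50
height = rng.uniform(60, 75, n)        # inches
age = rng.uniform(18, 25, n)           # years
calories = rng.uniform(1500, 3000, n)  # per day
weight = 10 + 2.0 * height + 0.5 * age + 0.01 * calories + rng.normal(0, 5, n)

X = np.column_stack([np.ones(n), height, age, calories])
beta_hat, *_ = np.linalg.lstsq(X, weight, rcond=None)

# beta_hat[1] estimates the effect of one more inch of height on weight,
# holding age and calorie intake constant (here the true value is 2.0).
print(beta_hat.round(3))
```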

How big should the sample be?
The bigger the sample, the closer β^ will be to β.
Rule of thumb: Degrees of Freedom > 30
Degrees of Freedom = n – k – 1
–Where n is the sample size and k is the number of independent variables.
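The arithmetic behind the rule of thumb is simple enough to spell out. The sample sizes below are hypothetical:

```python
# Degrees of freedom for a regression: df = n - k - 1,
# where n = sample size and k = number of independent variables.
def degrees_of_freedom(n, k):
    return n - k - 1

# Rule of thumb from the slides: aim for df > 30.
print(degrees_of_freedom(50, 3))  # 46: comfortably above 30
print(degrees_of_freedom(8, 3))   # 4: too small by the rule of thumb
```

With 8 observations and 3 regressors (as in the height/age/calories model), only 4 degrees of freedom remain, which is why a larger sample is needed before the estimates become reliable.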

The Classical Assumptions
Assumptions that must be met in order for OLS to give us the best estimators.

Assumption 1
The regression equation
–Is linear in coefficients (not necessarily linear in variables)
–Is correctly specified (right functional form, no omitted variables, no irrelevant variables)
–Has an additive error term

Assumption 2
No independent variable is perfectly correlated with another independent variable (or with a combination of the others).
If violated → Perfect Multicollinearity
Example: Consumption = f (inflation, real interest rate, nominal interest rate, …)
Since real interest = nominal interest – inflation, the 3 independent variables are perfectly and linearly correlated with each other.
When one independent variable changes, the others change too, so OLS cannot capture the effect of one variable in isolation.
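The interest-rate example can be made concrete: when one regressor is an exact linear combination of the others, the matrix X'X that OLS must invert becomes singular. The numbers below are invented:

```python
import numpy as np

# Perfect multicollinearity: real_rate = nominal_rate - inflation exactly,
# so the regressors (plus the constant) are linearly dependent and X'X is
# singular -- OLS cannot isolate any one effect. Invented data.
rng = np.random.default_rng(1)
n = 30
inflation = rng.uniform(1, 5, n)
nominal = rng.uniform(3, 8, n)
real = nominal - inflation  # exact linear combination of the other two

X = np.column_stack([np.ones(n), inflation, nominal, real])
rank = np.linalg.matrix_rank(X)
print(rank)  # 3, not 4: X'X is singular, so (X'X)^(-1) does not exist
```

In practice, software like EViews will either refuse to estimate such a model or drop one of the offending variables.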

Assumption 3
No correlation between the explanatory (independent) variables and the error term.
What if it is violated?
Example: Salary = f (Education, …, GPA)
What if people with low GPAs lie about their GPAs? Then when the reported GPA is low, the error is always positive.
Problem: OLS attributes the variation in salary to the variation in GPA, while it is in part caused by the variation in the error.

Assumption 4
The error terms are uncorrelated with each other.
What if it is violated? Then we have an autocorrelation (serial correlation) problem.
Example: Consumption = f (…, income)
–Suppose we use time series data on the US economy to estimate the above model, and that during 5 years of our study there was a war: consumption dropped significantly even though income didn't. We will get negative errors during those years, and those errors will all be correlated with each other.
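Serial correlation like the war-years example can be simulated: if each period's error carries over a fraction of the previous period's error, adjacent errors move together. All numbers here are invented:

```python
import numpy as np

# Autocorrelation sketch: errors following e_t = rho * e_(t-1) + u_t
# violate Assumption 4 -- adjacent errors are correlated, as when a shock
# (e.g. a war) pushes the error in the same direction for several years.
rng = np.random.default_rng(2)
T, rho = 200, 0.8
u = rng.normal(0, 1, T)  # fresh shock each period
e = np.zeros(T)
for t in range(1, T):
    e[t] = rho * e[t - 1] + u[t]

corr = np.corrcoef(e[1:], e[:-1])[0, 1]
print(f"corr(e_t, e_t-1) = {corr:.2f}")  # close to rho = 0.8
```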

Assumption 5
The error term has a zero mean.
What if this assumption is violated?
This is not a big deal: the intercept will pick up the mean of the error term.

Assumption 6
The error term has a constant variance.
What if it is violated? Problem of Heteroskedasticity.
Example: Consumption = f (…, income)
–Suppose we use cross-section data on various individuals to estimate the above model.
People with low levels of income will probably spend most of their income (the variance of the error is small).
People with high levels of income may spend anywhere between 10% and 99% of their income (the variance of the error is high). (Figure 2-1)
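The consumption example can be simulated too: if the spread of the errors grows with income, the error variance is not constant across observations. The numbers below are invented:

```python
import numpy as np

# Heteroskedasticity sketch: error spread grows with income, as in the
# consumption example -- low-income errors are tight, high-income errors
# are spread out. Invented data.
rng = np.random.default_rng(3)
n = 1000
income = rng.uniform(20_000, 200_000, n)
errors = rng.normal(0, 0.1 * income)  # error std dev proportional to income

low = errors[income < 60_000]
high = errors[income > 160_000]
print(f"sd(errors | low income)  = {low.std():.0f}")
print(f"sd(errors | high income) = {high.std():.0f}")
```

A plot of these errors against income would fan out as income rises, which is the pattern Figure 2-1 in the text illustrates.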

Assumption 7 (Not Necessary)
The error term is normally distributed.
What is a normal distribution? Symmetric, continuous, and bell shaped; it can be characterized by its mean and variance.
We must know if this assumption is violated: if it is, some statistical tests are not applicable.
As the sample size goes up → the distribution becomes more normal.