
Goodness of Fit

The sum of squared deviations from the mean of a variable can be decomposed as follows:

TSS = ESS + RSS

where TSS = Σ(Yᵢ − Ȳ)² is the total sum of squares, ESS = Σ(Ŷᵢ − Ȳ)² is the explained sum of squares, and RSS = Σûᵢ² is the residual sum of squares. This decomposition can be used to define the R-squared, or coefficient of determination, for a regression equation:

R² = ESS/TSS = 1 − RSS/TSS
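A minimal numerical check of this decomposition (the simulated data and variable names are illustrative, not from the slides):

```python
import numpy as np

# Simulate a bivariate data set (illustrative values only)
rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
y = 2.0 + 1.5 * x + rng.normal(size=n)

# Fit y = b1 + b2*x by OLS
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta
u_hat = y - y_hat

TSS = np.sum((y - y.mean()) ** 2)       # total sum of squares
ESS = np.sum((y_hat - y.mean()) ** 2)   # explained sum of squares
RSS = np.sum(u_hat ** 2)                # residual sum of squares

assert np.isclose(TSS, ESS + RSS)       # the decomposition holds
r2 = ESS / TSS                          # coefficient of determination
assert np.isclose(r2, 1 - RSS / TSS)    # two equivalent definitions
```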

Properties of R-squared

R-squared always lies in the range zero to one. If R-squared equals one, the regression fits the data perfectly; in practice this almost always indicates that something is wrong with the equation. If R-squared equals zero, the regression has no explanatory power. In multivariate regressions the R-squared will always increase when we add an extra variable, even if that variable is completely irrelevant.
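The last property is easy to verify numerically: a sketch, with simulated data of my own choosing, showing that adding a pure-noise regressor never lowers the R-squared.

```python
import numpy as np

def r_squared(X, y):
    """R-squared of an OLS fit of y on X (X must include a constant column)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    tss = np.sum((y - y.mean()) ** 2)
    return 1 - rss / tss

rng = np.random.default_rng(1)
n = 50
x = rng.normal(size=n)
y = 1.0 + 0.5 * x + rng.normal(size=n)
junk = rng.normal(size=n)   # a completely irrelevant regressor

r2_small = r_squared(np.column_stack([np.ones(n), x]), y)
r2_big = r_squared(np.column_stack([np.ones(n), x, junk]), y)
assert r2_big >= r2_small   # adding a regressor never lowers R-squared
```

This is why R-squared alone cannot be used to choose between models with different numbers of regressors.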

Testing if an equation has explanatory power

Suppose we wish to test the joint null hypothesis that none of the slope coefficients has explanatory power:

H0: β2 = β3 = … = βk = 0
H1: at least one of β2, …, βk ≠ 0

Under the null hypothesis we can show that:

F = [ESS/(k − 1)] / [RSS/(n − k)] ~ F(k − 1, n − k)

This is the F-statistic for a regression equation. We compare the test statistic with a critical value from the F tables and reject the null if the statistic exceeds this value.
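A sketch of the test, assuming SciPy is available to supply the critical value in place of printed F tables (the data are simulated for illustration):

```python
import numpy as np
from scipy.stats import f as f_dist

rng = np.random.default_rng(2)
n, k = 80, 3                        # n observations, k coefficients incl. intercept
x2 = rng.normal(size=n)
x3 = rng.normal(size=n)
y = 1.0 + 0.8 * x2 + rng.normal(size=n)   # x3 is truly irrelevant here

X = np.column_stack([np.ones(n), x2, x3])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta
RSS = np.sum((y - y_hat) ** 2)
ESS = np.sum((y_hat - y.mean()) ** 2)

F = (ESS / (k - 1)) / (RSS / (n - k))   # F-statistic for the whole equation
crit = f_dist.ppf(0.95, k - 1, n - k)   # 5% critical value from the F tables
reject = F > crit                        # reject H0 if the statistic exceeds it
```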

Relationship between the F-statistic and R-squared

Dividing the numerator and denominator of the F-statistic by TSS lets us write it in terms of R-squared alone:

F = [R²/(k − 1)] / [(1 − R²)/(n − k)]

We can therefore think of the F test as a test of whether the R-squared differs significantly from zero. This relationship remains true when we consider multivariate regressions.
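A quick check that the two expressions give the same statistic (simulated data, illustrative only):

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 60, 2
x = rng.normal(size=n)
y = 0.5 + 1.0 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta
RSS = np.sum((y - y_hat) ** 2)
TSS = np.sum((y - y.mean()) ** 2)
ESS = TSS - RSS
r2 = ESS / TSS

F_direct = (ESS / (k - 1)) / (RSS / (n - k))          # from the sums of squares
F_from_r2 = (r2 / (k - 1)) / ((1 - r2) / (n - k))     # from R-squared alone
assert np.isclose(F_direct, F_from_r2)                 # identical statistics
```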

For a bivariate regression equation there is also a relationship between the F-test and the t-ratio for the slope coefficient: F = t². This relationship only holds for bivariate regression equations; things become more complicated when we move to multivariate regressions.
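The F = t² identity can be verified directly; this sketch computes the slope's t-ratio from the usual OLS variance formula (simulated data, illustrative only):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 40
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y
u_hat = y - X @ beta

sigma2 = np.sum(u_hat ** 2) / (n - 2)         # estimated error variance
se_slope = np.sqrt(sigma2 * XtX_inv[1, 1])    # standard error of the slope
t = beta[1] / se_slope                        # t-ratio for the slope

ESS = np.sum((X @ beta - y.mean()) ** 2)
RSS = np.sum(u_hat ** 2)
F = ESS / (RSS / (n - 2))                     # k - 1 = 1 in the bivariate case
assert np.isclose(F, t ** 2)                  # F = t^2 holds exactly
```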

Relationship between the R-squared and the standard error of the regression

In the bivariate case the standard error of the regression is σ̂ = √[RSS/(n − 2)], so

R² = 1 − RSS/TSS = 1 − σ̂²(n − 2)/TSS

and a smaller standard error implies a higher R-squared for a given TSS. A similar relationship holds in the multivariate case, but we need to adjust for the loss of degrees of freedom when we introduce extra regressors: the divisor n − 2 becomes n − k.
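A numerical check of this link in the bivariate case (simulated data, illustrative only):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 45
x = rng.normal(size=n)
y = 2.0 + 0.7 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
RSS = np.sum((y - X @ beta) ** 2)
TSS = np.sum((y - y.mean()) ** 2)

r2 = 1 - RSS / TSS
ser = np.sqrt(RSS / (n - 2))                     # standard error of the regression
assert np.isclose(1 - r2, ser ** 2 * (n - 2) / TSS)
```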

Properties of the OLS residuals

The OLS residuals sum to zero:

Σ ûᵢ = 0

This holds by virtue of the property that the OLS regression line passes through the sample means of the data, i.e. Ȳ = β̂1 + β̂2X̄.

The OLS residuals are uncorrelated with the X variable:

Σ Xᵢûᵢ = 0

Note that, since Σ ûᵢ = 0 as well, the sample covariance between X and the residuals is

(1/n) Σ (Xᵢ − X̄)ûᵢ = (1/n) Σ Xᵢûᵢ − X̄ · (1/n) Σ ûᵢ = 0

Therefore the residuals carry no linear information about X.
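Both residual properties can be confirmed numerically (simulated data, illustrative only):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 30
x = rng.normal(size=n)
y = 3.0 - 1.0 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
u_hat = y - X @ beta

assert np.isclose(np.sum(u_hat), 0)        # residuals sum to zero
assert np.isclose(np.sum(x * u_hat), 0)    # residuals uncorrelated with X
# Equivalently, the fitted line passes through the point of sample means:
assert np.isclose(y.mean(), beta[0] + beta[1] * x.mean())
```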