Research Methods I Lecture 10: Regression Analysis on SPSS.

Introduction
- Lecture 8 looked at descriptive statistics and at relationships between variables: correlation and cross-tabulation
- This lecture: regression
- Focus on how regression can be done in SPSS
- Focus on OLS, although SPSS is also capable of WLS and logistic regression

Regression: fundamentals
- A relationship between two variables allows prediction of the values of a dependent variable
- The prediction of Y_i is based on Y(bar), the model (the effect of X on Y) and a residual
- Model: Y = a + bX + e; the fitted equation is Y^ = a^ + b^X, with residual e^ = Y - Y^

Regression: fundamentals
- Actual values of X predict values of Y^ through the intercept term (a^) and slope coefficient (b^); the residual (e^) is the part of Y left unexplained
- Ordinary Least Squares (OLS) finds the values of a^ and b^ which minimise the sum of squared residuals from the regression
- Under certain assumptions, OLS is BLUE (the Best Linear Unbiased Estimator)
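SPSS reports a^ and b^ in its Coefficients table. As a sketch of the arithmetic behind them, here is a minimal Python/numpy version of simple OLS; the data are invented purely for illustration:

```python
import numpy as np

# Hypothetical data: x is the independent variable, y the dependent one.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# OLS chooses a^ and b^ to minimise the sum of squared residuals:
#   b^ = sum((x - xbar)(y - ybar)) / sum((x - xbar)^2),  a^ = ybar - b^*xbar
b_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a_hat = y.mean() - b_hat * x.mean()

y_fit = a_hat + b_hat * x   # fitted values Y^
e_hat = y - y_fit           # residuals e^
```

With an intercept in the model the residuals always sum to zero, which is one way to sanity-check a fit.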

Evaluating the regression
SPSS allows various ways to evaluate the estimated regression equation:
- Goodness of fit
- Individual significance of regressors
- Tests of the classical assumptions (necessary for inference)

Goodness of fit
- R^2 = ESS/TSS
- F-test: F = (ESS/k) / (RSS/(n - k - 1)), i.e. explained versus residual variation per degree of freedom
- Check outliers via standardised residuals or studentised residuals
- Can check for significant outliers, i.e. ones which would affect the estimated coefficients
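SPSS prints R^2 and the F statistic in its Model Summary and ANOVA tables. The sums-of-squares decomposition behind them can be sketched in a few lines of Python/numpy (invented data, simple regression with k = 1 regressor):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
n, k = len(x), 1  # k = number of regressors

b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()
y_fit = a + b * x
resid = y - y_fit

TSS = np.sum((y - y.mean()) ** 2)      # total sum of squares
ESS = np.sum((y_fit - y.mean()) ** 2)  # explained sum of squares
RSS = np.sum(resid ** 2)               # residual sum of squares

r_squared = ESS / TSS
F = (ESS / k) / (RSS / (n - k - 1))    # explained vs residual variance per df

# Standardised residuals; |values| of about 3 or more flag potential outliers.
std_resid = resid / np.sqrt(RSS / (n - k - 1))
```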

Significant outliers
- Cook's distance (critical value: > 1)
- Leverage values (critical value: > 2(k+1)/n)
- Mahalanobis distance
- Covariance ratio: if CVR_i > 1 + 3(k+1)/n, deleting the case damages parameter precision; if CVR_i < 1 - 3(k+1)/n, deleting the case improves it
- Casewise diagnostics
- Confidence intervals of the regression coefficients
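SPSS can save leverage values and Cook's distances as new variables. For simple regression the formulas are short enough to sketch directly in Python/numpy (invented data; the closed-form leverage below holds for the one-predictor case):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
n, k = len(x), 1

Sxx = np.sum((x - x.mean()) ** 2)
b = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
a = y.mean() - b * x.mean()
resid = y - (a + b * x)
s2 = np.sum(resid ** 2) / (n - k - 1)   # residual variance estimate

# Leverage (hat values); the rule of thumb flags h > 2(k+1)/n.
h = 1.0 / n + (x - x.mean()) ** 2 / Sxx
leverage_cutoff = 2 * (k + 1) / n

# Cook's distance; values above about 1 mark influential cases.
cooks_d = (resid ** 2 / ((k + 1) * s2)) * (h / (1 - h) ** 2)
```

The leverages always sum to k + 1, so extreme x-values necessarily concentrate leverage in a few cases.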

Methods of regression
- Hierarchical: when there is a good statistical or theoretical reason for entering one variable first
- Forward (specific to general) or Backward (general to specific): based on statistical criteria
- Stepwise: forward selection plus a removal test
- Use the Enter method to run the usual regression with all variables included at once
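SPSS's Forward and Stepwise methods enter variables by F-to-enter p-values; as a hedged sketch of the same greedy idea, here is forward selection driven by a simple R^2 gain threshold instead (Python/numpy; the data, threshold and variable indices are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
X = rng.normal(size=(n, 3))                # three candidate predictors
y = 2.0 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.1, size=n)

def r_squared(X_sub, y):
    """R^2 of an OLS fit (with intercept) on the chosen columns."""
    A = np.column_stack([np.ones(len(y)), X_sub])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

# Greedily add whichever remaining predictor raises R^2 the most,
# stopping when the best gain falls below a (hypothetical) threshold.
selected, remaining, current_r2 = [], [0, 1, 2], 0.0
threshold = 0.01
while remaining:
    best_r2, best_j = max((r_squared(X[:, selected + [j]], y), j)
                          for j in remaining)
    if best_r2 - current_r2 < threshold:
        break
    selected.append(best_j)
    remaining.remove(best_j)
    current_r2 = best_r2
```

Here the two genuinely informative predictors are entered and the pure-noise one is left out; picking variables on purely statistical criteria like this is exactly what distinguishes Forward/Backward/Stepwise from theory-driven hierarchical entry.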

Regression equation
- Individual coefficients: unstandardised (b) and standardised (beta)
- SPSS gives the t-statistic, s.e.(b) and p-value for the significance of each b
- If p < 0.05, the coefficient is significant at the 5% level
- This is all fine for description, but inference requires specific assumptions: these need to be tested
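The t-statistic SPSS reports is just b divided by its standard error, and the standardised beta is b rescaled by the standard deviations of X and Y. A minimal Python/numpy sketch (invented data, simple regression):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
n, k = len(x), 1

Sxx = np.sum((x - x.mean()) ** 2)
b = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
a = y.mean() - b * x.mean()
resid = y - (a + b * x)
s2 = np.sum(resid ** 2) / (n - k - 1)

se_b = np.sqrt(s2 / Sxx)      # standard error of the slope
t_stat = b / se_b             # compare with t distribution, df = n - k - 1
beta = b * x.std(ddof=1) / y.std(ddof=1)   # standardised coefficient
```

In simple regression the standardised beta equals the correlation between X and Y, which is a handy consistency check.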

“Diagnostic” tests
- No direct specification test (no equivalent of RESET in Microfit)
- Variation in the independent variables (var(X) ≠ 0): check by prior observation
- SPSS provides several checks for multicollinearity: e.g. VIF (1 indicates no collinearity; values above about 10 indicate a problem), tolerance (= 1/VIF), the coefficient covariance matrix, and partial correlations between regressors
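Each VIF is obtained by regressing one independent variable on all the others: VIF_j = 1/(1 - R^2_j). A sketch in Python/numpy (invented data, in which x2 is deliberately built to be nearly collinear with x1):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)   # nearly collinear with x1
x3 = rng.normal(size=n)                   # unrelated to the others
X = np.column_stack([x1, x2, x3])

def vif(X, j):
    """VIF_j = 1 / (1 - R^2) from regressing column j on the other columns."""
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(X)), others])
    beta, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
    resid = X[:, j] - A @ beta
    r2 = 1 - resid @ resid / np.sum((X[:, j] - X[:, j].mean()) ** 2)
    return 1.0 / (1.0 - r2)

vifs = [vif(X, j) for j in range(3)]
tolerances = [1.0 / v for v in vifs]
```

The two collinear columns get large VIFs (small tolerances) while the unrelated one stays near 1, mirroring what SPSS's collinearity statistics would show.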

“Diagnostic” tests
- Autocorrelation: Durbin–Watson (DW) test; values near 2 suggest no first-order autocorrelation (DW(2) is not directly available)
- Linearity and homoscedasticity: scatter plot of standardised residuals against standardised predicted values, which should show no pattern if these assumptions hold
- Normality: ask SPSS for a histogram and P–P plot of the regression residuals (Kolmogorov–Smirnov and Shapiro–Wilk tests can also be run)
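The Durbin–Watson statistic is the ratio of the summed squared successive residual differences to the residual sum of squares, approximately 2(1 - rho) for first-order autocorrelation rho. A sketch with simulated residuals (Python/numpy; the series are invented for illustration):

```python
import numpy as np

def durbin_watson(resid):
    """DW = sum((e_t - e_{t-1})^2) / sum(e_t^2); approximately 2(1 - rho)."""
    d = np.diff(resid)
    return np.sum(d ** 2) / np.sum(resid ** 2)

rng = np.random.default_rng(2)
white = rng.normal(size=500)      # independent "residuals" -> DW near 2

# AR(1) residuals with rho = 0.8 -> strong positive autocorrelation,
# so DW falls well below 2.
ar = np.empty(500)
ar[0] = white[0]
for t in range(1, 500):
    ar[t] = 0.8 * ar[t - 1] + white[t]

dw_white = durbin_watson(white)
dw_ar = durbin_watson(ar)
```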

Conclusions
- SPSS is capable of doing regression analysis of various types
- SPSS offers a range of tests of specification, fit and the underlying assumptions of regression
- The tests are more extensive, but less user-friendly, than in a package like Microfit