
1 Chapter 17: Introduction to Regression

2 Introduction to Linear Regression The Pearson correlation measures the degree to which a set of data points forms a straight-line relationship. Regression is a statistical procedure that determines the equation for the straight line that best fits a specific set of data.

3 Introduction to Linear Regression (cont.) Any straight line can be represented by an equation of the form Y = bX + a, where b and a are constants. The value of b is called the slope constant and determines the direction and degree to which the line is tilted. The value of a is called the Y-intercept and determines the point where the line crosses the Y-axis.
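
To make the roles of b and a concrete, here is a minimal Python sketch (not part of the original slides; the values b = 2 and a = 5 are hypothetical):

    # A line with slope b = 2 and Y-intercept a = 5 (hypothetical values).
    b, a = 2, 5
    for x in [0, 1, 2, 3]:
        y = b * x + a                  # Y = bX + a
        print(f"X = {x}, Y = {y}")     # Y changes by b for each 1-point increase in X; at X = 0, Y = a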

5 Introduction to Linear Regression (cont.) How well a set of data points fits a straight line can be measured by calculating the distance between the data points and the line. The total error between the data points and the line is obtained by squaring each distance and then summing the squared values. The regression equation is designed to produce the minimum sum of squared errors.
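
As a rough sketch of this idea (the data values below are made up for illustration), the total squared error between a candidate line and a set of points can be computed like this:

    # Hypothetical data points and a candidate line Y = bX + a.
    X = [1, 2, 3, 4, 5]
    Y = [3, 5, 4, 8, 9]
    b, a = 1.5, 1.0

    # Square each vertical distance between a point and the line, then sum.
    predicted = [b * x + a for x in X]
    total_squared_error = sum((y - y_hat) ** 2 for y, y_hat in zip(Y, predicted))
    print(total_squared_error)   # the regression line is the line that makes this as small as possible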

6 Introduction to Linear Regression (cont.) The equation for the regression line is Ŷ = bX + a, where the slope is b = SP/SS_X (SP is the sum of products of deviations and SS_X is the sum of squared deviations for X) and the Y-intercept is a = M_Y − bM_X (M_X and M_Y are the sample means).
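
A minimal Python sketch of these formulas (the X and Y scores are made up):

    X = [1, 2, 3, 4, 5]          # hypothetical predictor scores
    Y = [3, 5, 4, 8, 9]          # hypothetical criterion scores
    n = len(X)
    mx = sum(X) / n              # M_X
    my = sum(Y) / n              # M_Y
    sp = sum((x - mx) * (y - my) for x, y in zip(X, Y))   # SP, sum of products of deviations
    ss_x = sum((x - mx) ** 2 for x in X)                  # SS_X, sum of squared deviations of X
    b = sp / ss_x                # slope
    a = my - b * mx              # Y-intercept
    print(b, a)                  # together they define the regression line Ŷ = bX + a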

8 Introduction to Linear Regression (cont.) The ability of the regression equation to accurately predict the Y values is measured by first computing the proportion of the Y-score variability that is predicted by the regression equation and the proportion that is not predicted. The predicted portion is SS_regression = r²SS_Y and the unpredicted portion is SS_residual = (1 − r²)SS_Y.
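
A sketch of this partition of the Y variability, again with made-up data:

    X = [1, 2, 3, 4, 5]                       # hypothetical data
    Y = [3, 5, 4, 8, 9]
    n = len(X)
    mx, my = sum(X) / n, sum(Y) / n
    sp = sum((x - mx) * (y - my) for x, y in zip(X, Y))
    ss_x = sum((x - mx) ** 2 for x in X)
    ss_y = sum((y - my) ** 2 for y in Y)
    r = sp / (ss_x * ss_y) ** 0.5             # Pearson correlation
    ss_regression = r ** 2 * ss_y             # predicted variability
    ss_residual = (1 - r ** 2) * ss_y         # unpredicted variability
    print(r ** 2, ss_regression, ss_residual)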

9 Introduction to Linear Regression (cont.) The unpredicted variability can be used to compute the standard error of estimate, which is a measure of the average distance between the actual Y values and the predicted Y values: standard error of estimate = √(SS_residual/df), with df = n − 2.
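
A sketch of the standard error of estimate with the same kind of made-up data (df = n − 2 because two parameters, b and a, are estimated):

    X = [1, 2, 3, 4, 5]
    Y = [3, 5, 4, 8, 9]
    n = len(X)
    mx, my = sum(X) / n, sum(Y) / n
    b = sum((x - mx) * (y - my) for x, y in zip(X, Y)) / sum((x - mx) ** 2 for x in X)
    a = my - b * mx
    ss_residual = sum((y - (b * x + a)) ** 2 for x, y in zip(X, Y))   # unpredicted variability
    se_estimate = (ss_residual / (n - 2)) ** 0.5                      # standard error of estimate
    print(se_estimate)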

10 Introduction to Linear Regression (cont.) Finally, the overall significance of the regression equation can be evaluated by computing an F-ratio. A significant F-ratio indicates that the equation predicts a significant portion of the variability in the Y scores (more than would be expected by chance alone). To compute the F-ratio, you first calculate a variance or MS for the predicted variability and for the unpredicted variability:

11 Introduction to Linear Regression (cont.) MS_regression = SS_regression/df_regression with df_regression = 1, and MS_residual = SS_residual/df_residual with df_residual = n − 2. The F-ratio is then F = MS_regression/MS_residual.
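
A sketch of the full F-ratio computation (hypothetical data; SciPy is used only as an assumed convenience for looking up the p-value and is not part of the original slides):

    from scipy import stats                    # assumed available, only for the p-value

    X = [1, 2, 3, 4, 5]
    Y = [3, 5, 4, 8, 9]
    n = len(X)
    mx, my = sum(X) / n, sum(Y) / n
    sp = sum((x - mx) * (y - my) for x, y in zip(X, Y))
    ss_x = sum((x - mx) ** 2 for x in X)
    ss_y = sum((y - my) ** 2 for y in Y)
    r2 = sp ** 2 / (ss_x * ss_y)               # r squared
    ms_regression = (r2 * ss_y) / 1            # df_regression = 1
    ms_residual = ((1 - r2) * ss_y) / (n - 2)  # df_residual = n - 2
    F = ms_regression / ms_residual
    p = stats.f.sf(F, 1, n - 2)                # proportion of the F distribution beyond this value
    print(F, p)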

13 Introduction to Multiple Regression with Two Predictor Variables In the same way that linear regression produces an equation that uses values of X to predict values of Y, multiple regression produces an equation that uses two different variables (X₁ and X₂) to predict values of Y. The equation is determined by a least squared error solution that minimizes the squared distances between the actual Y values and the predicted Y values.

15 Introduction to Multiple Regression with Two Predictor Variables (cont.) For two predictor variables, the general form of the multiple regression equation is: Ŷ = b₁X₁ + b₂X₂ + a. The ability of the multiple regression equation to accurately predict the Y values is measured by first computing the proportion of the Y-score variability that is predicted by the regression equation and the proportion that is not predicted.
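
A sketch of the least squared error solution for two predictors using NumPy (the X₁, X₂, and Y scores are hypothetical, and np.linalg.lstsq is one convenient way to obtain the coefficients):

    import numpy as np

    X1 = np.array([1, 2, 3, 4, 5, 6], dtype=float)     # hypothetical predictor 1
    X2 = np.array([2, 1, 4, 3, 6, 5], dtype=float)     # hypothetical predictor 2
    Y  = np.array([3, 4, 6, 7, 10, 11], dtype=float)   # hypothetical criterion

    # Design matrix with a column of ones for the constant a.
    A = np.column_stack([X1, X2, np.ones_like(X1)])
    (b1, b2, a), *_ = np.linalg.lstsq(A, Y, rcond=None)   # minimizes the sum of squared errors
    Y_hat = b1 * X1 + b2 * X2 + a                          # Ŷ = b1·X1 + b2·X2 + a
    print(b1, b2, a)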

16 Introduction to Multiple Regression with Two Predictor Variables (cont.) As with linear regression, the unpredicted variability (SS and df) can be used to compute a standard error of estimate that measures the standard distance between the actual Y values and the predicted values. With two predictors, df = n − 3, so the standard error of estimate is √(SS_residual/(n − 3)).

17 Introduction to Multiple Regression with Two Predictor Variables (cont.) In addition, the overall significance of the multiple regression equation can be evaluated with an F-ratio: F = MS_regression/MS_residual, where MS_regression = SS_regression/2 and MS_residual = SS_residual/(n − 3) for two predictors.
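
A sketch that evaluates a two-predictor equation, computing R², the standard error of estimate, and the F-ratio (the data are hypothetical; SciPy's F distribution is an assumed convenience for the p-value):

    import numpy as np
    from scipy import stats

    X1 = np.array([1, 2, 3, 4, 5, 6], dtype=float)
    X2 = np.array([2, 1, 4, 3, 6, 5], dtype=float)
    Y  = np.array([3, 4, 6, 7, 10, 11], dtype=float)
    n = len(Y)

    A = np.column_stack([X1, X2, np.ones_like(X1)])
    coefs, *_ = np.linalg.lstsq(A, Y, rcond=None)
    Y_hat = A @ coefs

    ss_y = np.sum((Y - Y.mean()) ** 2)
    ss_residual = np.sum((Y - Y_hat) ** 2)          # unpredicted variability
    ss_regression = ss_y - ss_residual              # predicted variability
    R2 = ss_regression / ss_y

    se_estimate = np.sqrt(ss_residual / (n - 3))    # df = n - 3 with two predictors
    F = (ss_regression / 2) / (ss_residual / (n - 3))
    p = stats.f.sf(F, 2, n - 3)
    print(R2, se_estimate, F, p)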

18 Partial Correlation A partial correlation measures the relationship between two variables (X and Y) while eliminating the influence of a third variable (Z). Partial correlations are used to reveal the real, underlying relationship between two variables when researchers suspect that the apparent relation may be distorted by a third variable.

19 Partial Correlation (cont.) For example, there probably is no underlying relationship between weight and mathematics skill for elementary school children. However, both of these variables are positively related to age: Older children weigh more and, because they have spent more years in school, have higher mathematics skills.

20 Partial Correlation (cont.) As a result, weight and mathematics skill will show a positive correlation for a sample of children that includes several different ages. A partial correlation between weight and mathematics skill, holding age constant, would eliminate the influence of age and show the true correlation, which is near zero.
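
A sketch of the partial correlation between weight (X) and mathematics skill (Y) with age (Z) held constant, using the standard formula r_XY·Z = (r_XY − r_XZ·r_YZ) / √((1 − r_XZ²)(1 − r_YZ²)); the scores below are made up to mimic the example:

    import numpy as np

    # Hypothetical scores for a mixed-age sample of children.
    weight = np.array([40, 45, 50, 55, 60, 65, 70, 75], dtype=float)   # X
    math   = np.array([20, 22, 30, 28, 38, 40, 50, 48], dtype=float)   # Y
    age    = np.array([6, 6, 7, 7, 8, 8, 9, 9], dtype=float)           # Z

    def r(u, v):
        """Pearson correlation between two sets of scores."""
        return np.corrcoef(u, v)[0, 1]

    r_xy, r_xz, r_yz = r(weight, math), r(weight, age), r(math, age)
    partial = (r_xy - r_xz * r_yz) / np.sqrt((1 - r_xz ** 2) * (1 - r_yz ** 2))
    print(r_xy, partial)   # r_xy is about .97, but the partial correlation is essentially zero once age is held constant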