C82MCP Diploma Statistics, School of Psychology, University of Nottingham: Linear Regression and Linear Prediction

Slide 1: Linear Regression and Linear Prediction
Predicting the score on one variable from the score on another variable is called regression. In general, statistical prediction is achieved by producing a simplified statement of the relationship between the two variables. The most commonly assumed relationship is a linear (straight-line) relationship.

Slide 2: The Linear Equation
A linear equation is defined as

    Y = a + bX

where X is the independent variable, Y is the dependent variable, b is the slope of the line, and a is the intercept.
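As a minimal sketch (not part of the original slides), the equation can be written as a small Python function; the values a = 2.0 and b = 0.5 are arbitrary illustration values:

    # Evaluate the linear equation Y = a + bX for a chosen intercept and slope.
    def predict(x, a=2.0, b=0.5):
        # a is the intercept, b is the slope.
        return a + b * x

    print(predict(0))   # at X = 0 the prediction equals the intercept, 2.0
    print(predict(10))  # 2.0 + 0.5 * 10 = 7.0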

Slide 3: An Example of a Positive Relationship
[Figure: plot of a linear equation with a positive slope, so Y increases as X increases.]

Slide 4: An Example of a Negative Relationship
[Figure: plot of a linear equation with a negative slope, so Y decreases as X increases.]

Slide 5: Simple Linear Regression Coefficients
Since we are trying to fit an equation of the form Y = a + bX, we need to find coefficients a and b that lead the line to pass through the mean of the dependent variable scores and minimise the "error of prediction".

Slide 6: Simple Linear Regression Coefficients
The following values of the coefficients minimise the "error of prediction" (the least-squares solution):

    b = Σ(X − X̄)(Y − Ȳ) / Σ(X − X̄)²
    a = Ȳ − bX̄
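A short sketch of these formulas in Python; the data values below are made up for illustration, since the slide's own dataset is not reproduced in the transcript:

    # Least-squares slope and intercept computed directly from the definitions.
    x = [5, 5, 6, 6, 7, 7, 8, 8, 9, 9]                      # hypothetical ages
    y = [5.2, 6.0, 6.1, 6.5, 6.9, 7.4, 7.2, 8.0, 8.1, 8.6]  # hypothetical mean words recalled

    mean_x = sum(x) / len(x)
    mean_y = sum(y) / len(y)

    sp_xy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))  # sum of products
    ss_x = sum((xi - mean_x) ** 2 for xi in x)                          # sum of squares of X

    b = sp_xy / ss_x          # slope
    a = mean_y - b * mean_x   # intercept
    print(b, a)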

Slide 7: Example Data
The table on the slide gives the mean number of words recalled by primary school children of different ages after listening to a spoken list of words. Is there a linear relationship between these two variables (age and mean recall)?

Slide 8: Example Data
When the data are plotted on a scattergraph, the points do not all fall on a single straight line. We need a way to describe the best-fitting straight-line relationship.
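One way to see this is to plot the points together with the fitted line. This sketch uses matplotlib and the same hypothetical age/recall values as above, not the slide's actual data:

    import matplotlib.pyplot as plt

    # Hypothetical age / recall values for illustration only.
    x = [5, 5, 6, 6, 7, 7, 8, 8, 9, 9]
    y = [5.2, 6.0, 6.1, 6.5, 6.9, 7.4, 7.2, 8.0, 8.1, 8.6]

    # Least-squares fit using the same formulas as above.
    mean_x, mean_y = sum(x) / len(x), sum(y) / len(y)
    sp_xy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    ss_x = sum((xi - mean_x) ** 2 for xi in x)
    b = sp_xy / ss_x
    a = mean_y - b * mean_x

    plt.scatter(x, y, label="observed scores")
    plt.plot([min(x), max(x)], [a + b * min(x), a + b * max(x)], label="best-fitting line")
    plt.xlabel("Age (years)")
    plt.ylabel("Mean words recalled")
    plt.legend()
    plt.show()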

Slide 9: Example Linear Regression
[Figure: scattergraph of the example data with the fitted regression line drawn through the points.]

Slide 10: Calculating the Slope
The slope is given by

    b = Σ(X − X̄)(Y − Ȳ) / Σ(X − X̄)²

From the example calculations we obtain a positive slope, so there is a positive relationship between age and the mean number of recalled words.

Slide 11: Calculating the Intercept
The intercept is given by

    a = Ȳ − bX̄

For the example data the intercept is 3.62, so the regression line crosses the y-axis at y = 3.62.
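For checking hand calculations, scipy returns the same slope and intercept in a single call. The data here are the hypothetical values used in the earlier sketches, not the slide's:

    from scipy.stats import linregress

    x = [5, 5, 6, 6, 7, 7, 8, 8, 9, 9]                      # hypothetical ages
    y = [5.2, 6.0, 6.1, 6.5, 6.9, 7.4, 7.2, 8.0, 8.1, 8.6]  # hypothetical recall scores

    result = linregress(x, y)
    print(result.slope, result.intercept)   # least-squares b and a
    print(result.rvalue ** 2)               # proportion of variability accounted for (r squared)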

Slide 12: Example Linear Equation
For this example data the complete regression equation combines the intercept of 3.62 with the slope calculated on the previous slides. If we look at one of the five-year-olds, who recalled a mean of 6 words, the equation predicts a score of 5.81. The residual (i.e. the difference between the actual score and the predicted score) for this five-year-old is 6 − 5.81 = 0.19, which is small.
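A sketch of the prediction and residual for that five-year-old. The slope value 0.438 is an assumption inferred from the intercept (3.62) and the predicted score (5.81) quoted on the slide, since the slide's own slope value is not reproduced in the transcript:

    a = 3.62    # intercept from the slide
    b = 0.438   # slope inferred from 3.62 + b * 5 = 5.81 (assumption, see lead-in)

    x = 5            # age of the child in years
    observed = 6.0   # mean number of words this child recalled

    predicted = a + b * x            # 3.62 + 0.438 * 5 = 5.81
    residual = observed - predicted  # 6.0 - 5.81, approximately 0.19
    print(predicted, residual)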

Slide 13: The Statistical Test of the Regression Equation
"Does the regression equation significantly predict the data that have been obtained?" The way to approach this question is through the variability in the Y scores that the regression equation accounts for.

Slide 14: Estimates of Variability
The differences between the predicted and the observed scores are known as the residuals. We can use the residuals as a measure of the variability of the scores around the regression line.

Slide 15: Testing the Regression Equation
We can test the amount of variability that the regression equation accounts for using an F-ratio. The estimates of variance used in the F-ratio are known as Mean Squares. Mean Squares are defined as:

    Mean Square = Sum of Squares / degrees of freedom

Slide 16: Sum of Squares of the Regression
The sum of squares of the regression can be calculated using the formula

    SS_regression = b² × SS_X

where SS_X = Σ(X − X̄)² is the sum of squares of X and b is the slope.

Slide 17: Sum of Squares of the Residual
The sum of squares of the residual can be calculated using the formula

    SS_residual = Σ(Y − Ŷ)²

where Ŷ = a + bX is the score predicted by the regression equation.

Slide 18: The Mean Squares
The mean square for the regression is given by:

    MS_regression = SS_regression / 1

(the regression has one degree of freedom). The degrees of freedom for the residual are N − 2, so the mean square for the residual is:

    MS_residual = SS_residual / (N − 2)
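A short sketch of these definitions; the sum-of-squares values passed in are placeholders, not the slide's numbers:

    # Turn sums of squares into mean squares and an F-ratio for simple regression.
    def regression_f_ratio(ss_regression, ss_residual, n):
        ms_regression = ss_regression / 1    # regression df = 1
        ms_residual = ss_residual / (n - 2)  # residual df = N - 2
        return ms_regression / ms_residual

    print(regression_f_ratio(ss_regression=8.0, ss_residual=1.6, n=10))  # placeholder values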

Slide 19: Testing the Regression Equation
The regression equation is tested by forming an F-ratio from these mean squares:

    F = MS_regression / MS_residual

Slide 20: Sum of Squares of X
The sum of squares of X is given by:

    SS_X = Σ(X − X̄)²

For the example data the sum of squares of X is obtained by applying this formula to the children's ages.

Slide 21: The Sum of Squares of the Regression
The sum of squares of the regression is given by:

    SS_regression = b² × SS_X

For the example data this is obtained by combining the slope with the sum of squares of X calculated above.

Slide 22: The Sum of Squares of the Residual
The sum of squares of the residual is given by:

    SS_residual = SS_Y − SS_regression

where SS_Y = Σ(Y − Ȳ)² is the total sum of squares of the dependent variable. For the example data this is the variability in the recall scores that is left over once the regression has been accounted for.

Slide 23: The Mean Squares
For the example data, the mean square for the regression is given by:

    MS_regression = SS_regression / 1

The mean square for the residual is given by:

    MS_residual = SS_residual / (N − 2)

The F-ratio is given by:

    F = MS_regression / MS_residual
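The whole chain of calculations can be sketched end to end. The data below are hypothetical stand-ins for the slide's age/recall table, so the numerical results illustrate the method rather than reproduce the slide's values:

    # Simple linear regression and its F-test from first principles.
    x = [5, 5, 6, 6, 7, 7, 8, 8, 9, 9]                      # hypothetical ages
    y = [5.2, 6.0, 6.1, 6.5, 6.9, 7.4, 7.2, 8.0, 8.1, 8.6]  # hypothetical recall scores
    n = len(x)

    mean_x, mean_y = sum(x) / n, sum(y) / n
    ss_x = sum((xi - mean_x) ** 2 for xi in x)
    ss_y = sum((yi - mean_y) ** 2 for yi in y)
    sp_xy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))

    b = sp_xy / ss_x          # slope
    a = mean_y - b * mean_x   # intercept

    ss_regression = b ** 2 * ss_x       # variability accounted for by the line
    ss_residual = ss_y - ss_regression  # variability left around the line

    ms_regression = ss_regression / 1    # regression df = 1
    ms_residual = ss_residual / (n - 2)  # residual df = N - 2
    f_ratio = ms_regression / ms_residual

    print(a, b, ss_regression, ss_residual, f_ratio)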

Slide 24: Results of the Analysis
The results of this analysis are presented in a summary table. The F-ratio is looked up in tables using the regression and residual degrees of freedom. For this experiment, with 1 and 8 df, the critical value of F (5.32 at the 0.05 level) is exceeded, so the regression equation is a significant predictor of the data.
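Instead of printed tables, the critical value and an exact p-value can be obtained from scipy's F distribution; the observed F below is a placeholder, not the slide's result:

    from scipy.stats import f

    df_regression, df_residual = 1, 8

    critical_f = f.ppf(0.95, df_regression, df_residual)  # about 5.32 at the 0.05 level
    print(critical_f)

    observed_f = 40.0                                       # placeholder F-ratio
    p_value = f.sf(observed_f, df_regression, df_residual)  # probability of an F this large by chance
    print(p_value)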

Slide 25: Proportion of Variability Accounted For
One index of the success of the regression equation is the proportion of variability accounted for:

    r² = SS_regression / SS_Y

For this analysis r² = 0.83, which means that 83% of the variability in the dependent variable scores can be accounted for by the regression equation.
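As a sketch, the proportion of variability follows directly from the sums of squares; the values used here are placeholders rather than the slide's figures:

    # Proportion of variability in Y accounted for by the regression.
    ss_regression = 8.0   # placeholder
    ss_residual = 1.6     # placeholder
    ss_y = ss_regression + ss_residual

    r_squared = ss_regression / ss_y
    print(r_squared)      # with these placeholders, about 0.83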

Slide 26: Summary
The regression equation is a significant predictor of these data. There is a linear relationship between the mean number of words recalled and the age of the child.