LINEAR REGRESSION: What it Is and How it Works

Overview
- What is Bivariate Linear Regression?
- The Regression Equation
- How It’s Based on r

What is Bivariate Linear Regression?
Regression predicts future scores on Y from measured scores on X. The predictions are based on a correlation from a sample in which both X and Y were measured.

Why is it Bivariate?
Two variables are involved: X and Y.
- X: the independent (predictor) variable
- Y: the dependent (outcome, or criterion) variable

Why is it Linear?
Regression is based on the linear relationship (correlation) between X and Y, so the relationship can be described by the equation for a straight line.

The Regression Equation
$y_i = b_1 x_i + b_0 + e_i$, with predicted score $\hat{y}_i = b_1 x_i + b_0$
- $\hat{y}_i$ = predicted score on the criterion variable
- $x_i$ = measured score on the predictor variable
- $b_1$ = slope
- $b_0$ = intercept
- $e_i$ = residual (error score)
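A minimal sketch of the regression equation in code; the slope, intercept, and paired scores below are hypothetical values chosen only for illustration:

```python
# Regression equation: predicted y = b1 * x + b0; the residual e is the
# difference between the actual and predicted criterion score.

def predict(x, b1, b0):
    """Predicted criterion score for a measured predictor score x."""
    return b1 * x + b0

# Hypothetical slope, intercept, and paired scores (illustration only)
b1, b0 = 0.5, 2.0        # slope and intercept
x_i, y_i = 10.0, 6.5     # measured predictor score and actual criterion score

y_hat = predict(x_i, b1, b0)   # predicted score: 0.5 * 10 + 2.0 = 7.0
e_i = y_i - y_hat              # residual (error score): 6.5 - 7.0 = -0.5
print(y_hat, e_i)
```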

Least-Squares Solution
The slope and intercept are chosen to minimize the sum of squared errors of prediction. The error (residual) is the difference between the predicted y and the actual y.
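A sketch of the least-squares solution, assuming the paired scores are held in plain Python lists; the data values are hypothetical:

```python
# Least-squares slope and intercept: the b1 and b0 that minimize the sum of
# squared residuals over all paired (x, y) scores.

def least_squares(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = sum of cross-product deviations / sum of squared deviations in x
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    b1 = sxy / sxx
    b0 = mean_y - b1 * mean_x   # the fitted line passes through (mean_x, mean_y)
    return b1, b0

xs = [1, 2, 3, 4, 5]            # hypothetical predictor scores (X)
ys = [2.1, 2.9, 4.2, 4.8, 6.1]  # hypothetical criterion scores (Y)
b1, b0 = least_squares(xs, ys)
print(b1, b0)
```

Any other straight line drawn through the same scatter plot produces a larger sum of squared residuals than this one.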

How It’s Based on r
Replace x and y with standardized scores $z_X$ and $z_Y$:
$z_Y = b_1 z_X + b_0$
The y-intercept becomes 0:
$z_Y = b_1 z_X$
and the slope becomes r:
$z_Y = r z_X$
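A sketch of the standardized form, assuming z-scores are computed with the population standard deviation; the data values are hypothetical:

```python
# When X and Y are converted to z-scores, the least-squares intercept is 0
# and the slope equals Pearson's r, so the predicted z_Y is r * z_X.
import statistics

xs = [1, 2, 3, 4, 5]            # hypothetical predictor scores
ys = [2.1, 2.9, 4.2, 4.8, 6.1]  # hypothetical criterion scores

def z_scores(values):
    m = statistics.mean(values)
    s = statistics.pstdev(values)        # population SD, so sum(z**2) == n
    return [(v - m) / s for v in values]

zx, zy = z_scores(xs), z_scores(ys)

# The least-squares slope for z-scores reduces to mean(zx * zy), which is r
r = sum(a * b for a, b in zip(zx, zy)) / len(zx)
predicted_zy = [r * a for a in zx]       # predicted z_Y = r * z_X
print(round(r, 3))
```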

Take-Home Point
Linear regression is a way of using information about a correlation to make predictions.