REGRESSION What is Regression? What is the Regression Equation? What is the Least-Squares Solution? How is Regression Based on Correlation? What are the Assumptions for Using Regression?

REGRESSION What is Regression? What is the Regression Equation? What is the Least-Squares Solution? How is Regression Based on Correlation? What are the Assumptions for Using Regression?

What is Regression? Predict future scores on Y based on measured scores on X. Predictions are based on a correlation from a sample in which both X and Y were measured.

What is the Regression Equation? The equation is linear: y = bx + a, where y is the predicted score on Y, x is the measured score on X, b is the slope, and a is the y-intercept.
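A tiny sketch of the equation in use; the slope and intercept values here (b = 0.65, a = 12.0) are made up purely for illustration:

```python
# Hypothetical regression coefficients (not from the presentation).
b = 0.65   # slope
a = 12.0   # y-intercept

def predict(x):
    """Predicted Y score for a measured X score, using y = bx + a."""
    return b * x + a

print(predict(50))   # 0.65 * 50 + 12.0 = 44.5
```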

Regression Line [figure: scatterplot of Y scores against X scores (low to high) with the regression line drawn through the points]

What is the Least-Squares Solution? Draw the regression line that minimizes the squared error in prediction. Error in prediction is the difference between the predicted y and the actual y; squaring the errors means positive and negative errors both count and cannot cancel out.
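The sketch below works through the least-squares arithmetic on a small, made-up data set; the numbers are illustrative only:

```python
# Minimal least-squares sketch on hypothetical data.
x = [1.0, 2.0, 3.0, 4.0, 5.0]   # measured X scores (made up)
y = [2.1, 3.9, 6.2, 8.1, 9.8]   # measured Y scores (made up)

n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n

# Least-squares slope: covariance of X and Y over the variance of X.
b = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / \
    sum((xi - mean_x) ** 2 for xi in x)

# Least-squares intercept: forces the line through (mean_x, mean_y).
a = mean_y - b * mean_x

# The quantity being minimized: the sum of squared prediction errors,
# so positive and negative errors contribute equally.
predicted = [b * xi + a for xi in x]
sse = sum((yi - yhat) ** 2 for yi, yhat in zip(y, predicted))

print(f"slope b = {b:.3f}, intercept a = {a:.3f}, SSE = {sse:.3f}")
```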

How is Regression Based on Correlation? Replace x and y with their standardized scores z_X and z_Y: z_Y = b z_X + a. In standardized form the y-intercept becomes 0, so z_Y = b z_X, and the slope becomes r, so z_Y = r z_X.
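One way to see this is to standardize both variables and refit the line. The sketch below (same made-up data as above) checks that the intercept comes out 0 and the slope comes out equal to Pearson's r; statistics.correlation assumes Python 3.10 or later:

```python
# Sketch: regressing z-scores on z-scores (hypothetical data).
import statistics

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]

def z_scores(values):
    m = statistics.mean(values)
    s = statistics.pstdev(values)   # population SD; the sample SD gives the same slope
    return [(v - m) / s for v in values]

zx, zy = z_scores(x), z_scores(y)

# Least-squares slope and intercept of zY on zX.
slope = sum(zi * zj for zi, zj in zip(zx, zy)) / sum(zi ** 2 for zi in zx)
intercept = statistics.mean(zy) - slope * statistics.mean(zx)

r = statistics.correlation(x, y)    # Pearson r (Python 3.10+)
print(f"slope = {slope:.3f}, intercept = {intercept:.3f}, r = {r:.3f}")
```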

What are the Assumptions for Using Regression? Predict only for the same population from which you sampled; normal distributions for both variables; a linear relationship between the variables; and homoscedasticity, meaning the y scores are spread out to the same degree at every x score.
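One informal way to eyeball the homoscedasticity assumption is to compare the spread of the residuals at low versus high X. This is a rough sketch on made-up data, not a formal test:

```python
# Informal homoscedasticity check on hypothetical data.
import statistics

x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [2.2, 3.1, 4.3, 4.9, 6.4, 6.8, 8.1, 8.7]   # made-up scores

# Least-squares fit (same formulas as in the earlier sketch).
mx, my = statistics.mean(x), statistics.mean(y)
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
a = my - b * mx

residuals = [yi - (b * xi + a) for xi, yi in zip(x, y)]
low = [res for xi, res in zip(x, residuals) if xi <= mx]
high = [res for xi, res in zip(x, residuals) if xi > mx]

# Under homoscedasticity the two spreads should be roughly similar.
print(statistics.stdev(low), statistics.stdev(high))
```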

Heteroscedasticity [figure: scatterplot of Y against X (low to high) illustrating unequal spread of the Y scores across X]

Homoscedasticity [figure: scatterplot of Y against X (low to high) illustrating equal spread of the Y scores across X]

What is the Standard Error of the Estimate? The average distance of the y scores from the predicted y scores; an index of how far off predictions are expected to be. A larger r means a smaller standard error.
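A minimal sketch of the calculation, again on made-up data. Conventions differ on the denominator (N in purely descriptive treatments, N - 2 when the line is used for inference); either way, a larger r leaves less unexplained spread and a smaller standard error. statistics.correlation assumes Python 3.10 or later:

```python
# Standard error of the estimate on hypothetical data.
import math
import statistics

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(x)
mx, my = statistics.mean(x), statistics.mean(y)
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
a = my - b * mx

sse = sum((yi - (b * xi + a)) ** 2 for xi, yi in zip(x, y))

se_descriptive = math.sqrt(sse / n)         # "average" prediction error
se_inferential = math.sqrt(sse / (n - 2))   # denominator used for inference

# Equivalent descriptive form: s_y * sqrt(1 - r^2), which shows directly
# that a larger r produces a smaller standard error.
r = statistics.correlation(x, y)            # Pearson r (Python 3.10+)
print(se_descriptive, statistics.pstdev(y) * math.sqrt(1 - r ** 2), se_inferential)
```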