REGRESSION G&W p.498-504 http://stattrek.com/AP-Statistics-1/Regression-Example.aspx?Tutorial=AP.

THIS IS A TABLE YOU SHOULD KNOW BEFORE ANALYZING YOUR DATA (AND BEFORE THE FINAL)

Dependent Variable | Independent Variable(s)              | Method
Nominal/Ordinal    | Nominal/Ordinal                      | Chi-Square test
Interval/Ratio     | Nominal/Ordinal                      | T-test (2 groups) or ANOVA (3+ groups); Multifactorial ANOVA (2+ IVs)
Interval/Ratio     | Interval/Ratio                       | Regression
Interval/Ratio     | Interval/Ratio AND Nominal/Ordinal   | ANCOVA/GLM
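
The decision table above can be sketched as a small lookup function. This is only an illustration of the table's logic: the function name `choose_method` and the level names are invented here, not from any statistics package.

```python
# A minimal sketch of the test-selection table as a lookup function.
def choose_method(dv_level, iv_levels):
    """Pick an analysis method from the measurement level of the
    dependent variable (dv_level) and of each independent variable."""
    categorical = {"nominal", "ordinal"}
    continuous = {"interval", "ratio"}
    if dv_level in categorical:
        return "Chi-Square test"
    if dv_level in continuous:
        if all(iv in categorical for iv in iv_levels):
            return "T-test / ANOVA" if len(iv_levels) == 1 else "Multifactorial ANOVA"
        if all(iv in continuous for iv in iv_levels):
            return "Regression"
        return "ANCOVA/GLM"  # mix of categorical and continuous IVs
    raise ValueError("unknown measurement level")

print(choose_method("interval", ["ratio"]))             # Regression
print(choose_method("interval", ["nominal", "ratio"]))  # ANCOVA/GLM
```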

Basics of bivariate regression – when do we need it?
- We have a dependent variable that is interval or ratio level.
- We have an independent variable that is interval or ratio (there are ways of incorporating ordinal and nominal variables).
- We have a posited directional association between the two variables.
- We have a theory that supports the posited directional association.

Technique: Algebra of the straight line
Example line: y = 2x + 3, which passes through the points (3, 9) and (6, 15).
General form of the line: y = bx + c
b = slope = rise/run = (15 − 9)/(6 − 3) = 6/3 = 2
c = y-intercept = 3
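
The rise-over-run arithmetic on this slide can be reproduced directly:

```python
# Slope and intercept of the line through (3, 9) and (6, 15),
# reproducing the slide's rise-over-run arithmetic.
x1, y1 = 3, 9
x2, y2 = 6, 15
b = (y2 - y1) / (x2 - x1)   # slope: rise / run = 6 / 3 = 2
c = y1 - b * x1             # y-intercept: 9 - 2*3 = 3
print(f"y = {b:g}x + {c:g}")  # y = 2x + 3
```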

Regression line – we have multiple data points
y: Reading Scores
x: Family Income
Assumptions about deviations from the line?

Technique: Statistics of the straight line
We have multiple data points (not just two), so we need to find a line that best fits them.
Define "best fit": minimum squared deviations.
The regression equation is y = a + b·x + ε, where y is the dependent variable, x is the independent variable, a (the intercept) and b (the slope) are the regression coefficients, and ε is the error.

Technique: Statistics of the straight line
The line that "best fits" has the following slope:
b = r_xy · (s_y / s_x)
where b is the regression coefficient of y on x, r_xy is the correlation coefficient of x and y, and s_y and s_x are the standard deviations of y and x.
The line that "best fits" has the following intercept:
a = ȳ − b·x̄
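
A minimal sketch of these formulas on made-up data, checking that the slope r·(s_y/s_x) agrees with the equivalent covariance-over-variance form:

```python
# Slope b = r * (s_y / s_x) and intercept a = mean(y) - b * mean(x),
# computed on invented illustration data.
from statistics import mean, stdev

xs = [2.0, 4.0, 6.0, 8.0, 10.0]
ys = [3.1, 7.2, 10.8, 15.1, 18.9]

mx, my = mean(xs), mean(ys)
sx, sy = stdev(xs), stdev(ys)
# Pearson correlation of x and y
r = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / ((len(xs) - 1) * sx * sy)

b = r * sy / sx        # regression coefficient of y on x
a = my - b * mx        # intercept

# The same slope also equals cov(x, y) / var(x):
b_check = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
           / sum((x - mx) ** 2 for x in xs))
print(b, a)
```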

Things to know
- A significant regression coefficient does not "prove" causality.
- A regression can provide "predicted" values.
- Predicted values are "point" estimates.
- Confidence intervals around the point estimates provide very important information.

Basics of linear regression
Hypothesis: a variable (or variables) x (x1, x2, x3, …) cause(s) another variable y.
Corollary: x can be used to partially predict y.
Mathematical implication: y = f(x) + ε, where f(x) is linear and ε is minimal and random.

Example: Regression approach
Error in prediction is minimal, and random.

The way these assumptions look
Child's IQ = 20.99 + 0.78*Mother's IQ

Prediction
Child's IQ = 20.99 + 0.78*Mother's IQ
Predicted Case 3 IQ = 20.99 + 0.78*110 = 106.83
Actual Case 3 IQ = 102
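
The prediction step can be checked numerically. Note that with the rounded coefficients shown (20.99 and 0.78) the arithmetic gives 106.79, so the slide's 106.83 presumably reflects unrounded coefficients:

```python
# Point prediction and residual for Case 3, using the rounded
# coefficients from the slide (unrounded coefficients would give
# the slide's 106.83 rather than 106.79).
intercept, slope = 20.99, 0.78

def predict_child_iq(mother_iq):
    return intercept + slope * mother_iq

predicted = predict_child_iq(110)
residual = 102 - predicted          # actual minus predicted
print(round(predicted, 2), round(residual, 2))  # 106.79 -4.79
```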

How are the regression coefficients computed?
Minimize the squared deviations between actual and predicted values:
minimize Σ(yᵢ − ŷᵢ)² = Σ(yᵢ − a − b·xᵢ)²
so that ε is "minimal" and random.
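
A numerical sketch (on invented data) of what "minimize squared deviations" means: the least-squares coefficients give a smaller sum of squared errors than any nearby line.

```python
# The OLS slope/intercept minimize the sum of squared errors (SSE):
# nudging either coefficient never decreases the SSE.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n
b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
     / sum((x - mx) ** 2 for x in xs))
a = my - b * mx

def sse(intercept, slope):
    return sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))

best = sse(a, b)
# Any perturbation of the coefficients increases the squared error:
assert all(sse(a + da, b + db) >= best
           for da in (-0.5, 0.0, 0.5) for db in (-0.2, 0.0, 0.2))
print(round(a, 3), round(b, 3), round(best, 4))
```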

Interpreting coefficients
The slope b is the expected change in y for a one-unit increase in x; the intercept a is the predicted y when x = 0. In the example above, each additional point of Mother's IQ predicts 0.78 additional points of Child's IQ.

Error in estimation of b
The estimate of b will differ from sample to sample: there is sampling error in the estimate of b, so b is not equal to the population value of the slope (B). If we take many, many simple random samples and estimate b many, many times, the estimates form a sampling distribution centered on B; its standard deviation is the standard error of b.

Standard error of b
se(b) = s_e / √Σ(xᵢ − x̄)², where s_e is the standard deviation of the residuals.
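
A simulation sketch of this sampling variability, using an assumed population with true slope B = 2 and normal errors: the spread of the slope estimates across repeated samples matches the standard-error formula σ/√Σ(xᵢ − x̄)².

```python
# Draw many samples from y = 5 + 2x + N(0, 3), estimate b each time,
# and compare the spread of the estimates with the theoretical
# standard error of b (all population values are invented).
import math
import random

random.seed(1)
A_TRUE, B_TRUE, SIGMA = 5.0, 2.0, 3.0
xs = [float(x) for x in range(1, 21)]          # fixed design points
mx = sum(xs) / len(xs)
ssx = sum((x - mx) ** 2 for x in xs)

def fit_slope(ys):
    my = sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / ssx

slopes = []
for _ in range(2000):
    ys = [A_TRUE + B_TRUE * x + random.gauss(0, SIGMA) for x in xs]
    slopes.append(fit_slope(ys))

mean_b = sum(slopes) / len(slopes)
sd_b = math.sqrt(sum((s - mean_b) ** 2 for s in slopes) / (len(slopes) - 1))
theoretical_se = SIGMA / math.sqrt(ssx)
print(round(mean_b, 3), round(sd_b, 4), round(theoretical_se, 4))
```

The estimates cluster around the true slope B = 2, and their empirical standard deviation is close to the formula value.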

Partitioning the variance
Total variance = variance due to regression + error variance
R² = variance due to regression / total variance
1 − R² = sum of squared errors of regression / sum of squared deviations from the mean = error variance / total variance
1 = R² + (1 − R²) = proportion of variance due to regression + proportion of variance due to error
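
The decomposition can be verified numerically on invented data: for an ordinary least-squares fit with an intercept, the total sum of squares splits exactly into regression and error sums of squares.

```python
# Check SST = SSR + SSE and R^2 = SSR / SST on illustration data.
xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [1.2, 2.9, 3.1, 4.8, 6.1, 6.9]

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
     / sum((x - mx) ** 2 for x in xs))
a = my - b * mx
preds = [a + b * x for x in xs]

sst = sum((y - my) ** 2 for y in ys)                 # total variation
ssr = sum((p - my) ** 2 for p in preds)              # due to regression
sse = sum((y - p) ** 2 for y, p in zip(ys, preds))   # error

r2 = ssr / sst
print(round(r2, 4), abs(sst - (ssr + sse)) < 1e-9)   # identity holds
```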