Bivariate Linear Regression. Linear Function Y = a + bX + e.


Bivariate Linear Regression

Linear Function Y = a + bX + e

Sources of Error Error in the measurement of X and/or Y, or in the manipulation of X. The influence upon Y of variables other than X (extraneous variables), including variables that interact with X. Any nonlinear influence of X upon Y.

The Regression Line r² < 1 → predicted Y regresses toward mean Y. Least Squares Criterion: choose a and b to minimize SSE = Σ(Yᵢ − Ŷᵢ)².
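A minimal sketch of the least-squares fit, using hypothetical beer/burger numbers (the slide's actual data are not reproduced here):

```python
import numpy as np

# Hypothetical data; the actual beer/burger numbers are not shown in this transcript
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # burgers
Y = np.array([2.0, 4.0, 5.0, 4.0, 6.0])  # beers

# Least-squares estimates: b = S_xy / S_xx, a = mean(Y) - b * mean(X)
b = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
a = Y.mean() - b * X.mean()
Yhat = a + b * X

# This line minimizes the sum of squared errors
SSE = np.sum((Y - Yhat) ** 2)
```

Any other line through these points yields a larger SSE; that is the least-squares criterion.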

Our Beer and Burger Data

Pearson r is a Standardized Slope Pearson r is the number of standard deviations that predicted Y changes for each one standard deviation change in X.
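The standardized-slope identity can be checked numerically (hypothetical data again): the unstandardized slope is b = r·(s_Y / s_X), and regressing z-scores on z-scores gives a slope equal to r itself.

```python
import numpy as np

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # hypothetical burgers
Y = np.array([2.0, 4.0, 5.0, 4.0, 6.0])  # hypothetical beers

r = np.corrcoef(X, Y)[0, 1]
b = r * Y.std(ddof=1) / X.std(ddof=1)    # unstandardized slope, b = r * (s_Y / s_X)

# On z-scores, the least-squares slope IS r
zX = (X - X.mean()) / X.std(ddof=1)
zY = (Y - Y.mean()) / Y.std(ddof=1)
b_z = np.sum(zX * zY) / np.sum(zX ** 2)
```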

Error Variance What is SSE if r = 0? (Then Ŷ = mean Y for every case, so SSE = SS_total.) If r² > 0, we can do better than just predicting that Yᵢ is mean Y.

Standard Error of Estimate SEE = √(SSE / (n − 2)), which gets us back to the original units of measurement.
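A sketch with the same hypothetical data: dividing SSE by its degrees of freedom (n − 2) gives the error variance, and taking the square root returns to Y's original units.

```python
import numpy as np

# Hypothetical data, fit as before
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.0, 4.0, 5.0, 4.0, 6.0])
n = len(X)
b = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
a = Y.mean() - b * X.mean()
SSE = np.sum((Y - (a + b * X)) ** 2)

# Standard error of estimate: sqrt of error variance, in Y's units
SEE = np.sqrt(SSE / (n - 2))
```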

Regression Variance Variance in Y "due to" X: MS_regression = SS_regression / p, where p is the number of predictors. p = 1 for bivariate regression.

Coefficient of Determination The proportion of variance in Y explained by the linear model, r².

Coefficient of Alienation The proportion of variance in Y that is not explained by the linear model, 1 − r².
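Both coefficients can be computed from the sums of squares (hypothetical data again); determination and alienation partition the variance in Y and must sum to 1.

```python
import numpy as np

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # hypothetical
Y = np.array([2.0, 4.0, 5.0, 4.0, 6.0])
b = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
a = Y.mean() - b * X.mean()

SS_total = np.sum((Y - Y.mean()) ** 2)
SSE = np.sum((Y - (a + b * X)) ** 2)

r2 = 1 - SSE / SS_total       # coefficient of determination
alienation = SSE / SS_total   # coefficient of alienation, 1 - r^2
```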

Testing Hypotheses H₀: b = 0. F = t². One-tailed p from F = two-tailed p from t.
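Using the slides' own values (r = .8, N = 5), the equivalence of the t and F tests can be verified directly:

```python
import numpy as np
from scipy import stats

r, n = 0.8, 5                                   # values from the slides
t = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)    # t test of H0: b = 0
F = r ** 2 * (n - 2) / (1 - r ** 2)             # equivalent F test, df = (1, n - 2)

p_t = 2 * stats.t.sf(abs(t), n - 2)  # two-tailed p from t
p_F = stats.f.sf(F, 1, n - 2)        # one-tailed (upper) p from F
```

F comes out to the 5.33 reported on the summary slide, and the two p values are identical.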

Source Table MS total is nothing more than the sample variance of the dependent variable Y. It is usually omitted from the table.
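That MS_total is just the sample variance of Y is easy to confirm (hypothetical Y values):

```python
import numpy as np

Y = np.array([2.0, 4.0, 5.0, 4.0, 6.0])  # hypothetical beers

# MS_total = SS_total / df_total, with df_total = n - 1
MS_total = np.sum((Y - Y.mean()) ** 2) / (len(Y) - 1)
```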

Power Analysis Using Steiger & Fouladi’s R2.exe

Power = 13%

G*Power = 15%
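A hedged sketch of this power computation using scipy's noncentral F distribution. Here λ = f²·N with f² = ρ²/(1 − ρ²) is an assumed noncentrality convention (the one G*Power uses by default); Steiger & Fouladi's R2.exe uses a slightly different convention, which is why the slides report 13% versus 15%.

```python
from scipy import stats

rho, n = 0.5, 5                  # large effect, tiny sample, per the slides
f2 = rho ** 2 / (1 - rho ** 2)   # Cohen's f-squared
ncp = f2 * n                     # assumed noncentrality convention (as in G*Power)

F_crit = stats.f.ppf(0.95, 1, n - 2)         # .05 criterion, df = (1, n - 2)
power = stats.ncf.sf(F_crit, 1, n - 2, ncp)  # P(reject H0 | rho = .5)
```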

Summary Statement The linear regression between my friends' burger consumption and their beer consumption fell short of statistical significance, r = .8, beers = burgers, F(1, 3) = 5.33, p = .10. The most likely explanation of the nonsignificant result is that it represents a Type II error. Given our small sample size (N = 5), power was only 13% even for a large effect (ρ = .5).

The Regression Line is Similar to a Mean

Increase n to 10 Same value of F. r² = SS_regression ÷ SS_total = 25.6/64.0 = .4 (down from .64).
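The slide's arithmetic, and the claim that F is unchanged at 5.33, check out:

```python
r2 = 25.6 / 64.0                 # SS_regression / SS_total, from the slide
n = 10

# F = MS_regression / MS_error with df = (1, n - 2)
F = r2 / ((1 - r2) / (n - 2))
```

With n = 10, r² drops to .4 yet F is the same 5.33 as in the N = 5 analysis, because the error degrees of freedom grew from 3 to 8.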

Power Analysis

Power = 33%

G*Power = 36%

Increase n to 10 A .05 criterion of statistical significance was employed for all tests. An a priori power analysis indicated that my sample size (N = 10) would yield power of only 33% even for a large effect (ρ = .5). Despite this low power, the analysis yielded a statistically significant result. Among my friends, beer consumption increased significantly with burger consumption, r = .632, beers = burgers, F(1, 8) = 5.33, p = .05.

Testing Directional Hypotheses H₀: b ≤ 0, H₁: b > 0. For F, one-tailed p = .05 → half-tailed p = .025. P(A ∩ B) = P(A)P(B) = .5(.05) = .025
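The half-tailed logic can be verified with the slides' r = .8, N = 5: the one-tailed p from t equals half of F's one-tailed p whenever the slope is in the predicted direction.

```python
import numpy as np
from scipy import stats

r, n = 0.8, 5                                  # values from the slides
t = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)   # positive, as H1 predicts

p_one = stats.t.sf(t, n - 2)               # one-tailed p for H1: b > 0
p_half = stats.f.sf(t ** 2, 1, n - 2) / 2  # "half-tailed": halve F's one-tailed p
```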

Assumptions To test H₀: b = 0 or construct a CI: homoscedasticity across Y|X, normality of Y|X, and normality of Y ignoring X. No assumptions about X. No assumptions at all for descriptive statistics (not using t or F).

Placing Confidence Limits on Predicted Values of Mean Y|X To predict the mean value of Y for all subjects who have some particular score on X:

Placing Confidence Limits on Predicted Values of Individual Y|X
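Both kinds of limits can be sketched with the hypothetical data. The only difference is the extra 1 under the square root for an individual score, and both standard errors grow as x₀ moves away from mean X, which is what bows the bands.

```python
import numpy as np
from scipy import stats

# Hypothetical data, fit as before
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.0, 4.0, 5.0, 4.0, 6.0])
n = len(X)
b = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
a = Y.mean() - b * X.mean()
SEE = np.sqrt(np.sum((Y - (a + b * X)) ** 2) / (n - 2))
Sxx = np.sum((X - X.mean()) ** 2)
t_crit = stats.t.ppf(0.975, n - 2)   # for 95% limits

x0 = 4.0                             # X value at which we predict
yhat = a + b * x0
# Standard error for the MEAN of Y at X = x0
se_mean = SEE * np.sqrt(1 / n + (x0 - X.mean()) ** 2 / Sxx)
# Standard error for an INDIVIDUAL Y at X = x0 (note the extra 1)
se_ind = SEE * np.sqrt(1 + 1 / n + (x0 - X.mean()) ** 2 / Sxx)

ci_mean = (yhat - t_crit * se_mean, yhat + t_crit * se_mean)
ci_ind = (yhat - t_crit * se_ind, yhat + t_crit * se_ind)
```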

Bowed Confidence Intervals

Testing Other Hypotheses Is the correlation between X and Y the same in one population as in another? Is the slope for predicting Y from X the same in one population as in another? Is the intercept for predicting Y from X the same in one population as in another?

Two populations can differ on r but not slope, or on slope but not r.
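A small demonstration that two data sets can share a slope yet differ in r (hypothetical numbers, with the noise deliberately constructed orthogonal to the X deviations so the fitted slope is unchanged):

```python
import numpy as np

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Ya = 2 * X                                           # perfect linear relation
Yb = 2 * X + np.array([1.0, -2.0, 2.0, -2.0, 1.0])   # same trend, more error variance

def slope_and_r(x, y):
    # least-squares slope and Pearson r
    sxy = np.sum((x - x.mean()) * (y - y.mean()))
    sxx = np.sum((x - x.mean()) ** 2)
    return sxy / sxx, np.corrcoef(x, y)[0, 1]

b_a, r_a = slope_and_r(X, Ya)
b_b, r_b = slope_and_r(X, Yb)
```

Both slopes equal 2, but the noisier data set has the smaller r, because r reflects the error variance around the line while the slope does not.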