STA291 Statistical Methods Lecture 11

2 Linear Association
o r measures "closeness" of data to the "best" line. What line is that? And best in what sense?
o In terms of least squared error: the line that makes Σ(y_i − ŷ_i)² as small as possible.

3 "Best" line: least-squares, or regression line
o Observed point: (x_i, y_i)
o Predicted value for a given x_i: ŷ_i = b_0 + b_1 x_i (interpretation in a minute)
o The "best" line minimizes SSE = Σ(y_i − ŷ_i)², the sum of the squared errors.
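The least-squares criterion can be made concrete with a tiny Python sketch. The data points here are made up for illustration (not from the lecture); the sketch just shows that the least-squares line beats a competing line on SSE:

```python
# SSE (sum of squared errors) for a candidate line y-hat = b0 + b1 * x.
data = [(1, 2.0), (2, 2.5), (3, 4.0)]  # illustrative (x, y) pairs

def sse(b0, b1):
    """Sum of squared vertical distances from the points to the line."""
    return sum((y - (b0 + b1 * x)) ** 2 for x, y in data)

# For these made-up points the least-squares line works out to roughly
# y-hat = 0.833 + 1.0 x; any competing line has a larger SSE.
print(sse(0.833, 1.0) < sse(0.0, 1.5))  # True
```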

4 Interpretation of b_0 and b_1
o b_0, the intercept: predicted value of y when x = 0.
o b_1, the slope: predicted change in y when x increases by 1.

5 Calculation of b_0 and b_1
o b_1 = r (s_y / s_x), where r is the correlation coefficient and s_x, s_y are the sample standard deviations of x and y,
o and b_0 = ȳ − b_1 x̄, where x̄ and ȳ are the sample means.

6 Least Squares, or Regression Line: Example
STA291 study time example: (Hours studied, Score on First Exam)
o Data: (1, 45), (5, 80), (12, 100)
o In summary: x̄ = 6, ȳ = 75, s_x ≈ 5.57, s_y ≈ 27.84, r ≈ 0.952
o b_1 = r (s_y / s_x) = 295/62 ≈ 4.758
o b_0 = ȳ − b_1 x̄ = 75 − (4.758)(6) ≈ 46.45
o Interpretation?
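As a quick check on the arithmetic, the coefficients can be computed in plain Python (no external libraries; the variable names are mine):

```python
# Least-squares fit for the study-time example: (hours studied, exam score).
data = [(1, 45), (5, 80), (12, 100)]
n = len(data)

x_bar = sum(x for x, _ in data) / n   # 6.0
y_bar = sum(y for _, y in data) / n   # 75.0

# b1 = S_xy / S_xx, which is algebraically the same as r * (s_y / s_x)
s_xy = sum((x - x_bar) * (y - y_bar) for x, y in data)  # 295.0
s_xx = sum((x - x_bar) ** 2 for x, _ in data)           # 62.0

b1 = s_xy / s_xx          # slope
b0 = y_bar - b1 * x_bar   # intercept

print(round(b1, 3), round(b0, 2))  # 4.758 46.45
```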

7 Properties of the Least Squares Line
o b_1, the slope, always has the same sign as r, the correlation coefficient, but they measure different things!
o The sum of the errors (or residuals), Σ e_i = Σ (y_i − ŷ_i), is always 0 (zero).
o The line always passes through the point (x̄, ȳ).

8 About those residuals
o When we use our prediction equation to "check" values we actually observed in our data set, we can find their residuals: the difference between the observed value and the predicted value, e_i = y_i − ŷ_i
o For our STA291 study data earlier, one observation was (5, 80). Our prediction equation was ŷ = 46.45 + 4.758x
o When we plug in x = 5, we get a predicted y of 70.24; our residual, then, is 80 − 70.24 = 9.76

9 Residuals
o Earlier, we pointed out that the sum of the residuals is always 0 (zero)
o Residuals are positive when the observed y is above the regression line; negative when it is below
o The smaller (in absolute value) an individual residual, the closer the predicted y was to the actual y.
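Both properties can be checked numerically for the study-time example. A short sketch (the coefficients are computed from the data, not assumed):

```python
# Residuals for the study-time example, using the least-squares fit.
data = [(1, 45), (5, 80), (12, 100)]
x_bar, y_bar = 6.0, 75.0
b1 = 295.0 / 62.0            # slope, S_xy / S_xx for these data
b0 = y_bar - b1 * x_bar      # intercept

residuals = [y - (b0 + b1 * x) for x, y in data]  # observed minus predicted

print([round(e, 2) for e in residuals])           # [-6.21, 9.76, -3.55]
print(abs(sum(residuals)) < 1e-9)                 # True: residuals sum to 0
print(abs((b0 + b1 * x_bar) - y_bar) < 1e-9)      # True: line passes through (x-bar, y-bar)
```

Note that the residual 9.76 at x = 5 matches the worked calculation on the previous slide.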

10 R-squared???
o r², the coefficient of determination, gives the proportion of the variation in the y's accounted for by the linear relationship with the x's
o So, what does this mean?
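One way to see it concretely: compute r² for the study-time example as 1 − SSE/SST, i.e. one minus the fraction of variation the line leaves unexplained (a sketch in plain Python; the variable names are mine):

```python
# r-squared = proportion of variation in y explained by the fitted line.
data = [(1, 45), (5, 80), (12, 100)]
b1 = 295.0 / 62.0        # slope from the worked example
b0 = 75.0 - b1 * 6.0     # intercept
y_bar = 75.0

sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in data)  # unexplained variation
sst = sum((y - y_bar) ** 2 for _, y in data)          # total variation: 1550
r_squared = 1.0 - sse / sst

print(round(r_squared, 3))  # 0.906: about 90.6% of the score variation
```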

11 Why "regression"?
o Sir Francis Galton (1880s): the correlation between x = father's height and y = son's height is about 0.5
o Interpretation: if a father has height one standard deviation below average, then the predicted height of the son is 0.5 standard deviations below average
o More interpretation: if a father has height two standard deviations above average, then the predicted height of the son is 0.5 × 2 = 1 standard deviation above average
o Tall parents tend to have tall children, but typically not as tall
o This is called "regression toward the mean", and it is the origin of the statistical term "regression"
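Galton's interpretation is easy to express in standardized (z-score) units: the predicted z-score of the son is r times the z-score of the father. A tiny sketch (the function name is mine):

```python
# Regression toward the mean in standardized units.
# With correlation r, the predicted z-score of y is r times the z-score of x.
R_HEIGHTS = 0.5  # Galton's approximate father/son height correlation

def predicted_z(father_z, r=R_HEIGHTS):
    """Predicted son's height in SD units, given the father's height in SD units."""
    return r * father_z

print(predicted_z(-1.0))  # -0.5: father 1 SD below average
print(predicted_z(2.0))   # 1.0: father 2 SD above average
```

Since |r| < 1, the prediction is always closer to the mean (in SD units) than the input, which is exactly the "regression" Galton observed.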

Looking back
o Best-fit, or least-squares, or regression line
o Interpretation of the slope and intercept
o Residuals
o R-squared
o "Regression toward the mean"