1
CSC 323, Quarter: Winter 02/03
Daniela Stan Raicu
School of CTI, DePaul University
2
Outline
Chapter 2: Looking at Data – Relationships between two or more variables
- Linear regression
- Least-squares regression line
- Residual analysis
- Cautions about regression and correlation
- SAS procedures for scatterplots, correlation and regression
3
Linear Regression
Objective: to quantify the linear relationship between an explanatory variable and a response variable by fitting a line to the data (that is, drawing a line that comes as close as possible to the points).
[Figure: example scatterplot with a fitted regression line]
4
Linear Regression
A regression line is a straight line that describes how a response variable y changes as an explanatory variable x changes.
Linear regression equation: ŷ = a + b·x
b = slope (rate of change)
a = intercept (the value of ŷ when x = 0)
Example: Height = a + b·Age
5
Prediction
Use of regression: to predict the value of y for any value of x by substituting this x into the equation of the regression line.
Example: Prediction via the regression line (husband and wife ages)
The regression equation is ŷ = 3.6 + 0.97·x, where ŷ is the average age of all husbands who have wives of age x.
For all women aged 30, we predict the average husband age to be 32.7 years: 3.6 + (0.97)(30) = 32.7 years.
Suppose we know that an individual wife's age is 30. What would we predict her husband's age to be?
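A minimal sketch (not part of the original slides) of making this prediction in code, using the fitted equation ŷ = 3.6 + 0.97·x from the slide:

```python
# Predict the average husband age from wife age using the slide's fitted line
# y_hat = 3.6 + 0.97 * x.

def predict_husband_age(wife_age):
    """Predicted average husband age for wives of the given age."""
    return 3.6 + 0.97 * wife_age

print(predict_husband_age(30))  # 32.7, matching the worked example
```

For an individual wife aged 30, the regression gives the same point prediction (32.7 years), but with more uncertainty than the prediction of the average for all wives aged 30.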
6
Least-Squares Regression
Used to determine the "best" line; we want the line to be as close as possible to the data points in the vertical (y) direction, since that is what we are trying to predict.
The least-squares regression line of y on x is the line that makes the sum of the squares of the vertical distances of the data points from the line as small as possible.
A residual is the difference between an observed value of the response variable y and the value predicted by the regression line.
[Figure: a data point's observed value y, its predicted value ŷ on the line, and the vertical error (residual) between them]
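An illustrative sketch (not from the slides; the data values are made up) of computing residuals and the sum of squared errors that least squares minimizes:

```python
import numpy as np

# Hypothetical data (illustrative only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])

# A candidate line y_hat = a + b*x
a, b = 1.0, 1.0
y_hat = a + b * x

residuals = y - y_hat          # observed minus predicted (vertical distances)
sse = np.sum(residuals ** 2)   # the quantity the least-squares line minimizes
print(residuals, sse)
```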
7
Least-Squares Regression
The regression line makes the prediction errors as small as possible.
8
Least-Squares Regression (cont.)
How is the least-squares regression line calculated?
Predicted value: ŷ = a + b·x
Slope: b = r · (s_y / s_x)
Intercept: a = (mean of y) - b · (mean of x)
where r = correlation and s_x, s_y = standard deviations of x and y.
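A minimal sketch (made-up data, assuming numpy) that computes the slope and intercept from these formulas and cross-checks them against numpy's least-squares fit:

```python
import numpy as np

# Hypothetical data (illustrative only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])

r = np.corrcoef(x, y)[0, 1]             # correlation
b = r * y.std(ddof=1) / x.std(ddof=1)   # slope: b = r * s_y / s_x
a = y.mean() - b * x.mean()             # intercept: a = mean(y) - b * mean(x)

# Cross-check with numpy's direct least-squares fit (slope first, then intercept)
b_np, a_np = np.polyfit(x, y, 1)
print(a, b)
print(a_np, b_np)  # should agree with the formula-based values
```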
9
Coefficient of Determination (R²)
Measures the usefulness of the regression prediction.
R² (or r², the square of the correlation) measures how much of the variation in the values of the response variable (y) is explained by the regression line.
Examples:
r = 1: R² = 1: the regression line explains all (100%) of the variation in y.
r = 0.7: R² = 0.49: the regression line explains about half (49%) of the variation in y.
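An illustrative sketch (made-up data, assuming numpy) showing that R² is both the squared correlation and the fraction of variation in y explained by the least-squares line:

```python
import numpy as np

# Hypothetical data (illustrative only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])

r = np.corrcoef(x, y)[0, 1]
b, a = np.polyfit(x, y, 1)          # least-squares slope and intercept
y_hat = a + b * x

sse = np.sum((y - y_hat) ** 2)      # variation left unexplained by the line
sst = np.sum((y - y.mean()) ** 2)   # total variation in y

print(r ** 2)         # R^2 as the squared correlation
print(1 - sse / sst)  # R^2 as the fraction of variation explained (same value)
```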
10
A Caution: Beware of Extrapolation
Extrapolation is the use of a regression line for prediction outside the range of values of the explanatory variable x that you used to obtain the line. Such predictions are often not accurate.
Example: Sarah's height was plotted against her age.
Can you predict her height at age 42 months?
Can you predict her height at age 30 years (360 months)?
11
A Caution: Beware of Extrapolation
Regression line: ŷ = 71.95 + 0.383·x (height in cm, age in months)
Height at age 42 months? ŷ ≈ 88 cm
Height at age 30 years (360 months)? ŷ ≈ 209.8 cm
She is predicted to be 6' 10.5" at age 30, an implausible result of extrapolating far beyond the observed ages.
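A small sketch reproducing these two predictions; the coefficients are reconstructed above from the slide's two worked values, not given explicitly in the source:

```python
# Sarah's height example: coefficients reconstructed from the slide's predictions.
# Heights in cm, ages in months.
a, b = 71.95, 0.383

def predicted_height(age_months):
    return a + b * age_months

print(predicted_height(42))   # ~88 cm: inside the observed age range, plausible
print(predicted_height(360))  # ~209.8 cm (about 6'10.5"): extrapolation, not plausible
```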
12
Accuracy of the Predictions
One possible measure of the accuracy of the regression predictions is the root mean square error (r.m.s. error). The r.m.s. error is defined as the square root of the average of the squared residuals:
r.m.s. error = sqrt( (1/n) * Σ (y_i - ŷ_i)² )
In large data sets, the r.m.s. error is approximately equal to sqrt(1 - r²) · s_y.
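An illustrative sketch (made-up data, assuming numpy) computing the r.m.s. error of the least-squares predictions and comparing it with sqrt(1 - r²) · s_y:

```python
import numpy as np

# Hypothetical data (illustrative only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1, 6.9])

b, a = np.polyfit(x, y, 1)                      # least-squares fit
y_hat = a + b * x

rms_error = np.sqrt(np.mean((y - y_hat) ** 2))  # sqrt of the average squared residual

r = np.corrcoef(x, y)[0, 1]
approx = np.sqrt(1 - r ** 2) * y.std()          # population SD of y; for the
                                                # least-squares line this matches rms_error

print(rms_error, approx)
```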
13
Confounding Factor
A confounding factor is a variable that has an important effect on the relationship among the variables in a study but is not included in the study.
Example: The mathematics department of a large university must plan the timetable for the following year. Data are collected on the enrollment year, the number x of first-year students, and the number y of students enrolled in elementary math courses. A least-squares line is fitted to the data, with R² = 0.694.
14
Influential Point
An observation is influential for the regression line if removing it would change the fitted line considerably. An influential point pulls the regression line towards itself.
[Figure: scatterplot showing an influential point/outlier, the fitted regression line, and the regression line that would result if the influential point were omitted]
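An illustrative sketch (made-up data, assuming numpy) comparing the fitted line with and without a single influential point:

```python
import numpy as np

# Hypothetical data (illustrative only): a linear cluster plus one influential point
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 10.0])
y = np.array([2.0, 3.1, 3.9, 5.2, 6.1, 1.0])   # the last point lies far from the pattern

b_all, a_all = np.polyfit(x, y, 1)              # fit including the influential point
b_omit, a_omit = np.polyfit(x[:-1], y[:-1], 1)  # fit with that point omitted

print(b_all)   # slope pulled down (here even negative) by the influential point
print(b_omit)  # slope close to 1, following the rest of the data
```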
15
Summary - Warnings
Correlation measures linear association; the regression line should be used only when the association is linear.
Extrapolation: do not use the regression line to predict values outside the observed range; such predictions are not reliable.
Correlation and the regression line are sensitive to influential/extreme points.
16
Data Mining
Exploring really large databases in the hope of finding useful patterns is called data mining.
[Figure: the data mining process, with stages including Domain Understanding, Data Selection, Cleaning & Preprocessing, Discovering Patterns, and Evaluation & Interpretation of the resulting Knowledge]
The entire process is iterative and interactive.