Linear Regression Analysis 5E, Montgomery, Peck & Vining
3.6 Hidden Extrapolation in Multiple Regression

Presentation transcript:

Hidden Extrapolation in Multiple Regression
In prediction, exercise care about potentially extrapolating beyond the region containing the original observations. Figure 3.10 shows an example of extrapolation in multiple regression.

Hidden Extrapolation in Multiple Regression
We define the smallest convex set containing all n original data points (x_{i1}, x_{i2}, ..., x_{ik}), i = 1, 2, ..., n, as the regressor variable hull (RVH). If a point (x_{01}, x_{02}, ..., x_{0k}) lies inside or on the boundary of the RVH, prediction or estimation involves interpolation; if the point lies outside the RVH, extrapolation is required.

Hidden Extrapolation in Multiple Regression
The diagonal elements of the hat matrix H = X(X'X)^{-1}X' can help determine whether hidden extrapolation exists. Let h_max denote the largest diagonal element of H. The set of points x (not necessarily data points used to fit the model) that satisfy

x'(X'X)^{-1} x <= h_max

is an ellipsoid enclosing all points inside the RVH.
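As a quick sketch of this computation (using NumPy and a small made-up model matrix, not data from the book), the leverage values h_ii and h_max come straight from the diagonal of H:

```python
import numpy as np

# Hypothetical model matrix: intercept plus two regressors (n = 8, p = 3)
X = np.array([
    [1.0,  2.0, 1.0],
    [1.0,  3.0, 2.0],
    [1.0,  5.0, 2.5],
    [1.0,  6.0, 4.0],
    [1.0,  7.0, 3.5],
    [1.0,  8.0, 5.0],
    [1.0,  9.0, 4.5],
    [1.0, 10.0, 6.0],
])

# Only the diagonal of H = X (X'X)^{-1} X' is needed:
# h_ii = x_i' (X'X)^{-1} x_i for each row x_i of X
XtX_inv = np.linalg.inv(X.T @ X)
h_diag = np.einsum('ij,jk,ik->i', X, XtX_inv, X)
h_max = h_diag.max()

print(h_diag.round(3), round(float(h_max), 3))
```

Computing only the diagonal this way avoids forming the full n-by-n matrix H; a useful sanity check is that the leverages sum to p (the number of model parameters).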

Hidden Extrapolation in Multiple Regression
Let x_0 be a point at which prediction or estimation is of interest, and define

h_00 = x_0'(X'X)^{-1} x_0.

If h_00 > h_max, the point x_0 lies outside the ellipsoid enclosing the RVH, and prediction or estimation there requires extrapolation.
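The full check can be sketched as a small helper (a minimal illustration with NumPy and randomly generated regressors, not the book's delivery time data):

```python
import numpy as np

def hidden_extrapolation(X, x0):
    """Return (h00, h_max, is_extrapolation) for a candidate point x0.

    X  : (n, p) model matrix, intercept column included.
    x0 : (p,)   candidate point, laid out like a row of X.
    """
    XtX_inv = np.linalg.inv(X.T @ X)
    # Diagonal of H = X (X'X)^{-1} X'
    h_diag = np.einsum('ij,jk,ik->i', X, XtX_inv, X)
    h_max = h_diag.max()
    h00 = x0 @ XtX_inv @ x0  # h_00 = x0'(X'X)^{-1} x0
    return h00, h_max, h00 > h_max

# Hypothetical regressors spanning roughly [0, 10] x [0, 5]
rng = np.random.default_rng(42)
X = np.column_stack([np.ones(25),
                     rng.uniform(0, 10, 25),
                     rng.uniform(0, 5, 25)])

inside = hidden_extrapolation(X, np.array([1.0, 5.0, 2.5]))    # near the centroid
outside = hidden_extrapolation(X, np.array([1.0, 40.0, 40.0]))  # far outside the data
```

A point far outside the observed regressor ranges, such as (40, 40) here, yields h_00 well above h_max and is flagged as extrapolation.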

Example 3.13
Consider prediction or estimation at:

[Figure 3.10: Scatterplot of cases and distance for the delivery time data, with observation #9 and the candidate points a, b, c, and d marked.]