Time Series Analysis – Chapter 6 Odds and Ends

Units Conversions When variables are rescaled (units are changed), the coefficients, standard errors, confidence intervals, t statistics, and F statistics change in ways that preserve all measured effects and testing outcomes.
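A quick numerical check of this claim, using simulated data (not one of the course datasets) and a hand-rolled OLS fit in numpy: rescaling the predictor changes its coefficient by the reciprocal factor but leaves the t statistic untouched.

```python
import numpy as np

def ols_t_stat(x, y):
    """Fit y = b0 + b1*x by least squares; return (b1, t statistic for b1)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n, p = X.shape
    sigma2 = resid @ resid / (n - p)          # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)     # variance-covariance of coefficients
    return beta[1], beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 40)                    # simulated predictor
y = 3 + 2 * x + rng.normal(0, 1, 40)          # simulated response

b1, t1 = ols_t_stat(x, y)                     # x in original units
b1k, t1k = ols_t_stat(x / 1000, y)            # x rescaled, e.g. dollars -> thousands

print(round(b1k / b1))                        # coefficient scales by 1000
print(np.isclose(t1, t1k))                    # t statistic unchanged -> True
```

The same invariance holds for P values, F statistics, and R-Sq; only coefficients, standard errors, and interval endpoints rescale.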

Beta Coefficients A z-score is z = (x − x̄)/s, where x̄ is the sample mean and s is the sample standard deviation. The data for each regression variable (independent and dependent) are converted to z-scores, and the regression is then run on the standardized values. The resulting coefficients are called beta coefficients.

Beta Coefficients – Example Use the data set 4th Graders Feet. Regress foot length on foot width.

The regression equation is
Foot Length = 7.82 + 1.66 Foot Width

Predictor    Coef     SE Coef      T      P
Constant     7.817    2.938      2.66  0.011
Foot Width   1.6576   0.3262     5.08  0.000

S = 1.02477   R-Sq = 41.1%   R-Sq(adj) = 39.5%

Beta Coefficients – Example Use the data set 4th Graders Feet. Regress the z-score of foot length on the z-score of foot width.

The regression equation is
zFoot Length = -0.000 + 0.641 zFoot Width

Predictor     Coef      SE Coef       T      P
Constant     -0.0000    0.1245     -0.00  1.000
zFoot Width   0.6411    0.1262      5.08  0.000

S = 0.777763   R-Sq = 41.1%   R-Sq(adj) = 39.5%

Note that the t statistic, P value, and R-Sq match the unstandardized regression; only the coefficients change.
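The pattern above can be reproduced with simulated data (standing in for the 4th Graders Feet dataset, which is not included here): in simple regression, the beta coefficient from the z-score regression equals Pearson's r, and also equals the raw slope times s_x/s_y.

```python
import numpy as np

rng = np.random.default_rng(1)
width = rng.normal(9, 0.6, 39)                       # simulated foot widths
length = 7.8 + 1.66 * width + rng.normal(0, 1, 39)   # simulated foot lengths

def zscore(v):
    return (v - v.mean()) / v.std(ddof=1)

b1 = np.polyfit(width, length, 1)[0]                 # slope of the raw regression
beta = np.polyfit(zscore(width), zscore(length), 1)[0]  # beta coefficient

# In simple regression the beta coefficient equals Pearson's r,
# and also b1 * (s_x / s_y).
r = np.corrcoef(width, length)[0, 1]
print(np.isclose(beta, r))                                              # True
print(np.isclose(beta, b1 * width.std(ddof=1) / length.std(ddof=1)))    # True
```

This also explains why R-Sq is identical in the two Minitab runs: R-Sq = r², and standardizing does not change r.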

Using the Log of a Variable Taking the log usually narrows the range of a variable – this can result in estimates that are less sensitive to outliers.

Using the Log of a Variable When a variable is a positive dollar amount, the log is usually taken. When a variable takes large integer values, the log is usually taken as well: population, total number of employees, school enrollment, etc.

Using the Log of a Variable Variables measured in years, such as education, experience, or age, are usually left in their original form.

Using the Log of a Variable Proportions or percentages are usually left in original form because the coefficients are easier to interpret – they have a percentage-point change interpretation.
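A small illustration of why the log narrows the range, using hypothetical dollar amounts (not one of the course datasets): a single large value dominates the raw scale but is mild on the log scale.

```python
import numpy as np

# Hypothetical right-skewed salaries, including one extreme value
salary = np.array([28_000, 35_000, 41_000, 52_000, 60_000, 75_000, 1_200_000])

log_salary = np.log(salary)

# On the raw scale the outlier is over 20x the median;
# on the log scale its ratio to the median is well under 2.
print(salary.max() / np.median(salary))
print(log_salary.max() / np.median(log_salary))
```

In a regression, that compression is what makes slope estimates less sensitive to the extreme observation.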

Modeling a Quadratic Effect Consider the quadratic effect dataset. Want to predict Millions of retained impressions per week. Predictor is TV advertising budget, 1983 ($ millions). Model: mil = β0 + β1·spend + u

Consider the quadratic effect dataset. Want to predict Millions of retained impressions per week. Predictor is TV advertising budget, 1983 ($ millions).

The regression equation is
MIL = 22.2 + 0.363 SPEND

Predictor   Coef      SE Coef      T      P
Constant    22.163    7.089      3.13  0.006
SPEND       0.36317   0.09712    3.74  0.001

S = 23.5015   R-Sq = 42.4%   R-Sq(adj) = 39.4%

Did you check your residual plots?

Scatterplot – there is a quadratic effect too!

Modeling a Quadratic Effect Consider the quadratic effect dataset. Want to predict Millions of retained impressions per week. Predictor is TV advertising budget, 1983 ($ millions). Add the quadratic effect to the model. Model: mil = β0 + β1·spend + β2·spend² + u

Model: mil = β0 + β1·spend + β2·spend² + u

The regression equation is
MIL = 7.06 + 1.08 SPEND - 0.00399 SPEND SQUARED

Predictor       Coef        SE Coef       T      P
Constant        7.059       9.986       0.71  0.489
SPEND           1.0847      0.3699      2.93  0.009
SPEND SQUARED  -0.003990    0.001984   -2.01  0.060

S = 21.8185   R-Sq = 53.0%   R-Sq(adj) = 47.7%
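The same two-step fit can be sketched in numpy with simulated data (the actual advertising dataset is not reproduced here); adding the squared term to a nested model can only raise R-Sq, which is why the residual plot, not R-Sq alone, motivates the quadratic term.

```python
import numpy as np

rng = np.random.default_rng(2)
spend = rng.uniform(1, 180, 21)                                  # simulated budgets
mil = 7 + 1.08 * spend - 0.004 * spend**2 + rng.normal(0, 20, 21)  # simulated response

def r_squared(X, y):
    """R-Sq for a least-squares fit of y on the columns of X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

ones = np.ones_like(spend)
r2_linear = r_squared(np.column_stack([ones, spend]), mil)
r2_quad = r_squared(np.column_stack([ones, spend, spend**2]), mil)

print(r2_quad >= r2_linear)   # adding spend^2 never lowers R-Sq -> True
```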

Did you check your residual plots?

Modeling a Quadratic Effect The interpretation of the quadratic coefficient (β2) depends on whether the linear coefficient (β1) is positive or negative.

The graph above on the left shows an equation with a positive linear term, to set the frame of reference. When the quadratic term is also positive, the net effect is a greater-than-linear increase (middle graph). The interesting case is a negative quadratic term (right graph): the linear and quadratic terms compete with one another. The increase is less than linear because the quadratic term exerts a downward force on the equation. Eventually the trend levels off and heads downward, although in some situations the point where the equation levels off lies beyond the maximum of the data.

Quadratic Effect Example Consider the dataset MILEAGE (on my website) Create a model to predict MPG

More on R2 R2 does not indicate whether:
– the independent variables are a true cause of the changes in the dependent variable
– omitted-variable bias exists
– the correct regression was used
– the most appropriate set of independent variables has been chosen
– there is collinearity present in the data on the explanatory variables
– the model might be improved by using transformed versions of the existing set of independent variables

More on R2 But, R2 has an easy interpretation: the percent of variability in the dependent variable explained by the regression.

Adjusted R2 Modification of R2 that adjusts for the number of explanatory terms in the model. Adjusted R2 increases only if a new term improves the model sufficiently, so adjusted R2 can rise or fall when a term is added. Definition:

Adjusted R2 = 1 − (1 − R2)(n − 1)/(n − p − 1)

where n is the sample size and p is the total number of predictors in the model.
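The definition above is a one-liner in code. The illustrative numbers below are made up (not from a course dataset) and chosen to show that a new term which raises R2 only slightly can still lower the adjusted value.

```python
def adjusted_r2(r2, n, p):
    """Adjusted R^2 for n observations and p predictors (excluding the intercept)."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# Adding a predictor raises R2 from 0.530 to 0.533,
# yet adjusted R2 falls because of the extra term.
print(round(adjusted_r2(0.530, 30, 2), 3))   # 0.495
print(round(adjusted_r2(0.533, 30, 3), 3))   # 0.479
```

This is the behavior to look for in the MILEAGE exercise: compare adjusted R2 before and after adding VOL.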

Adjusted R2 – Example Use the MILEAGE data set. Regress MPG on HP, WT, SP. What are the R2 and the adjusted R2? Now, regress MPG on HP, WT, SP, and VOL.

Prediction Intervals Use the MILEAGE data set. Regress MPG on HP. We want a prediction of MPG at HP = 200. Minitab gives:

New Obs   Fit      SE Fit   95% CI              95% PI
1         22.261   1.210    (19.853, 24.670)    (9.741, 34.782)

Prediction Intervals Difference between the 95% CI and the 95% PI:
Confidence interval of the prediction – represents a range that the mean response is likely to fall in, given specified settings of the predictors.
Prediction interval – represents a range that a single new observation is likely to fall in, given specified settings of the predictors.

New Obs   Fit      SE Fit   95% CI              95% PI
1         22.261   1.210    (19.853, 24.670)    (9.741, 34.782)
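The two standard errors behind those intervals differ by exactly the residual variance, which is why the PI is always wider than the CI. A sketch with simulated horsepower/MPG data (standing in for the MILEAGE dataset) and a hand-computed fit:

```python
import numpy as np

rng = np.random.default_rng(3)
hp = rng.uniform(50, 300, 80)                    # simulated horsepower
mpg = 45 - 0.11 * hp + rng.normal(0, 3, 80)      # simulated MPG

X = np.column_stack([np.ones_like(hp), hp])
beta, *_ = np.linalg.lstsq(X, mpg, rcond=None)
resid = mpg - X @ beta
n = len(hp)
s2 = resid @ resid / (n - 2)                     # residual variance

x0 = np.array([1.0, 200.0])                      # predict at HP = 200
fit = x0 @ beta
leverage = x0 @ np.linalg.inv(X.T @ X) @ x0

se_mean = np.sqrt(s2 * leverage)          # SE of the mean response (CI)
se_pred = np.sqrt(s2 * (1 + leverage))    # SE of a new observation (PI)

t = 1.99                                  # approx. 97.5th t percentile, 78 df
ci = (fit - t * se_mean, fit + t * se_mean)
pi = (fit - t * se_pred, fit + t * se_pred)
print(se_pred > se_mean)                  # the PI is always wider -> True
```

The extra "1 +" in se_pred accounts for the scatter of an individual observation around the mean response.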

Prediction Intervals

Prediction Intervals The model has its best predictive properties – the narrowest interval – at the means of the predictors. Predict MPG from HP at the mean of HP.