Time Series Analysis – Chapter 6 Odds and Ends


1 Time Series Analysis – Chapter 6 Odds and Ends

2 Units Conversions When variables are rescaled (units are changed), the coefficients, standard errors, confidence intervals, t statistics, and F statistics change in ways that preserve all measured effects and testing outcomes.
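A minimal sketch of this point, using made-up data rather than the slides' data set (numpy and statsmodels assumed available): dividing a predictor by 1,000 rescales its coefficient and standard error by the same factor, so the t statistic, p-value, and R2 are unchanged.

```python
# Sketch with simulated data: rescaling a predictor rescales its coefficient and
# standard error together, so t statistics, p-values, and R-squared do not change.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x_dollars = rng.uniform(10, 100, size=200)                  # predictor in dollars
y = 5 + 0.3 * x_dollars + rng.normal(scale=2, size=200)     # response

x_thousands = x_dollars / 1000                              # same variable, new units

fit_dollars = sm.OLS(y, sm.add_constant(x_dollars)).fit()
fit_thousands = sm.OLS(y, sm.add_constant(x_thousands)).fit()

print(fit_dollars.params[1], fit_thousands.params[1])    # slope multiplied by 1,000
print(fit_dollars.tvalues[1], fit_thousands.tvalues[1])  # identical t statistics
print(fit_dollars.rsquared, fit_thousands.rsquared)      # identical R-squared
```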

3 Beta Coefficients A z-score is: $z = \dfrac{x - \bar{x}}{s}$
In a data set, the data for each regression variable (independent and dependent) are converted to z-scores. Then, the regression is conducted.
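A minimal sketch of the procedure (simulated stand-ins for the foot measurements used on the next slides; numpy and statsmodels assumed): the beta coefficient is the slope from regressing the z-score of the response on the z-score of the predictor.

```python
# Sketch: convert response and predictor to z-scores, then regress.
import numpy as np
import statsmodels.api as sm

def zscore(v):
    return (v - v.mean()) / v.std(ddof=1)   # z = (x - xbar) / s

rng = np.random.default_rng(1)
foot_width = rng.normal(9.0, 0.5, size=60)
foot_length = 2.4 * foot_width + rng.normal(scale=1.0, size=60)

fit = sm.OLS(zscore(foot_length), sm.add_constant(zscore(foot_width))).fit()
print(fit.params)     # intercept is ~0; the slope is the beta coefficient
print(fit.rsquared)   # R-squared is unchanged by standardization
```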

4 Beta Coefficients – Example
Use the data set 4th Graders Feet. Regress foot length on foot width.
The regression equation is Foot Length = … Foot Width
Predictor    Coef    SE Coef    T    P
Constant
Foot Width
S = …    R-Sq = 41.1%    R-Sq(adj) = 39.5%

5 Beta Coefficients – Example
Use the data set 4th Graders Feet. Regress z-score of foot length on z-score of foot width.
The regression equation is zFoot Length = … zFoot Width
Predictor    Coef    SE Coef    T    P
Constant
zFoot Width
S = …    R-Sq = 41.1%    R-Sq(adj) = 39.5%

6 Using the Log of a Variable
Taking the log usually narrows the range of the variable – this can result in estimates that are less sensitive to outliers

7 Using the Log of a Variable
When a variable is a positive dollar amount, the log is usually taken. The log is also usually taken when a variable has large integer values: population, total number of employees, school enrollment, etc.
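A hedged illustration with simulated data (the variable names are invented): with log(salary) as the response, the slope on a level predictor is read as an approximate proportional change in salary per one-unit change in the predictor.

```python
# Sketch: log of a positive dollar amount as the response.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
education = rng.integers(8, 21, size=300).astype(float)    # years, left in levels
salary = np.exp(9.5 + 0.08 * education + rng.normal(scale=0.3, size=300))

fit = sm.OLS(np.log(salary), sm.add_constant(education)).fit()
print(fit.params[1])   # ~0.08, i.e. roughly an 8% higher salary per extra year
```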

8 Using the Log of a Variable
Variables that are measured in years, such as education, experience, and age, are usually left in their original form

9 Using the Log of a Variable
Proportions or percentages are usually left in original form because the coefficients are easier to interpret – a percentage-point change interpretation.

10 Modeling a Quadratic Effect
Consider the quadratic effect dataset. Want to predict Millions of retained impressions per week. Predictor is TV advertising budget, 1983 ($ millions). Model is: $mil = \beta_0 + \beta_1\,spend + u$

11 Consider the quadratic effect dataset
Want to predict Millions of retained impressions per week. Predictor is TV advertising budget, 1983 ($ millions).
The regression equation is MIL = … SPEND
Predictor    Coef    SE Coef    T    P
Constant
SPEND
S = …    R-Sq = 42.4%    R-Sq(adj) = 39.4%

12 Did you check your residual plots?
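One way to produce that residual check, as a sketch: curvature in the residuals-vs-fitted plot from the straight-line fit is the sign that a quadratic term is missing. The file name and the column names MIL and SPEND are assumptions, not the actual names on the course site.

```python
# Sketch of the residual check for the straight-line fit MIL on SPEND.
import pandas as pd
import matplotlib.pyplot as plt
import statsmodels.api as sm

ads = pd.read_csv("quadratic_effect.csv")     # hypothetical file name
fit = sm.OLS(ads["MIL"], sm.add_constant(ads["SPEND"])).fit()

plt.scatter(fit.fittedvalues, fit.resid)
plt.axhline(0, linestyle="--")
plt.xlabel("Fitted values")
plt.ylabel("Residuals")
plt.title("Residuals vs fitted, MIL on SPEND")
plt.show()
```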

13 Scatterplot – there is a quadratic effect too!

14 Modeling a Quadratic Effect
Consider the quadratic effect dataset. Want to predict Millions of retained impressions per week. Predictor is TV advertising budget, 1983 ($ millions). Add the quadratic effect to the model. Model is: $mil = \beta_0 + \beta_1\,spend + \beta_2\,spend^2 + u$
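A short sketch of fitting this quadratic model with statsmodels formula notation; the file and column names are again assumptions.

```python
# Sketch: add the squared term directly in the formula.
import pandas as pd
import statsmodels.formula.api as smf

ads = pd.read_csv("quadratic_effect.csv")        # hypothetical file name
quad_fit = smf.ols("MIL ~ SPEND + I(SPEND ** 2)", data=ads).fit()
print(quad_fit.summary())    # compare R-Sq and the residual plot to the linear fit
```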

15 Model is: $mil = \beta_0 + \beta_1\,spend + \beta_2\,spend^2 + u$
The regression equation is MIL = … SPEND … SPEND SQUARED
Predictor    Coef    SE Coef    T    P
Constant
SPEND
SPEND SQUARED
S = …    R-Sq = 53.0%    R-Sq(adj) = 47.7%

16 Did you check your residual plots?

17 Modeling a Quadratic Effect
The interpretation of the quadratic term, $\beta_2$, depends on whether the linear term, $\beta_1$, is positive or negative.

18 The graph above and on the left shows an equation with a positive linear term to set the frame of reference. When the quadratic term is also positive, then the net effect is a greater than linear increase (see the middle graph). The interesting case is when the quadratic term is negative (the right graph). In this case, the linear and quadratic term compete with one another. The increase is less than linear because the quadratic term is exerting a downward force on the equation. Eventually, the trend will level off and head downward. In some situations, the place where the equation levels off is beyond the maximum of the data.
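One detail worth making explicit for the negative-quadratic case (a standard algebra fact, not stated on the slide): with the fitted curve $mil = \beta_0 + \beta_1\,spend + \beta_2\,spend^2$, the leveling-off point is the vertex $spend^* = -\beta_1/(2\beta_2)$. Whether that value lies inside or beyond the observed range of spend determines whether the predicted downturn is ever actually seen in the data.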

19 Quadratic Effect Example
Consider the dataset MILEAGE (on my website). Create a model to predict MPG.

20 More on R2 R2 does not indicate whether
the independent variables are a true cause of the changes in the dependent variable
omitted-variable bias exists
the correct regression was used
the most appropriate set of independent variables has been chosen
there is collinearity present in the data on the explanatory variables
the model might be improved by using transformed versions of the existing set of independent variables

21 More on R2 But, R2 has an easy interpretation:
The percentage of the variability in the dependent variable that is explained by the regression.

22 Adjusted R2
A modification of R2 that adjusts for the number of explanatory terms in the model. Adjusted R2 increases only if the new term improves the model sufficiently, which implies adjusted R2 can rise or fall after the addition of a new term to the model.
Definition: $\bar{R}^2 = 1 - (1 - R^2)\dfrac{n-1}{n-p-1}$, where n is the sample size and p is the total number of predictors in the model
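A small helper that reproduces the definition above. The check uses n = 21, which is not stated on the slides but is the sample size consistent with the reported R-Sq and R-Sq(adj) values for the advertising fits.

```python
# Adjusted R-squared from R-squared, sample size n, and number of predictors p.
def adjusted_r2(r2, n, p):
    """n = sample size, p = number of predictors (excluding the intercept)."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# n = 21 is inferred from the slides' output, p = 2 for the quadratic fit, p = 1 for the linear fit.
print(adjusted_r2(0.530, 21, 2))   # ~0.478, matching R-Sq(adj) = 47.7%
print(adjusted_r2(0.424, 21, 1))   # ~0.394, matching R-Sq(adj) = 39.4%
```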

23 Adjusted R2 – Example
Use the MILEAGE data set. Regress MPG on HP, WT, SP.
What are the R2 and the adjusted R2? Now regress MPG on HP, WT, SP, and VOL.
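A hedged sketch of this comparison in statsmodels; the file name and the exact column names in the MILEAGE data set are assumptions.

```python
# Compare R-squared and adjusted R-squared for the two candidate models.
import pandas as pd
import statsmodels.formula.api as smf

mileage = pd.read_csv("mileage.csv")             # hypothetical file name

small = smf.ols("MPG ~ HP + WT + SP", data=mileage).fit()
large = smf.ols("MPG ~ HP + WT + SP + VOL", data=mileage).fit()

print(small.rsquared, small.rsquared_adj)
print(large.rsquared, large.rsquared_adj)   # R2 cannot fall; adjusted R2 can
```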

24 Prediction Intervals
Use the MILEAGE data set. Regress MPG on HP.
We want to create a prediction of MPG at an HP of 200. Minitab gives:
New Obs    Fit    SE Fit    95% CI          95% PI
                            (19.853,  )     (9.741,  )

25 Prediction Intervals Difference between the 95% CI and the 95% PI
Confidence interval of the prediction: represents the range in which the mean response is likely to fall, given specified settings of the predictors.
Prediction interval: represents the range in which a single new observation is likely to fall, given specified settings of the predictors.
New Obs    Fit    SE Fit    95% CI          95% PI
                            (19.853,  )     (9.741,  )
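As a sketch of how both intervals come out of statsmodels (column names in the MILEAGE data are assumed): summary_frame reports mean_ci_lower/upper for the confidence interval of the mean response and obs_ci_lower/upper for the prediction interval of a single new observation.

```python
# Confidence interval vs prediction interval at HP = 200.
import pandas as pd
import statsmodels.formula.api as smf

mileage = pd.read_csv("mileage.csv")             # hypothetical file name
fit = smf.ols("MPG ~ HP", data=mileage).fit()

pred = fit.get_prediction(pd.DataFrame({"HP": [200]}))
print(pred.summary_frame(alpha=0.05))
```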

26 Prediction Intervals

27 Prediction Intervals
The model has its best predictive properties – the narrowest interval – at the means of the predictors. Predict MPG from HP at the mean of HP.

