REGRESSION (CONTINUED)

LECTURE 4: REGRESSION (CONTINUED). Analysis of Variance; Standard Errors & Confidence Intervals; Prediction Intervals; Examination of Residuals. Supplementary readings: Wilks, chapters 6 and 9; Bevington, P.R., and Robinson, D.K., Data Reduction and Error Analysis for the Physical Sciences, McGraw-Hill, 1992.

Recall from last time… Define the residuals e_i = y_i − ŷ_i = y_i − (a + b x_i); we call these the residuals. What should we require of them?

Recall from last time… What should we require of them? We require that they be GAUSSIAN (and, as we will see, independent with constant variance).

Recall from last time… Analysis of Variance (“ANOVA”)? [Figure: Gaussian data, n = 5]

Analysis of Variance (“ANOVA”): the partition of the total variation into a regression part and a residual part is guaranteed by the linear regression procedure. Why “n − 2”? Because two parameters (the intercept a and the slope b) are estimated from the data, leaving n − 2 degrees of freedom for the residuals.

Analysis of Variance (“ANOVA”). Define: SST = Σ (y_i − ȳ)² (total sum of squares), SSR = Σ (ŷ_i − ȳ)² (regression sum of squares), and SSE = Σ (y_i − ŷ_i)² (residual, or error, sum of squares), so that SST = SSR + SSE.

Analysis of Variance (“ANOVA”). Define the mean squares MSR = SSR/1 and MSE = SSE/(n − 2) = s_e²; their ratio F = MSR/MSE is compared against the F distribution with 1 and n − 2 degrees of freedom.

Analysis of Variance (“ANOVA”), F ratio with 1 and n − 2 degrees of freedom:

Source        df     SS     MS            F-test
Total         n-1    SST
Regression    1      SSR    MSR = SSR     MSR/MSE
Residual      n-2    SSE    MSE = s_e^2

Analysis of Variance (“ANOVA”) for Simple Linear Regression:

Source        df     SS     MS            F-test
Total         n-1    SST
Regression    1      SSR    MSR = SSR     MSR/MSE
Residual      n-2    SSE    MSE = s_e^2

We’ll discuss ANOVA further in the next lecture (“multivariate regression”).
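To make the table concrete, here is a minimal Python sketch of the ANOVA decomposition for simple linear regression; numpy is assumed, and the x, y values are placeholder data, not from the lecture:

    import numpy as np

    # Hypothetical paired sample (placeholder values)
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
    y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])
    n = len(y)

    # Least-squares fit y_hat = a + b*x
    b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    a = y.mean() - b * x.mean()
    y_hat = a + b * x

    # ANOVA decomposition: SST = SSR + SSE
    SST = np.sum((y - y.mean()) ** 2)        # total sum of squares, df = n - 1
    SSR = np.sum((y_hat - y.mean()) ** 2)    # regression sum of squares, df = 1
    SSE = np.sum((y - y_hat) ** 2)           # residual sum of squares, df = n - 2

    MSR = SSR / 1.0
    MSE = SSE / (n - 2)                      # = s_e**2
    F = MSR / MSE                            # compared against F(1, n - 2)
    print(f"SST={SST:.3f}  SSR={SSR:.3f}  SSE={SSE:.3f}  F={F:.2f}")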

‘Goodness of Fit’

‘Goodness of Fit’. For simple linear regression, the coefficient of determination is R² = SSR/SST = 1 − SSE/SST, the fraction of the total variance explained by the regression (equal to the squared linear correlation, r²).

‘Goodness of Fit’. Outside the “support” of the regression (the range of predictor values used in the fit), the fitted relationship cannot, in general, be relied upon.

‘Goodness of Fit’: Reliability; Bias.

Analysis of Variance (“ANOVA”). Under Gaussian assumptions, the linear-regression estimates of the parameters a and b are unbiased estimates of the means of Gaussian distributions, where the standard errors in the regression parameters are s_b = s_e / sqrt(Σ (x_i − x̄)²) and s_a = s_e sqrt(1/n + x̄² / Σ (x_i − x̄)²).

Confidence Intervals. The estimated regression slope ‘b’ is likely to lie within some range of the true slope; with confidence (1 − α), that range is b ± t_{α/2, n−2} s_b.

Confidence Intervals. This naturally defines a t test for the presence of a trend: t = b / s_b, compared against the t distribution with n − 2 degrees of freedom (reject the null hypothesis of no trend when |t| is large).
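A short Python sketch of the slope standard error, the 95% confidence interval, and the t test for a trend; numpy and scipy are assumed, and the data are the same hypothetical values used in the ANOVA sketch above:

    import numpy as np
    from scipy import stats

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # hypothetical sample
    y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])
    n = len(y)

    b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    a = y.mean() - b * x.mean()
    resid = y - (a + b * x)

    s_e = np.sqrt(np.sum(resid ** 2) / (n - 2))        # residual standard error
    s_b = s_e / np.sqrt(np.sum((x - x.mean()) ** 2))   # standard error of the slope

    t_crit = stats.t.ppf(0.975, df=n - 2)              # two-sided 95% quantile
    print(f"95% CI for slope: {b - t_crit * s_b:.3f} to {b + t_crit * s_b:.3f}")

    t_stat = b / s_b                                   # t test for a trend (H0: slope = 0)
    p_val = 2 * stats.t.sf(abs(t_stat), df=n - 2)
    print(f"t = {t_stat:.2f}, p = {p_val:.4f}")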

Prediction Intervals. The MSE of a predicted value (the ‘prediction error’, s_y²) is larger than the nominal MSE, and it increases as the predictand departs from the mean: s_y² = s_e² [1 + 1/n + (x0 − x̄)² / Σ (x_i − x̄)²]. Note that s_y approaches s_e as the ‘training’ sample becomes large.
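A minimal sketch of the prediction standard error s_y and the corresponding 95% prediction interval at a new predictor value; numpy and scipy are assumed, and x, y, and x0 are hypothetical:

    import numpy as np
    from scipy import stats

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # hypothetical 'training' sample
    y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])
    n = len(y)

    b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    a = y.mean() - b * x.mean()
    s_e = np.sqrt(np.sum((y - (a + b * x)) ** 2) / (n - 2))

    x0 = 8.0                                        # hypothetical new predictor value
    # s_y grows as x0 moves away from the mean of x, and approaches s_e for large n
    s_y = s_e * np.sqrt(1 + 1 / n + (x0 - x.mean()) ** 2 / np.sum((x - x.mean()) ** 2))

    t_crit = stats.t.ppf(0.975, df=n - 2)
    y0 = a + b * x0
    print(f"95% prediction interval at x0 = {x0}: "
          f"{y0 - t_crit * s_y:.2f} to {y0 + t_crit * s_y:.2f}")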

Linear Correlation. The correlation ‘r’ suffers from sampling error both in the regression slope and in the estimates of variance…

Linear Correlation Coefficient: r = Σ (x_i − x̄)(y_i − ȳ) / sqrt( Σ (x_i − x̄)² · Σ (y_i − ȳ)² ).
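For illustration, a small Python sketch of the sample correlation coefficient and its relation to the regression ‘goodness of fit’ (r² = SSR/SST for simple linear regression); numpy is assumed and the data are placeholders:

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # hypothetical sample
    y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])

    # Pearson correlation coefficient
    r = np.sum((x - x.mean()) * (y - y.mean())) / np.sqrt(
        np.sum((x - x.mean()) ** 2) * np.sum((y - y.mean()) ** 2))

    # For simple linear regression, r**2 equals SSR/SST (the 'goodness of fit')
    b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    y_hat = y.mean() + b * (x - x.mean())
    R2 = np.sum((y_hat - y.mean()) ** 2) / np.sum((y - y.mean()) ** 2)
    print(f"r = {r:.4f}, r^2 = {r ** 2:.4f}, SSR/SST = {R2:.4f}")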

Examining Residuals: Heteroscedasticity. A trend in the residual variance violates the assumption of identically distributed Gaussian residuals…

Examining Residuals: Heteroscedasticity. Often a simple transformation of the original data will yield residuals that are more nearly Gaussian…
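One common example, sketched below under the assumption that the spread of y grows with its level: fitting log(y) instead of y often stabilizes the residual variance. The data are hypothetical and numpy is assumed:

    import numpy as np

    # Hypothetical data whose scatter grows with the level of y
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
    y = np.array([2.2, 2.7, 4.1, 5.5, 8.9, 10.2, 17.5, 22.0])

    def ols_residuals(x, y):
        """Residuals from an ordinary least-squares fit y = a + b*x."""
        b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
        a = y.mean() - b * x.mean()
        return y - (a + b * x)

    res_raw = ols_residuals(x, y)           # heteroscedastic residuals
    res_log = ols_residuals(x, np.log(y))   # often closer to constant variance

    # Compare residual spread in the lower and upper halves of the x range
    half = len(x) // 2
    print("raw:", res_raw[:half].std(), res_raw[half:].std())
    print("log:", res_log[:half].std(), res_log[half:].std())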

Examining Residuals: Leverage. High-leverage points can still be a problem!

Examining Residuals: Autocorrelation. The Durbin-Watson statistic, d = Σ_t (e_t − e_{t−1})² / Σ_t e_t², tests the residuals for first-order serial correlation; d is near 2 when the residuals are uncorrelated.
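A minimal sketch of the Durbin-Watson statistic applied to a hypothetical residual series (numpy assumed):

    import numpy as np

    def durbin_watson(residuals):
        """Durbin-Watson statistic: sum of squared successive differences of the
        residuals divided by the residual sum of squares."""
        e = np.asarray(residuals, dtype=float)
        return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

    # Hypothetical residual series with visible positive autocorrelation
    e = np.array([0.9, 0.7, 0.6, 0.2, -0.3, -0.5, -0.6, -0.2, 0.1, 0.4])
    print(f"Durbin-Watson d = {durbin_watson(e):.2f}  (d near 2 means little autocorrelation)")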

Examining Residuals: Autocorrelation. Suppose we have the simple (‘first-order autoregressive’) model e_t = ρ e_{t−1} + w_t, where w_t is white noise. Then we can still use all of the results based on Gaussian statistics, but with the modified (effective) sample size n′ = n (1 − ρ) / (1 + ρ). For example, with ρ = 0.5 the effective sample size is n′ = n/3.
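A small sketch of this adjustment, assuming the intended formula is the common first-order correction n′ = n (1 − ρ1)/(1 + ρ1) with ρ1 the lag-1 autocorrelation (the slide's own expression is not shown in the transcript); numpy is assumed and the residual series is hypothetical:

    import numpy as np

    def lag1_autocorrelation(e):
        """Sample lag-1 autocorrelation of a series (about its mean)."""
        e = np.asarray(e, dtype=float) - np.mean(e)
        return np.sum(e[1:] * e[:-1]) / np.sum(e ** 2)

    def effective_sample_size(n, rho1):
        """First-order adjustment n' = n * (1 - rho1) / (1 + rho1)."""
        return n * (1 - rho1) / (1 + rho1)

    # Hypothetical residual series
    e = np.array([0.9, 0.7, 0.6, 0.2, -0.3, -0.5, -0.6, -0.2, 0.1, 0.4])
    rho1 = lag1_autocorrelation(e)
    n_eff = effective_sample_size(len(e), rho1)
    print(f"rho1 = {rho1:.2f}, effective n = {n_eff:.1f} (nominal n = {len(e)})")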

Examining Residuals: Autocorrelation. For the same first-order autoregressive model, the Gaussian-based results still apply with a modified sample size, but the appropriate adjustment is different for tests of variance.

Examining Residuals: Autocorrelation. The appropriate effective-sample-size adjustment is different again for correlations.

Examining Residuals: Autocorrelation. Suppose we have the simple (‘first-order autoregressive’) model e_t = ρ e_{t−1} + w_t. We can remove the serial correlation through pre-whitening of the series (one standard approach is quasi-differencing, y′_t = y_t − ρ y_{t−1}).
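A sketch of one standard way to do this, Cochrane-Orcutt-style quasi-differencing; this is an assumption, since the slide's own method is not transcribed. numpy is assumed and the series are synthetic:

    import numpy as np

    def quasi_difference(z, rho):
        """Form z'_t = z_t - rho * z_(t-1); drops the first observation."""
        z = np.asarray(z, dtype=float)
        return z[1:] - rho * z[:-1]

    # Hypothetical series: a linear trend plus AR(1) noise with rho = 0.6
    rng = np.random.default_rng(0)
    x = np.arange(20, dtype=float)
    noise = np.zeros(20)
    for t in range(1, 20):
        noise[t] = 0.6 * noise[t - 1] + rng.normal(scale=0.5)
    y = 1.0 + 0.5 * x + noise

    rho = 0.6                                # in practice, estimated from the residuals
    x_p = quasi_difference(x, rho)
    y_p = quasi_difference(y, rho)

    # Refit on the transformed series; its residuals should be close to white noise
    b = np.sum((x_p - x_p.mean()) * (y_p - y_p.mean())) / np.sum((x_p - x_p.mean()) ** 2)
    a = y_p.mean() - b * x_p.mean()
    print(f"slope on quasi-differenced series: {b:.3f} (true slope used above: 0.5)")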