Presentation transcript:

Model: Y_i = α + β·x_i + ε_i, where Y is the dependent (response) variable, x is the independent (control) variable, and ε is the random error.

Raw data: n pairs (x_1, y_1), (x_2, y_2), …, (x_n, y_n).

Assumption: the ε_i's are independent, normally distributed random variables with mean 0 and (common) variance σ².
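As a concrete illustration of these assumptions, here is a minimal simulation sketch in Python (the parameter values α, β, σ and the x grid below are made up for illustration, not taken from the slides):

import numpy as np

rng = np.random.default_rng(seed=0)

alpha, beta, sigma = 0.5, 0.003, 0.1       # hypothetical true parameters
x = np.linspace(20, 380, 10)               # hypothetical control (independent) values
eps = rng.normal(0.0, sigma, size=x.size)  # independent N(0, sigma^2) random errors
y = alpha + beta * x + eps                 # observed responses Y_i = alpha + beta*x_i + eps_i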

Estimators for the slope β, the intercept α, and the Y value for a given x. Are these estimators unbiased? Confidence intervals for the true parameters and for the Y(x) value; estimation of E[Y(x)]. Hypothesis testing. Quality of the model; is there a better model? (Sec. 11.2)

Example data (table of values shown on the slide): air velocity x (cm/s) versus evaporation coefficient y (mm²/s). Questions: Does a relationship exist? Is the relationship linear? What is the prediction for x = 190 cm/s, and what is its error?

Computation table (numerical values shown on the slide): columns x, y, x², y², xy and their sums over the n observations; from these, S_xx, S_yy, and S_xy (= 505.4) are computed, and then the slope b and the intercept a.
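A minimal sketch of this computation in Python, assuming x and y are arrays holding the raw data (the helper name least_squares_fit is mine, not from the slides):

import numpy as np

def least_squares_fit(x, y):
    """Return (a, b, Sxx, Syy, Sxy) for the least-squares line yhat = a + b*x."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = x.size
    Sxx = np.sum(x**2) - np.sum(x)**2 / n            # corrected sum of squares of x
    Syy = np.sum(y**2) - np.sum(y)**2 / n            # corrected sum of squares of y
    Sxy = np.sum(x * y) - np.sum(x) * np.sum(y) / n  # corrected sum of cross products
    b = Sxy / Sxx                                    # slope
    a = y.mean() - b * x.mean()                      # intercept
    return a, b, Sxx, Syy, Sxy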

Residuals: e_i = y_i − ŷ_i, the differences between the observed values and the fitted values ŷ_i = a + b·x_i. Residual sum of squares (or error sum of squares): SSE = Σ e_i². Estimator for σ²: s_e² = SSE / (n − 2). Standard error of the estimate: s_e = √(SSE / (n − 2)).
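Continuing the sketch above (again, the helper name standard_error_of_estimate is mine), these quantities could be computed as:

import numpy as np

def standard_error_of_estimate(x, y, a, b):
    """Return the residuals, SSE, and s_e = sqrt(SSE / (n - 2))."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    resid = y - (a + b * x)           # e_i = y_i - yhat_i
    sse = np.sum(resid**2)            # residual (error) sum of squares
    se = np.sqrt(sse / (x.size - 2))  # estimates sigma, df = n - 2
    return resid, sse, se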

Distribution of b (as a statistic): b is normally distributed with mean β and variance σ² / S_xx. The confidence interval for the true population slope β is b ± t_{α/2} · s_e / √S_xx, where the t-distribution has df = n − 2. To test the null hypothesis H_0: β = β_0, use the statistic t = (b − β_0) / (s_e / √S_xx) (with df = n − 2). Only when the null hypothesis H_0: β = 0 is rejected should the independent variable x be kept in the model.
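A sketch of the slope inference in Python, assuming b, se, Sxx, and n come from the helpers above (slope_inference is a hypothetical name):

import numpy as np
from scipy import stats

def slope_inference(b, se, Sxx, n, beta0=0.0, conf=0.95):
    """Confidence interval for beta and t test of H0: beta = beta0 (df = n - 2)."""
    sb = se / np.sqrt(Sxx)                             # standard error of the slope b
    tcrit = stats.t.ppf(1 - (1 - conf) / 2, df=n - 2)  # critical t value
    ci = (b - tcrit * sb, b + tcrit * sb)
    t = (b - beta0) / sb
    p = 2 * stats.t.sf(abs(t), df=n - 2)               # two-sided p-value
    return ci, t, p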

Distribution of a (as a statistic): a is normally distributed with mean α and variance σ² · (1/n + x̄² / S_xx). The confidence interval for the true population intercept α is a ± t_{α/2} · s_e · √(1/n + x̄² / S_xx), where the t-distribution has df = n − 2. To test the null hypothesis H_0: α = α_0, use the statistic t = (a − α_0) / (s_e · √(1/n + x̄² / S_xx)) (with df = n − 2). Only when the null hypothesis H_0: α = 0 is rejected should a nonzero intercept be kept in the model.
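The analogous sketch for the intercept (intercept_inference is a hypothetical name; xbar is the sample mean of x):

import numpy as np
from scipy import stats

def intercept_inference(a, se, xbar, Sxx, n, alpha0=0.0, conf=0.95):
    """Confidence interval for alpha and t test of H0: alpha = alpha0 (df = n - 2)."""
    sa = se * np.sqrt(1.0 / n + xbar**2 / Sxx)         # standard error of the intercept a
    tcrit = stats.t.ppf(1 - (1 - conf) / 2, df=n - 2)  # critical t value
    ci = (a - tcrit * sa, a + tcrit * sa)
    t = (a - alpha0) / sa
    p = 2 * stats.t.sf(abs(t), df=n - 2)               # two-sided p-value
    return ci, t, p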

Inference about E[Y(x)] = α + β·x, the mean of the response for a given x. The confidence interval for E[Y(x)] is (a + b·x) ± t_{α/2} · s_e · √(1/n + (x − x̄)² / S_xx), where the t-distribution has df = n − 2.
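A sketch of this interval in Python (mean_response_ci is a hypothetical name; the inputs come from the earlier helpers):

import numpy as np
from scipy import stats

def mean_response_ci(x0, a, b, se, xbar, Sxx, n, conf=0.95):
    """Confidence interval for E[Y(x0)] = alpha + beta * x0 (df = n - 2)."""
    yhat = a + b * x0
    half = (stats.t.ppf(1 - (1 - conf) / 2, df=n - 2)
            * se * np.sqrt(1.0 / n + (x0 - xbar)**2 / Sxx))
    return yhat - half, yhat + half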

To predict the “future” observation Y(x), we use the prediction interval (a + b·x) ± t_{α/2} · s_e · √(1 + 1/n + (x − x̄)² / S_xx), where the t-distribution has df = n − 2. Note: the interval for predicting Y(x) is wider than the confidence interval for E[Y(x)], and the widths of both intervals depend on the x value (limits of prediction).
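The corresponding sketch (prediction_interval is a hypothetical name); the only change from the mean-response interval is the extra “1 +” under the square root:

import numpy as np
from scipy import stats

def prediction_interval(x0, a, b, se, xbar, Sxx, n, conf=0.95):
    """Prediction interval for a future observation Y(x0) (df = n - 2)."""
    yhat = a + b * x0
    half = (stats.t.ppf(1 - (1 - conf) / 2, df=n - 2)
            * se * np.sqrt(1.0 + 1.0 / n + (x0 - xbar)**2 / Sxx))
    return yhat - half, yhat + half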

Excel outputs: the estimators a and b, the t-scores for the hypothesis tests, the p-values for the hypothesis tests, and 95% confidence intervals for α and β. To draw conclusions, compare the t-score with the critical t-value(s), compare the p-value with the significance level α, or check whether the confidence interval contains zero.
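The same quantities can also be obtained in Python with statsmodels; a minimal sketch, assuming x and y hold the observed data arrays (for example, the simulated x and y from the first sketch above). This is an alternative to the Excel output, not part of the slides:

import numpy as np
import statsmodels.api as sm

X = sm.add_constant(np.asarray(x, dtype=float))    # adds the column of 1s for the intercept
fit = sm.OLS(np.asarray(y, dtype=float), X).fit()  # least-squares fit y = a + b*x
print(fit.summary())                               # a and b, t-scores, p-values, 95% CIs
print(fit.conf_int(alpha=0.05))                    # 95% confidence intervals for alpha and beta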