Simple Linear Regression Lecture for Statistics 509, November-December 2000

Slide 1: Simple Linear Regression. Lecture for Statistics 509, November-December 2000

Slide 2: Correlation and Regression

Correlation and regression study the association or relationship between variables. They are useful for determining the effect of changes in one variable (called the independent or control variable) on another variable (called the dependent or response variable). Regression models can be used to determine optimal operating conditions, specified through the control variables, that achieve a target value or yield of the response variable. They can also be used to predict the value of the response for a given value of the independent variable, or to "calibrate" the value of the independent variable needed to achieve a certain response.

Slide 3: Some Examples

- Control variable X = average speed of a car; response variable Y = fuel efficiency of the car. The goal is to determine the speed that optimizes the car's efficiency.
- Control variable X = temperature; response variable Y = yield of a chemical reaction.
- Control variable X = amount of fertilizer applied to a plant; response variable Y = yield of the plant.
- Control variable X = thickness of a stack of bond paper; response variable Y = number of sheets in the stack.
- Control variable X = average study time; response variable Y = GPA.

Slide 4: Population Model

Each member of the population has a value of the independent variable X and of the response variable Y, usually represented by the vector (X, Y). For a given value X = x, the variable Y has a distribution whose conditional mean is μ(x) and whose conditional variance is σ²(x). This can be visualized as follows: consider the subpopulation of units whose X-value equals x; their Y-values follow a distribution with mean μ(x) and variance σ²(x). When you pick a unit from this subpopulation, the Y-value you observe is governed by that distribution. In particular, the observation can be expressed as Y = μ(x) + ε, where ε is some "error term."
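In symbols, the slide's description of the conditional mean, the conditional variance, and the error term amounts to the following (a restatement, not additional slide content):

```latex
\mu(x) = E(Y \mid X = x), \qquad
\sigma^2(x) = \operatorname{Var}(Y \mid X = x), \qquad
Y = \mu(x) + \varepsilon,
\ \text{where}\
E(\varepsilon \mid X = x) = 0
\ \text{and}\
\operatorname{Var}(\varepsilon \mid X = x) = \sigma^2(x).
```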

Slide 5: Assumptions for Simple Linear Regression

- μ(x) = E(Y | X = x) = β₀ + β₁x. This means that the mean of Y, given X = x, is a linear function of x. β₁ is called the regression coefficient or the slope of the regression line; β₀ is the y-intercept.
- σ²(x) = σ² does not depend on x. This is the assumption of "equal variances," or homoscedasticity.
- Furthermore, for the sample data (x₁, Y₁), (x₂, Y₂), …, (xₙ, Yₙ): Y₁, Y₂, …, Yₙ are independent observations, and their conditional distributions are all normal.
- In shorthand notation: Yᵢ = μ(xᵢ) + εᵢ = β₀ + β₁xᵢ + εᵢ, i = 1, 2, …, n, where ε₁, ε₂, …, εₙ are independent and identically distributed (IID) N(0, σ²).
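As a concrete illustration of these assumptions, here is a minimal simulation sketch in Python; the values of β₀, β₁, σ and the x grid are hypothetical, chosen only for illustration, not taken from the lecture.

```python
# A minimal simulation sketch of the SLR assumptions on this slide.
# beta0, beta1, sigma and the x grid are hypothetical illustration values.
import numpy as np

rng = np.random.default_rng(509)

beta0, beta1, sigma = 2.0, 0.5, 1.0   # hypothetical true intercept, slope, error SD
x = np.linspace(0, 10, 25)            # fixed (controlled) values of the x variable

# Independent N(0, sigma^2) errors: normality, homoscedasticity, independence
eps = rng.normal(loc=0.0, scale=sigma, size=x.size)

# Y_i = beta0 + beta1 * x_i + eps_i
y = beta0 + beta1 * x + eps
```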

Slide 6: Regression Problem

Given the sample (bivariate) data (x₁, Y₁), (x₂, Y₂), …, (xₙ, Yₙ), satisfying the linear regression model Yᵢ = β₀ + β₁xᵢ + εᵢ with ε₁, ε₂, …, εₙ IID N(0, σ²), we would like to address the following questions:
- How should the data be summarized graphically?
- What are the estimators of the parameters β₀, β₁, and σ²?
- What is the estimated prediction line?
- What are the properties of the estimators of the model parameters?
- How do we test whether the fitted regression model is significant?
- How do we construct confidence intervals or test hypotheses concerning the parameters?
- How do we perform prediction using the fitted model?
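A minimal sketch of the standard least-squares answers to the estimation questions above; fit_slr is a hypothetical helper name, and x, y are assumed to be NumPy arrays of equal length.

```python
# A minimal sketch of the least-squares estimators for the SLR model.
# fit_slr is a hypothetical helper; x and y are NumPy arrays of equal length.
import numpy as np

def fit_slr(x, y):
    n = x.size
    xbar, ybar = x.mean(), y.mean()
    Sxx = np.sum((x - xbar) ** 2)           # sum of squares of x about its mean
    Sxy = np.sum((x - xbar) * (y - ybar))   # cross-product sum

    b1 = Sxy / Sxx            # estimated slope (beta1-hat)
    b0 = ybar - b1 * xbar     # estimated intercept (beta0-hat)

    resid = y - (b0 + b1 * x)               # residuals about the fitted line
    s2 = np.sum(resid ** 2) / (n - 2)       # SSE/(n-2): estimate of sigma^2
    return b0, b1, s2
```

Applied to data simulated under the model on the previous slide, b0, b1, and s2 should land close to the hypothetical β₀, β₁, and σ² used there.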

Slide 7: Illustrative Example: On Plasma Etching

Plasma etching is essential to fine-line pattern transfer in current semiconductor processes. The paper "Ion Beam-Assisted Etching of Aluminum with Chlorine" in J. Electrochem. Soc. (1985) gives data on chlorine flow (x, in SCCM) through a nozzle used in the etching mechanism and on etch rate (y, in 100 Å/min).

Slide 8: The Scatterplot

Slide 9: Least-Squares Prediction Line

Slide 10

Slide 11

Slide 12: Analysis of Variance Table
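This slide shows the ANOVA table for the fitted regression. Below is a hedged sketch of the standard decomposition behind such a table, SST = SSR + SSE, with the overall F test on 1 and n-2 degrees of freedom; anova_table is a hypothetical helper, and b0, b1 are least-squares estimates such as those returned by fit_slr above.

```python
# A hedged sketch of the ANOVA decomposition for simple linear regression:
# SST = SSR + SSE, with F = MSR/MSE on (1, n-2) degrees of freedom.
# anova_table is a hypothetical helper; b0, b1 are the least-squares estimates.
import numpy as np
from scipy import stats

def anova_table(x, y, b0, b1):
    n = y.size
    yhat = b0 + b1 * x
    ybar = y.mean()

    SST = np.sum((y - ybar) ** 2)    # total sum of squares, n-1 df
    SSE = np.sum((y - yhat) ** 2)    # error (residual) sum of squares, n-2 df
    SSR = SST - SSE                  # regression sum of squares, 1 df

    MSR = SSR / 1.0
    MSE = SSE / (n - 2)
    F = MSR / MSE                    # overall F statistic for H0: beta1 = 0
    p_value = stats.f.sf(F, 1, n - 2)
    return {"SSR": SSR, "SSE": SSE, "SST": SST, "F": F, "p": p_value}
```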

Slide 13

Slide 14: Excel Worksheet for Regression Computations

Slide 15: Regression Analysis from Minitab

Slide 16: Fitted Line in Scatterplot with Bands
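The bands on this slide are, in the usual treatment, a confidence band for the mean response E(Y | x₀) and a wider prediction band for a new observation at x₀. Below is a minimal sketch under that assumption; slr_bands is a hypothetical helper, and b0, b1, s2 are the least-squares estimates (e.g. from fit_slr above).

```python
# A minimal sketch of the bands around the fitted line: a confidence interval
# for the mean response at x0 and a wider prediction interval for a new
# observation at x0. slr_bands is a hypothetical helper name.
import numpy as np
from scipy import stats

def slr_bands(x, b0, b1, s2, x0, conf=0.95):
    n = x.size
    xbar = x.mean()
    Sxx = np.sum((x - xbar) ** 2)
    t = stats.t.ppf(1 - (1 - conf) / 2, df=n - 2)    # t critical value, n-2 df

    yhat0 = b0 + b1 * x0                              # fitted value at x0
    se_mean = np.sqrt(s2 * (1 / n + (x0 - xbar) ** 2 / Sxx))        # SE of mean response
    se_pred = np.sqrt(s2 * (1 + 1 / n + (x0 - xbar) ** 2 / Sxx))    # SE for a new Y at x0

    ci = (yhat0 - t * se_mean, yhat0 + t * se_mean)   # confidence interval for E(Y | x0)
    pi = (yhat0 - t * se_pred, yhat0 + t * se_pred)   # prediction interval for a new Y
    return ci, pi
```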