Ch12.1 Simple Linear Regression


Ch12.1 The Simple Linear Regression Model

There exist parameters β₀, β₁, and σ² such that, for any fixed value of the independent variable x, the dependent variable Y is related to x through the model equation

Y = β₀ + β₁x + ε

where ε is a random variable (called the random deviation) with E(ε) = 0 and V(ε) = σ². One can see how the dependent variable is related to the independent variable with a scatter plot.
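As an illustration of the model equation, here is a minimal Python sketch (not part of the original slides) that simulates data from Y = β₀ + β₁x + ε and draws the scatter plot mentioned above; the values β₀ = 2, β₁ = 0.5, σ = 1 are made up for illustration, and NumPy and Matplotlib are assumed to be available.

```python
# Simulate n observations from the model y = beta0 + beta1*x + eps,
# where eps ~ Normal(0, sigma^2), and inspect the relationship with a scatter plot.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

beta0, beta1, sigma = 2.0, 0.5, 1.0   # illustrative parameter values (not from the slides)
n = 50
x = rng.uniform(0, 10, size=n)        # fixed values of the independent variable
eps = rng.normal(0, sigma, size=n)    # random deviation with E(eps) = 0, V(eps) = sigma^2
y = beta0 + beta1 * x + eps           # dependent variable generated by the model

plt.scatter(x, y)
plt.xlabel("x (independent variable)")
plt.ylabel("y (dependent variable)")
plt.title("Scatter plot of simulated data")
plt.show()
```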

Ch12.2 Estimating Model Parameters

Principle of Least Squares: The vertical deviation of the point (xᵢ, yᵢ) from the line y = b₀ + b₁x is yᵢ − (b₀ + b₁xᵢ). The sum of squared vertical deviations from the points (x₁, y₁), …, (xₙ, yₙ) to the line is

f(b₀, b₁) = Σᵢ [yᵢ − (b₀ + b₁xᵢ)]²

The least squares estimates β̂₀ and β̂₁ are the values of b₀ and b₁ that minimize f(b₀, b₁).
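Here is a minimal Python sketch of the least-squares criterion, using a small made-up data set; the helper name sum_squared_deviations and the data values are assumptions for illustration only.

```python
# Sum of squared vertical deviations f(b0, b1) = sum_i [y_i - (b0 + b1*x_i)]^2
# for a candidate line y = b0 + b1*x; the least-squares line is the minimizer.
import numpy as np

def sum_squared_deviations(b0, b1, x, y):
    deviations = y - (b0 + b1 * x)   # vertical deviations of each point from the line
    return np.sum(deviations ** 2)

# Tiny illustrative data set (made up for this sketch)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9])

# Compare an arbitrary candidate line with the least-squares fit
print(sum_squared_deviations(1.0, 1.0, x, y))          # arbitrary guess
b1_hat, b0_hat = np.polyfit(x, y, 1)                   # least-squares slope, intercept
print(sum_squared_deviations(b0_hat, b1_hat, x, y))    # smaller sum of squares
```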

The error sum of squares, denoted SSE, is

SSE = Σᵢ (yᵢ − ŷᵢ)²

and the estimate of σ² is

σ̂² = s² = SSE / (n − 2)

A computational formula for SSE is

SSE = Σ yᵢ² − β̂₀ Σ yᵢ − β̂₁ Σ xᵢyᵢ

The total sum of squares, denoted SST, is

SST = Σᵢ (yᵢ − ȳ)²

The coefficient of determination, denoted by r², is given by

r² = 1 − SSE/SST
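The quantities on this slide can be computed directly from their definitions, as in the following Python sketch; the data set is the same made-up one as before, and np.polyfit is used only as a convenient way to obtain the least-squares coefficients.

```python
# Compute SSE, the estimate of sigma^2, SST, and the coefficient of determination r^2
# from a fitted least-squares line (definitions as on the slide above).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9])   # same made-up data as before
n = len(y)

b1_hat, b0_hat = np.polyfit(x, y, 1)       # least-squares slope and intercept
y_hat = b0_hat + b1_hat * x                # fitted values

sse = np.sum((y - y_hat) ** 2)             # error sum of squares
sigma2_hat = sse / (n - 2)                 # estimate of sigma^2 (n - 2 degrees of freedom)
sst = np.sum((y - np.mean(y)) ** 2)        # total sum of squares
r_squared = 1 - sse / sst                  # coefficient of determination

print(sse, sigma2_hat, sst, r_squared)
```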

The least-squares (regression) line for the data is given by

y = β̂₀ + β̂₁x

where

β̂₁ = Σ(xᵢ − x̄)(yᵢ − ȳ) / Σ(xᵢ − x̄)² = Sxy / Sxx

and

β̂₀ = ȳ − β̂₁x̄

The fitted (predicted) values ŷ₁, …, ŷₙ are obtained by substituting x₁, …, xₙ into the equation of the estimated regression line: ŷᵢ = β̂₀ + β̂₁xᵢ. The residuals yᵢ − ŷᵢ are the vertical deviations from the estimated line.
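A short Python sketch computing the least-squares estimates directly from these formulas (again with the made-up data used earlier), then the fitted values and residuals.

```python
# Least-squares estimates computed directly from the formulas on the slide:
# beta1_hat = Sxy / Sxx, beta0_hat = ybar - beta1_hat * xbar,
# then fitted values and residuals.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9])   # same made-up data as before

x_bar, y_bar = x.mean(), y.mean()
s_xy = np.sum((x - x_bar) * (y - y_bar))   # Sxy
s_xx = np.sum((x - x_bar) ** 2)            # Sxx

beta1_hat = s_xy / s_xx                    # slope of the least-squares line
beta0_hat = y_bar - beta1_hat * x_bar      # intercept of the least-squares line

y_hat = beta0_hat + beta1_hat * x          # fitted (predicted) values
residuals = y - y_hat                      # vertical deviations from the estimated line

print(beta1_hat, beta0_hat)
print(residuals)
```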