Simple Linear Regression

The term linear regression implies that μ_{Y|x} is linearly related to x by the population regression equation μ_{Y|x} = α + βx, where the regression coefficients α and β are parameters to be estimated from the sample data. Denoting their estimates by a and b, respectively, we can then estimate μ_{Y|x} by ŷ from the sample regression, or fitted regression line, ŷ = a + bx, where the estimates a and b represent the y-intercept and slope, respectively. The symbol ŷ is used here to distinguish between the estimated or predicted value given by the sample regression line and an actual observed experimental value y for some value of x.
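A minimal sketch of this distinction is shown below. The intercept a, slope b, and the observed point (x0, y0) are illustrative values chosen for the example, not taken from the presentation.

```python
# Evaluating a fitted regression line y_hat = a + b*x (illustrative values only).
a, b = 2.0, 0.5          # assumed least squares estimates of alpha and beta
x0, y0 = 10.0, 6.3       # an observed experimental value y0 at x = x0

y_hat = a + b * x0       # predicted value from the sample regression line
residual = y0 - y_hat    # observed y minus predicted y_hat

print(f"predicted y_hat = {y_hat:.2f}, observed y = {y0:.2f}, residual = {residual:.2f}")
```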

Estimating the Regression Coefficients. Given the sample {(x_i, y_i); i = 1, 2, …, n}, the least squares estimates a and b of the regression coefficients α and β are computed from the formulas

b = [n Σ x_i y_i − (Σ x_i)(Σ y_i)] / [n Σ x_i² − (Σ x_i)²]

and

a = (Σ y_i − b Σ x_i) / n = ȳ − b x̄,

where each sum runs over i = 1, …, n.
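The sketch below implements these formulas directly. The sample data x and y are made-up illustrative numbers, not values from the presentation.

```python
# Least squares estimates of the intercept a and slope b for y_hat = a + b*x.
def least_squares(x, y):
    """Return (a, b) computed from the summation formulas above."""
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    sum_x2 = sum(xi * xi for xi in x)

    # b = [n*sum(x*y) - sum(x)*sum(y)] / [n*sum(x^2) - (sum(x))^2]
    b = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
    # a = (sum(y) - b*sum(x)) / n, i.e. y_bar - b*x_bar
    a = (sum_y - b * sum_x) / n
    return a, b

x = [1.0, 2.0, 3.0, 4.0, 5.0]   # illustrative sample data
y = [2.1, 2.9, 3.6, 4.4, 5.2]
a, b = least_squares(x, y)
print(f"fitted line: y_hat = {a:.3f} + {b:.3f} x")
```

For these illustrative data the fitted line is roughly ŷ = 1.33 + 0.77x; each observed y then differs from its predicted ŷ by a small residual, as discussed above.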

[Figure: scatter plot of the sample data, with y plotted against x.]