Linear model: a statistical method from the regression family in which both the response variable (Y) and the explanatory variable (X) are continuous.

Presentation transcript:

Linear model

a type of regression analysis in which both the response variable (Y) and the explanatory variable (X) are continuous – used to predict the value of one variable (the dependent variable, y) based on the values of other variables (the independent variables x1, x2, …, xk)

Linear model The simplest linear model is y = β0 + β1x + ε, where y = dependent variable, x = independent variable, β0 = y-intercept, β1 = slope of the line, and ε = error variable. β0 and β1 are unknown and are therefore estimated from the data.
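As a small sketch of what estimating β0 and β1 from data looks like in R (the simulated data, the seed, and the true values β0 = 2 and β1 = 3 are illustrative assumptions, not from the slides):

```r
# Simulate data from y = beta0 + beta1*x + epsilon, with beta0 = 2 and beta1 = 3
set.seed(42)                          # illustrative seed for reproducibility
x   <- seq(1, 10, length.out = 100)   # independent variable
eps <- rnorm(100, mean = 0, sd = 1)   # error variable
y   <- 2 + 3 * x + eps                # dependent variable
fit <- lm(y ~ x)                      # estimate beta0 and beta1 from the data
coef(fit)                             # estimates should be close to 2 and 3
```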

Linear model Maximum likelihood estimates provide the ‘best’ estimates; under the standard assumptions they coincide with the method of least squares, which minimizes the sum of the squared residuals: the vertical differences between the data and the fitted model.

Linear model How to fit the simple linear regression model using lm(): lm(formula, data=…, subset=…). By default, printing the lm() result shows the estimates of the coefficients. Usually we store the result of the model in a variable, so that it can subsequently be queried for more information.
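A minimal example of this call, using R's builtin cars dataset (the dataset and variable choice are illustrative, not from the slides):

```r
# Fit stopping distance (dist) on speed, storing the result for later queries
fit <- lm(dist ~ speed, data = cars)
fit            # printing the object shows the coefficient estimates
summary(fit)   # the stored model can be queried for much more detail
```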


Linear model The coefficient table has one row for each coefficient – first column: the estimate – second column: the standard error (SE) – third column: the t-statistic for testing the null hypothesis H0: β = 0 – fourth column: the p-value for testing H0: β = 0 against the two-tailed alternative
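The four-column table described above can be extracted directly from the fitted model; a sketch using the builtin cars dataset (an illustrative choice):

```r
fit  <- lm(dist ~ speed, data = cars)   # illustrative builtin dataset
ctab <- summary(fit)$coefficients       # the coefficient table, one row per coefficient
colnames(ctab)    # "Estimate" "Std. Error" "t value" "Pr(>|t|)"
ctab["speed", ]   # estimate, SE, t-statistic and p-value for the slope
```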

Linear model Residuals – the differences between the actual values and the values predicted by the regression – if the residuals look like a normal distribution when plotted, this indicates that the mean of the differences between our predictions and the actual values is close to 0
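Checking the residuals in R is a one-liner; a sketch using the builtin cars dataset (an illustrative choice):

```r
fit <- lm(dist ~ speed, data = cars)   # illustrative builtin dataset
res <- residuals(fit)                  # actual values minus predicted values
mean(res)                              # essentially 0 for least squares with an intercept
hist(res)                              # roughly bell-shaped if residuals are near-normal
```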

Linear model Estimated coefficient: the value of the slope calculated by the regression. Standard error of the estimated coefficient – a measure of the variability in the estimate of the coefficient; lower is better

Linear model R-squared evaluates the goodness of fit of your model: higher is better, with 1 being the best. It corresponds to the proportion of the variability in what you are predicting that is explained by the model.
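R-squared is reported by summary() and can be pulled out programmatically; a sketch on the builtin cars dataset (an illustrative choice):

```r
fit <- lm(dist ~ speed, data = cars)   # illustrative builtin dataset
r2  <- summary(fit)$r.squared          # proportion of variability in dist explained
r2                                     # about 0.65 for this dataset
```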

Linear model Confidence intervals – a type of interval estimate of a population parameter – indicate the reliability of an estimate
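In R, confidence intervals for the coefficients come from confint(); a sketch on the builtin cars dataset (an illustrative choice):

```r
fit <- lm(dist ~ speed, data = cars)   # illustrative builtin dataset
ci  <- confint(fit, level = 0.95)      # 95% intervals for intercept and slope
ci                                     # one row per coefficient: lower and upper bounds
```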

Information criterion: a measure of the relative quality of a statistical model. Once the log-likelihood value is obtained, the criterion is computed according to the formula -2*log-likelihood + k*npar, where npar represents the number of parameters in the fitted model – AIC: k = 2 – BIC or SBC (Schwarz's Bayesian criterion): k = log(n)

AIC The Akaike Information Criterion (AIC) is a method for comparing models. The index takes into account a model’s statistical fit and the number of parameters needed to achieve that fit; models with smaller AIC values indicate an adequate fit with fewer parameters.
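A sketch of comparing two models by AIC/BIC in R, using the builtin cars dataset and an illustrative quadratic alternative (neither is from the slides):

```r
fit1 <- lm(dist ~ speed, data = cars)            # straight-line model
fit2 <- lm(dist ~ poly(speed, 2), data = cars)   # illustrative quadratic alternative
AIC(fit1)   # -2*log-likelihood + 2*npar (npar includes the error variance)
AIC(fit2)   # the smaller value indicates the preferred fit/complexity trade-off
BIC(fit1)   # same formula with k = log(n)
```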

ref: Robert Kabacoff. R in Action. New York: Manning Publications; 2011

Homoskedasticity vs heteroskedasticity Homoskedasticity means the variance of the error term is constant. If the error terms do not have constant variance, they are said to be heteroskedastic.
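A common visual check is a residuals-versus-fitted plot; a sketch on the builtin cars dataset (an illustrative choice):

```r
fit <- lm(dist ~ speed, data = cars)   # illustrative builtin dataset
plot(fitted(fit), resid(fit),
     xlab = "Fitted values", ylab = "Residuals")
abline(h = 0, lty = 2)   # a widening "fan" of points suggests heteroskedasticity
```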

Robustness: Cook’s distance An influential observation is an observation that has a disproportionate impact on the determination of the model parameters. Cook’s distance is based on the difference in the predicted values of yi for a given xi when the point (xi, yi) is and is not included in the calculation of the regression coefficients.
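Cook's distances are available directly from a fitted model; a sketch on the builtin cars dataset (the dataset and the 4/n cutoff are illustrative conventions, not from the slides):

```r
fit <- lm(dist ~ speed, data = cars)   # illustrative builtin dataset
d   <- cooks.distance(fit)             # one distance per observation
which(d > 4 / nrow(cars))              # a common heuristic cutoff for influence
```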