26134 Business Statistics Week 5 Tutorial

26134 Business Statistics Mahrita.Harahap@uts.edu.au Week 5 Tutorial: Multiple Linear Regression

Key concepts in this tutorial are listed below:
1. Multiple regression.
2. Interpreting parameter estimates (coefficients).
3. Hypothesis testing of the model.
4. Interpreting significance (t-statistics and F-statistics).
5. Interpreting R² and adjusted R².
6. Using the multiple linear regression model for prediction.

In statistics we usually want to analyse a population, but collecting data on the whole population is usually impractical or expensive, and complete data are often unavailable. That is why we collect samples from the population (sampling) and make inferences about the population parameters using the statistics of the sample (inference), with some stated level of accuracy (confidence level). A population is the collection of all possible individuals, objects, or measurements of interest. A sample is a subset of the population of interest.

Multiple Linear Regression
A single metric (numerical) dependent variable with two or more independent variables.
Regression equation: ŷ = b0 + b1x1 + b2x2 + … + bkxk
Interpretation of coefficients: for a one-unit increase in xi, y increases/decreases by bi units on average, holding the other variables constant.
NOTE: The interpretation of the intercept may be nonsensical, since it is often not reasonable for the explanatory variables to be zero. When x is zero, the response variable is ….. If zero is not within the sample range of x, the intercept cannot be meaningfully interpreted. Avoid using a regression equation to predict values far outside the range of the data used to build it.
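A minimal sketch of fitting such a model in Python with the statsmodels library; the data are simulated and the variable names x1, x2 and y are hypothetical, not the tutorial's own dataset.

```python
# Fit y = b0 + b1*x1 + b2*x2 on simulated data (illustration only).
import numpy as np
import statsmodels.api as sm

np.random.seed(0)
n = 100
x1 = np.random.normal(50, 10, n)                 # hypothetical predictor 1
x2 = np.random.normal(20, 5, n)                  # hypothetical predictor 2
y = 5 + 2.0 * x1 - 1.5 * x2 + np.random.normal(0, 8, n)

X = sm.add_constant(np.column_stack([x1, x2]))   # intercept column + x1, x2
results = sm.OLS(y, X).fit()

print(results.params)
# params[0] is the intercept b0; params[1] and params[2] are b1 and b2:
# the average change in y for a one-unit increase in that x,
# holding the other predictor constant.
```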

Hypothesis Testing
We use hypothesis testing to draw conclusions about population parameters based on the statistics of the sample. In statistics, a hypothesis is a statement about a population parameter.
The null hypothesis, denoted H0, is a statement or claim about a population parameter that is initially assumed to be true; it is always stated as an equality (e.g. H0: β1 = 0).
The alternative hypothesis, denoted H1, is the competing claim, i.e. what we are trying to establish (e.g. H1: β1 ≠ 0).
Test statistic: a measure of compatibility between the statement in the null hypothesis and the data obtained.
Decision criteria: the p-value is the probability of obtaining a test statistic as extreme as, or more extreme than, the observed sample value, given that H0 is true. Each test statistic has a corresponding p-value. If p-value ≤ 0.05, reject H0; if p-value > 0.05, do not reject H0 (see the sketch below).
Conclusion: state your conclusion in the context of the problem.
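The 5% decision rule can be written as a tiny helper function; this is an illustration only, and the function name decide is hypothetical.

```python
# Generic p-value decision rule at the 5% significance level (illustration only).
def decide(p_value, alpha=0.05):
    """Return the hypothesis-test decision for a given p-value."""
    return "reject H0" if p_value <= alpha else "do not reject H0"

print(decide(0.012))   # reject H0
print(decide(0.240))   # do not reject H0
```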

Assessing the significance of individual independent variables
H0: βi = 0 (no linear relationship)
H1: βi ≠ 0 (a linear relationship does exist between xi and y)
Test statistic: t = bi / SE(bi); the t-test indicates the significance of an individual independent variable.
Decision criteria: if p-value ≤ 0.05, reject H0; if p-value > 0.05, do not reject H0.
Conclusion: if p-value ≤ 0.05, reject H0: there is significant evidence that βi is not equal to zero, so the independent variable is linearly related to the dependent variable. If p-value > 0.05, do not reject H0: there is no significant evidence that βi differs from zero, so the independent variable is not linearly related to the dependent variable.
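Continuing the simulated sketch above (hypothetical data, not the tutorial's), the individual t-statistics and p-values can be read straight from the fitted statsmodels results:

```python
# Each coefficient's t-statistic is t = b_i / SE(b_i); statsmodels reports
# it together with its p-value.
import numpy as np
import statsmodels.api as sm

np.random.seed(0)
n = 100
x1 = np.random.normal(50, 10, n)
x2 = np.random.normal(20, 5, n)
y = 5 + 2.0 * x1 - 1.5 * x2 + np.random.normal(0, 8, n)
results = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()

for name, t, p in zip(["intercept", "x1", "x2"], results.tvalues, results.pvalues):
    verdict = "significant (linearly related to y)" if p <= 0.05 else "not significant"
    print(f"{name}: t = {t:.2f}, p-value = {p:.4f} -> {verdict}")
```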

Assessing the significance of the model
H0: β1 = β2 = … = βk = 0 (no linear relationship)
H1: at least one βi ≠ 0 (at least one independent variable affects y)
Test statistic: F = MSR/MSE = (SSR/k) / (SSE/(n - k - 1)), or equivalently F = (R²/k) / ((1 - R²)/(n - k - 1)). Also mention the adjusted R² when assessing the model.
Decision criteria: if p-value ≤ 0.05, reject H0; if p-value > 0.05, do not reject H0.
Conclusion: if p-value ≤ 0.05, reject H0: there is significant evidence that at least one of the βi is not equal to zero, so at least one independent variable is linearly related to y and the model has some validity and usefulness. If p-value > 0.05, do not reject H0: there is no significant evidence that any of the βi differs from zero, so the model is not valid or useful.
Guide to assessing the model (see the worked example below):
R² = 1, F = ∞: perfect model.
R² close to 1, F large: good model.
R² close to 0, F small: poor, useless model.
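As a worked example under assumed summary values (R² = 0.65, n = 100 observations, k = 2 independent variables; these figures are not from the tutorial), the overall F-statistic and its p-value can be computed as:

```python
# Overall F-test from assumed summary values: F = (R2/k) / ((1 - R2)/(n - k - 1)).
from scipy.stats import f

R2, n, k = 0.65, 100, 2                      # assumed values for illustration
F = (R2 / k) / ((1 - R2) / (n - k - 1))      # about 90.1 here
p_value = f.sf(F, k, n - k - 1)              # upper-tail area of the F distribution

print(f"F = {F:.2f}, p-value = {p_value:.3g}")
# p-value <= 0.05 -> reject H0: at least one beta_i is non-zero,
# so the model has some explanatory power.
```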

R² and Adjusted R²
R² is a numerical value between 0 and 1 that measures the proportion of the variation in the dependent variable explained by all of the independent variables. R² always increases when independent variables are added to the model, whether or not those variables contribute to the overall fit, which can make model assessment misleading. The adjusted R² recalculates R² to account for the number of independent variables in the model and the sample size; in lay terms, it tells us how useful the model is. Interpretation: …% of the variation in the dependent variable is explained by variation in the independent variables, taking into account the sample size and the number of independent variables.
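Using the same assumed values as above (R² = 0.65, n = 100, k = 2), the adjusted R² follows from the standard formula adjusted R² = 1 − (1 − R²)(n − 1)/(n − k − 1); a fitted statsmodels model also reports it directly as results.rsquared_adj.

```python
# Adjusted R2 from assumed summary values (not figures from the tutorial).
R2, n, k = 0.65, 100, 2
adj_R2 = 1 - (1 - R2) * (n - 1) / (n - k - 1)
print(f"Adjusted R2 = {adj_R2:.4f}")   # about 0.643, slightly below R2
# Adjusted R2 only increases if an added variable improves the fit enough
# to offset the penalty for the extra parameter.
```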