Maths Study Centre CB04.03.331 Open 11am – 5pm Semester Weekdays https://www.uts.edu.au/future-students/science/student-experience/maths-study-

Presentation transcript:

To open SPSS: START > ALL PROGRAMS > IBM SPSS > IBM SPSS STATISTICS 19. Marking scheme: 0 if less than 50% attempted; 1 if more than 50% attempted but less than 50% correct; 2 if more than 50% correct.

The assumptions of the model are that the random errors (residuals) are normally distributed and have constant variance.
1. Normality assumption: look at the histogram or Normal P-P plot. The normal probability plot is constructed by plotting the expected values of the residuals under the normality assumption (the line) against the actual values of the residuals. If the normality assumption is valid, the residuals should lie approximately on the straight line; any non-linear trend indicates a violation of the normality assumption.
2. Constant variance assumption: look at the residuals vs. fitted values plot (GRAPHS > LEGACY DIALOGS > SCATTER > SIMPLE SCATTER > Y AXIS: STUDENTISED RESIDUAL > X AXIS: PREDICTED VALUE). The residuals should show a random scatter with constant variance (no pattern). If the variance is not constant (a pattern or increasing spread), ordinary least squares is not the most efficient estimation method for characterising the relationship.
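Outside SPSS, the same two diagnostic checks can be sketched in Python. This is a minimal illustration with made-up data (not the tutorial's dataset): the Shapiro-Wilk test stands in for the visual Normal P-P plot, and the correlation between |residuals| and fitted values stands in for eyeballing the residuals-vs-fits scatter.

```python
import numpy as np
from scipy import stats

# Hypothetical data: y roughly linear in x with constant-variance errors
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 + 1.5 * x + rng.normal(0, 1, size=x.size)

# Fit the simple linear regression by ordinary least squares
slope, intercept, r, p, se = stats.linregress(x, y)
fitted = intercept + slope * x
residuals = y - fitted

# 1. Normality check: Shapiro-Wilk test; a large p-value means no
#    evidence against normality (the numeric analogue of the P-P plot)
sw_stat, sw_p = stats.shapiro(residuals)

# 2. Constant-variance check: correlate |residuals| with fitted values;
#    a strong positive correlation suggests the spread grows with the mean
corr, corr_p = stats.pearsonr(fitted, np.abs(residuals))
```

With well-behaved errors, both checks pass: the Shapiro-Wilk p-value is large and the |residual|-vs-fitted correlation is near zero.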

d) Use a t-test when testing individual parameters.
For observer 1:
H0: α = 0. The intercept is 0. H1: α ≠ 0. The intercept is not equal to 0. Test statistic: t = a/SE(a). Since p-value = 0.003 < 0.05, we reject H0. We have enough evidence to conclude α ≠ 0 at the 5% level of significance.
H0: β = 1. The population slope is 1. H1: β ≠ 1. The population slope is not equal to 1. Test statistic: t = (b − 1)/SE(b). If |t| > t(n−2), then the p-value ≤ 0.05: reject H0, since we have enough evidence to conclude β ≠ 1 at the 5% level of significance. If |t| ≤ t(n−2), then the p-value > 0.05: do not reject H0, since we do not have enough evidence to conclude β ≠ 1. Here |t| < t(n−2), so we do not reject H0 and conclude we do not have enough evidence that β ≠ 1 at the 5% level of significance.
For observer 2:
H0: α = 0. H1: α ≠ 0. Since p-value = 0.003 < 0.05, we reject H0. We have enough evidence to conclude α ≠ 0 at the 5% level of significance.
H0: β = 1. H1: β ≠ 1. Test statistic: t = (b − 1)/SE(b). Since |t| > t(n−2), the p-value ≤ 0.05, so we reject H0: we have enough evidence to conclude β ≠ 1 at the 5% level of significance.
So observer 2 appears more biased in their estimates, since their slope is significantly different from 1, even though the R² for observer 2 is greater than that for observer 1.
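The SPSS coefficients table reports the t-test of H0: β = 0, not H0: β = 1, so the test against 1 has to be recomputed from the slope and its standard error. A sketch in Python with hypothetical observer data (illustrative numbers only, not the tutorial's):

```python
import numpy as np
from scipy import stats

# Hypothetical data: true values x vs. one observer's estimates y
x = np.array([10, 12, 15, 18, 20, 22, 25, 28, 30, 35], dtype=float)
y = np.array([11, 14, 18, 22, 25, 28, 32, 36, 39, 46], dtype=float)

slope, intercept, r, p_vs_zero, se_slope = stats.linregress(x, y)

# Test H0: beta = 1 by shifting the hypothesised value in the t statistic
n = x.size
t_stat = (slope - 1) / se_slope
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)

reject = p_value <= 0.05  # reject H0: beta = 1 at the 5% level
```

Here the observer's estimates drift upward as the true values grow, so the slope is significantly above 1 and H0: β = 1 is rejected, which is the kind of bias the slide describes for observer 2.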

If we look at the scatterplot of the residuals vs. fitted values and see that the variation increases, we have non-constant variance (the constant variance assumption is not valid). If the variance is not constant (exhibits a pattern), then ordinary least squares is not the most efficient estimation method. We turn to WEIGHTED REGRESSION!
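Weighted least squares down-weights the noisy observations so that all points contribute on an equal footing. A minimal numpy sketch with hypothetical data whose error spread grows with x (so the weights 1/x² are the reciprocal of the assumed error variance):

```python
import numpy as np

# Hypothetical heteroscedastic data: error standard deviation grows with x
rng = np.random.default_rng(1)
x = np.linspace(1, 10, 60)
y = 3.0 + 2.0 * x + rng.normal(0, 0.5 * x)

# Ordinary least squares ignores the changing variance
X = np.column_stack([np.ones_like(x), x])
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Weighted least squares: weight each point by 1/variance; here the
# error sd is proportional to x, so the weights are 1/x**2
w = 1.0 / x**2
W = np.diag(w)
beta_wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)  # (X'WX)b = X'Wy
```

Both fits are unbiased, but the weighted fit is the more efficient one when the variance really does change like this; in SPSS the same idea is available through the weight-estimation / WLS options of the linear regression procedure.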

For observer 1:
H0: α = 0. The intercept is 0. H1: α ≠ 0. The intercept is not equal to 0. Test statistic: t = a/SE(a). Since p-value = 0.038 < 0.05, we reject H0. We have enough evidence to conclude α ≠ 0 at the 5% level of significance.
H0: β = 1. The population slope is 1. H1: β ≠ 1. The population slope is not equal to 1. Test statistic: t = (b − 1)/SE(b). If |t| > t(n−2), then the p-value ≤ 0.05: reject H0, since we have enough evidence to conclude β ≠ 1 at the 5% level of significance. If |t| ≤ t(n−2), then the p-value > 0.05: do not reject H0, since we do not have enough evidence to conclude β ≠ 1. Here |t| < t(n−2), so we do not reject H0 and conclude we do not have enough evidence that β ≠ 1 at the 5% level of significance.

e) Forward Selection Models: Backward Selection Models:
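Forward selection starts from an empty model and adds, at each step, the predictor that improves the fit the most, stopping when no remaining predictor helps enough (backward selection runs the same idea in reverse, dropping the weakest predictor). A simplified sketch with hypothetical data: SPSS's forward method uses F-to-enter p-values, whereas this stand-in uses the gain in R² with an arbitrary cut-off.

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an OLS fit of y on X (intercept column included in X)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / (((y - y.mean()) ** 2).sum())

def forward_select(predictors, y, improvement=0.05):
    """Greedy forward selection: add the predictor that raises R^2 the
    most; stop when the best gain drops below `improvement`."""
    n = y.size
    chosen, current = [], 0.0
    remaining = list(predictors)
    while remaining:
        scores = []
        for name in remaining:
            cols = [predictors[k] for k in chosen + [name]]
            X = np.column_stack([np.ones(n)] + cols)
            scores.append((r_squared(X, y), name))
        best_r2, best_name = max(scores)
        if best_r2 - current < improvement:
            break
        chosen.append(best_name)
        remaining.remove(best_name)
        current = best_r2
    return chosen

# Hypothetical data: y depends on x1 and x2 but not on the noise column
rng = np.random.default_rng(2)
n = 200
predictors = {
    "x1": rng.normal(size=n),
    "x2": rng.normal(size=n),
    "noise": rng.normal(size=n),
}
y = 1.0 + 3.0 * predictors["x1"] + 2.0 * predictors["x2"] + rng.normal(size=n)
selected = forward_select(predictors, y)
```

On this data the procedure picks x1 first (the strongest predictor), then x2, and stops before admitting the irrelevant noise column.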