Maths Study Centre CB01.16.11 Open 11am – 5pm Semester Weekdays https://www.uts.edu.au/future-students/science/student-experience/maths-study-


START > ALL PROGRAMS > IBM SPSS > IBM SPSS STATISTICS 19. Marking scheme: 0 if less than 50% attempted; 1 if more than 50% attempted but less than 50% correct; 2 if more than 50% correct.

Feedback for Lab 1

The key point is that a prediction interval describes the distribution of individual values, not just the uncertainty in estimating the population mean. A prediction interval must account for both the uncertainty in estimating the population mean and the scatter of the data around it, so a prediction interval is always wider than the corresponding confidence interval.
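This can be seen directly from the standard-error formulas for simple linear regression. A minimal numpy sketch, using made-up data (the age/blood-pressure variables are purely illustrative): the standard error for a single new observation carries an extra "+1" under the square root, so the prediction interval is always the wider one.

```python
import numpy as np

# Hypothetical data: age (x) vs systolic blood pressure (y)
rng = np.random.default_rng(0)
x = rng.uniform(20, 70, 40)
y = 100 + 0.8 * x + rng.normal(0, 8, 40)

n = len(x)
xbar = x.mean()
Sxx = ((x - xbar) ** 2).sum()

# Least squares estimates
b = ((x - xbar) * (y - y.mean())).sum() / Sxx
a = y.mean() - b * xbar
resid = y - (a + b * x)
s = np.sqrt((resid ** 2).sum() / (n - 2))  # residual standard error

x0 = 45.0
# Standard error of the estimated MEAN response at x0 (confidence interval)
se_mean = s * np.sqrt(1 / n + (x0 - xbar) ** 2 / Sxx)
# Standard error of a single NEW observation at x0 (prediction interval)
# -- note the extra 1 under the square root for the data scatter
se_pred = s * np.sqrt(1 + 1 / n + (x0 - xbar) ** 2 / Sxx)
```

Multiplying each standard error by the same t critical value gives the half-widths of the two intervals, so `se_pred > se_mean` is exactly the "prediction interval is always wider" statement above.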

Model Assumptions: Residual Plots. One assumption of the model is that the random errors (residuals) are normally distributed. To assess the normality assumption we can examine a normal probability plot, which plots the expected values of the residuals under normality (the straight line) against the actual residuals. If the normality assumption holds, the points lie approximately on the straight line; any systematic non-linear pattern indicates a violation of the assumption. This plot can also reveal outliers.
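The coordinates of a normal probability plot can be computed by hand. A minimal sketch with simulated residuals (the data are made up for illustration): sort the residuals and pair them with the quantiles a normal distribution would predict; near-perfect linearity between the two indicates the normality assumption is plausible.

```python
import numpy as np
from statistics import NormalDist

# Hypothetical residuals from a fitted regression
rng = np.random.default_rng(1)
resid = rng.normal(0, 5, 30)

# Normal probability plot coordinates:
# expected normal quantiles vs the sorted residuals
n = len(resid)
sorted_resid = np.sort(resid)
expected = np.array([NormalDist().inv_cdf((i - 0.5) / n)
                     for i in range(1, n + 1)])

# If normality holds, the points (expected, sorted_resid) lie close to a
# straight line; the correlation between the two measures that linearity.
r = np.corrcoef(expected, sorted_resid)[0, 1]
```

SPSS produces this plot for you (a P-P or Q-Q plot of the residuals); the sketch only shows what is being plotted.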

Model Assumptions: Residual Plots. Another assumption of the model is that the random errors (residuals) have constant variance (homoscedasticity). If the variance is not constant (heteroscedasticity), ordinary least squares is no longer the most efficient estimation method. When the variance is some function of the mean, a variance-stabilising transformation of the response can be applied.
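As a hedged illustration of a variance-stabilising transformation (the data and the log transform are chosen purely for the example): when the noise is multiplicative, the residual spread grows with the fitted values, and taking logs of the response makes the spread roughly constant again.

```python
import numpy as np

# Hypothetical data with multiplicative noise => non-constant variance
rng = np.random.default_rng(2)
x = np.sort(rng.uniform(1, 10, 200))
y = 2 * x * np.exp(rng.normal(0, 0.3, 200))

def half_spread_ratio(x, y):
    """Fit a line, then compare residual spread for large x vs small x."""
    b, a = np.polyfit(x, y, 1)           # slope, intercept
    resid = y - (a + b * x)
    lo, hi = resid[: len(x) // 2], resid[len(x) // 2:]
    return hi.std() / lo.std()

raw_ratio = half_spread_ratio(x, y)           # well above 1: heteroscedastic
log_ratio = half_spread_ratio(x, np.log(y))   # near 1: variance stabilised
```

In practice you would diagnose this from a residuals-versus-fitted plot (a "fanning out" pattern) rather than a ratio, but the ratio makes the before/after effect of the transformation concrete.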

On SPSS: Graphs > Legacy Dialogs > Boxplot > Summaries of Separate Variables.

Coefficient of determination R². R-squared gives the proportion of the total variability in the response variable (Y) that is "explained" by the least squares regression line based on the predictor variable (X). It is usually stated as a percentage. Interpretation: on average, R² (expressed as a percentage) of the variation in the dependent variable can be explained by the independent variable through the regression model.
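R² can be computed from the two sums of squares it compares. A minimal numpy sketch with made-up data (the class-size/SAT variables are illustrative only): one minus the unexplained variation over the total variation.

```python
import numpy as np

# Hypothetical data: class size (x) vs SAT score (y)
rng = np.random.default_rng(3)
x = rng.uniform(10, 40, 50)
y = 600 - 2 * x + rng.normal(0, 15, 50)

b, a = np.polyfit(x, y, 1)            # fitted slope and intercept
yhat = a + b * x
ss_res = ((y - yhat) ** 2).sum()      # unexplained (residual) variation
ss_tot = ((y - y.mean()) ** 2).sum()  # total variation in y

# Proportion of variation in y explained by the regression on x
r_squared = 1 - ss_res / ss_tot
```

Multiplying `r_squared` by 100 gives the percentage quoted in the interpretation above.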

Regression Significance: t-test. H0: β = 0. There is no association between the response variable and the independent variable (regression is insignificant): E[y|x] = α + 0·x. H1: β ≠ 0. The independent variable affects the response variable (regression is significant): E[y|x] = α + βx. Test statistic: t = β̂ / SE(β̂), the estimated slope divided by its standard error, with n − 2 degrees of freedom. If p-value ≤ α, reject H0: age is significantly related to systolic blood pressure; the regression is significant. If p-value > α, do not reject H0: age is not significantly related to systolic blood pressure; the regression is not significant.
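The slope t statistic that SPSS reports in its Coefficients table can be reproduced by hand. A minimal sketch, reusing the hypothetical age/blood-pressure setup (all numbers are made up):

```python
import numpy as np

# Hypothetical data: age (x) vs systolic blood pressure (y)
rng = np.random.default_rng(4)
x = rng.uniform(20, 70, 40)
y = 100 + 0.8 * x + rng.normal(0, 8, 40)

n = len(x)
xbar = x.mean()
Sxx = ((x - xbar) ** 2).sum()
b = ((x - xbar) * (y - y.mean())).sum() / Sxx  # slope estimate
a = y.mean() - b * xbar
resid = y - (a + b * x)
s = np.sqrt((resid ** 2).sum() / (n - 2))      # residual standard error

se_b = s / np.sqrt(Sxx)   # standard error of the slope
t_stat = b / se_b         # tests H0: beta = 0, with n - 2 df
```

A |t| well beyond roughly 2 (the approximate 5% two-sided critical value for moderate degrees of freedom) leads to rejecting H0, matching the p-value rule above.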

Regression Significance: F-test / goodness-of-fit test. H0: β = 0. There is no association between the response variable and the independent variable (regression is insignificant): E[y|x] = α + 0·x. H1: β ≠ 0. The independent variable(s) affect the response variable (regression is significant): E[y|x] = α + βx. Test statistic: F = MSR / MSE (the regression mean square over the error mean square), with (1, n − 2) degrees of freedom in simple linear regression, where F = t². If p-value ≤ α, reject H0: age is significantly related to systolic blood pressure; the regression is significant. If p-value > α, do not reject H0: age is not significantly related to systolic blood pressure; the regression is not significant.
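The F statistic from the ANOVA table can likewise be computed from the sums of squares. A minimal sketch on the same hypothetical data as the t-test example (illustrative numbers only):

```python
import numpy as np

# Hypothetical data: age (x) vs systolic blood pressure (y)
rng = np.random.default_rng(4)
x = rng.uniform(20, 70, 40)
y = 100 + 0.8 * x + rng.normal(0, 8, 40)

n = len(x)
b, a = np.polyfit(x, y, 1)
yhat = a + b * x
ssr = ((yhat - y.mean()) ** 2).sum()  # regression (explained) sum of squares
sse = ((y - yhat) ** 2).sum()         # error (residual) sum of squares

# F = MSR / MSE with (1, n - 2) degrees of freedom
F = (ssr / 1) / (sse / (n - 2))
```

Because this uses the same simulated data and model as the t-test sketch, F here is simply the square of that slope t statistic, illustrating why the two tests agree in simple linear regression.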