Quantitative Methods Checking the models II: the other three assumptions.

Presentation transcript:

Quantitative Methods Checking the models II: the other three assumptions

Assumptions of GLM

Model formula: BACAFTER = BACBEF + TREATMNT

Model: BACAFTER = μ + β·BACBEF + αᵢ + ε, where the treatment effect αᵢ depends on the level of TREATMNT:

TREATMNT   Coef
1          α₁
2          α₂
3          −α₁ − α₂

Fitted value equation (best-fit equation): PREDICTED BACAFTER = fitted constant + fitted slope × BACBEF + the fitted coefficient for the observation's level of TREATMNT
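A minimal sketch of fitting this model in Python with statsmodels, assuming the data sit in a CSV file with the columns named on the slide (the file name bacteria.csv and the use of statsmodels are illustrative assumptions, not part of the course material):

    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("bacteria.csv")   # hypothetical file with columns BACAFTER, BACBEF, TREATMNT

    # One continuous covariate (BACBEF) plus one categorical factor (TREATMNT).
    # Sum-to-zero coding matches the constraint on the slide: the last level's
    # coefficient is minus the sum of the others.
    model = smf.ols("BACAFTER ~ BACBEF + C(TREATMNT, Sum)", data=df).fit()
    print(model.params)   # fitted constant, slope for BACBEF, treatment coefficients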

Assumptions of GLM

Model: BACAFTER = μ + β·BACBEF + αᵢ + ε

Independence
Homogeneity of variance
Normality of error
Linearity/additivity

Are the assumptions likely to be true?

Assumptions of GLM: independence, homogeneity of variance, normality of error, linearity/additivity

Model Criticism
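A minimal sketch of the usual model-criticism plots for the three remaining assumptions, reusing the fitted model from the sketch above (matplotlib and scipy are assumptions; the plots are the standard residuals-versus-fitted and normal probability plots):

    import matplotlib.pyplot as plt
    from scipy import stats

    fitted = model.fittedvalues
    resid = model.resid

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4))

    # Residuals vs fitted values: a funnel shape suggests heterogeneity of
    # variance, a curve suggests non-linearity.
    ax1.scatter(fitted, resid)
    ax1.axhline(0.0, linestyle="--")
    ax1.set_xlabel("Fitted values")
    ax1.set_ylabel("Residuals")

    # Normal probability plot of the residuals: a straight line supports
    # normality of error.
    stats.probplot(resid, dist="norm", plot=ax2)

    plt.tight_layout()
    plt.show()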

Transformations and Homogeneity

Transformations in increasing order of strength: none (or linear), square root, log, negative inverse
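A minimal sketch of this ladder of transformations, using invented positive numbers purely to show how the stronger transformations pull in the large values, which is what evens out a variance that rises with the mean:

    import numpy as np

    y = np.array([1.0, 2.0, 4.0, 9.0, 25.0, 100.0])   # invented, right-skewed, all positive

    ladder = {
        "none (or linear)":  y,
        "square root":       np.sqrt(y),
        "log":               np.log(y),
        "negative inverse":  -1.0 / y,
    }

    for name, ty in ladder.items():
        # Stronger transformations compress the upper end of the scale more.
        print(f"{name:17s} range = {ty.max() - ty.min():.3f}")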

Non-linearity

Example

MTB > let LOGDEN=log(DENSITY)
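The same step sketched in Python, assuming the example data are already in a data frame named dens with a DENSITY column (only the column names come from the command above; the base of the logarithm does not matter for homogenising variance, since changing base only rescales):

    import numpy as np

    dens["LOGDEN"] = np.log(dens["DENSITY"])   # equivalent of MTB > let LOGDEN=log(DENSITY)
    # Refit the model with LOGDEN in place of DENSITY as the response,
    # then run the model-criticism plots again on the new residuals.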

Hints

Morphometric data: log
Count data: square root
Proportional data: angular (arcsine square root)
Survival data: negative inverse

Don't be too picky.
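A minimal sketch of these rules of thumb as transformation functions (the angular transformation for proportions is arcsin of the square root; the data values are invented):

    import numpy as np

    suggested = {
        "morphometric": np.log,                           # log
        "count":        np.sqrt,                          # square root
        "proportion":   lambda p: np.arcsin(np.sqrt(p)),  # angular
        "survival":     lambda t: -1.0 / t,               # negative inverse
    }

    p = np.array([0.05, 0.50, 0.95])      # invented proportions
    print(suggested["proportion"](p))     # angular-transformed values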

Selecting a transformation

Based on homogenising the error variance.
Continuous y-variable: varying strengths, increasing in the order none, square root, log, negative inverse.
Counts: square root.
Proportions: arcsine square root ("root arcsin").
With covariates, consider transforming X too.
Then go through the model criticism process again (and if necessary again and again).
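A minimal sketch of putting this into a loop, reusing the bacteria model from the earlier sketches as a stand-in: each candidate transformation of the response is fitted in turn and scored by how strongly the absolute residuals still track the fitted values, a rough heterogeneity measure chosen here purely for illustration, not a prescribed rule. Whichever transformation is chosen, the model criticism plots are then run again.

    import numpy as np

    candidates = {
        "none":              lambda y: y,
        "square root":       np.sqrt,
        "log":               np.log,
        "negative inverse":  lambda y: -1.0 / y,
    }

    for name, f in candidates.items():
        d = df.assign(TRANSY=f(df["BACAFTER"]))   # assumes BACAFTER is strictly positive
        fit = smf.ols("TRANSY ~ BACBEF + C(TREATMNT, Sum)", data=d).fit()
        score = np.corrcoef(np.abs(fit.resid), fit.fittedvalues)[0, 1]
        print(f"{name:17s} corr(|residuals|, fitted) = {score:+.2f}")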

Last words…

You should always check the assumptions as far as you can, using the techniques of model criticism. Transformations can help to 'cure' failures to meet the assumptions, and homogeneity of variance is the priority when choosing one. Always repeat the model criticism after transforming.

Next lecture: Model selection I: principles of model choice and designed experiments. Read Chapter 10.