LECTURE 15 Hypotheses about Contrasts EPSY 640 Texas A&M University


Hypotheses about Contrasts

A contrast is C = c1*mu1 + c2*mu2 + c3*mu3 + ... + ck*muk, with sum(ci) = 0. The hypotheses are:

H0: C = 0
H1: C ≠ 0
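The definition above can be sketched in a few lines of Python (a minimal illustration; the group means are hypothetical):

```python
import numpy as np

def contrast_value(means, coefs):
    """Value of the contrast C = sum(c_i * mu_i); coefficients must sum to 0."""
    coefs = np.asarray(coefs, dtype=float)
    if not np.isclose(coefs.sum(), 0.0):
        raise ValueError("contrast coefficients must sum to zero")
    return float(np.dot(coefs, means))

# Hypothetical means for three conditions
means = [24.0, 30.0, 26.0]

# C = (0)mu1 + (1)mu2 + (-1)mu3 compares groups 2 and 3, ignoring group 1
print(contrast_value(means, [0, 1, -1]))   # 4.0
```

Under H0 this population value is zero; the sample estimate replaces each mu with the corresponding group mean.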

Hypotheses about Contrasts

C1 = (0)mu_instruction + (1)mu_advance organizer + (-1)mu_neutral topic

Thus, for this contrast we ignore the straight instruction condition, as evidenced by its weight of 0, and subtract the mean of the neutral topic condition from the mean of the advance organizer condition. A second contrast might be 2, -1, -1:

C2 = (2)mu_instruction + (-1)mu_advance organizer + (-1)mu_neutral topic

We can interpret this contrast better by examining its null hypothesis:

C2 = 0 = (2)mu_instruction + (-1)mu_advance organizer + (-1)mu_neutral topic,

so that (2)mu_instruction = (1)mu_advance organizer + (1)mu_neutral topic, and

mu_instruction - [mu_advance organizer + mu_neutral topic] / 2 = 0.

That is, C2 compares the instruction mean with the average of the other two condition means.

Contrasts

A contrast is called a simple contrast if only two groups have nonzero coefficients, and a complex contrast if it involves three or more groups.

Planned Orthogonal Contrasts

Orthogonal contrasts have the property that they are mathematically independent of each other: there is no information in one that tells us anything about the other. This is created mathematically by requiring that, for each pair of contrasts in the set, sum(ci1 * ci2) = 0, where ci1 is the coefficient for group i in contrast 1 and ci2 is the coefficient for the same group in contrast 2. For example, with C1 and C2 above:

C1:     0    1   -1
C2:     2   -1   -1
C1*C2:  0x2 + 1x(-1) + (-1)x(-1) = 0 - 1 + 1 = 0
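The orthogonality condition is simply a zero dot product between the coefficient vectors, as this sketch checks for the C1 and C2 above:

```python
import numpy as np
from itertools import combinations

def are_orthogonal(c1, c2):
    """Two contrasts are orthogonal when sum_i c_i1 * c_i2 = 0."""
    return bool(np.isclose(np.dot(c1, c2), 0.0))

c1 = [0, 1, -1]
c2 = [2, -1, -1]
print(are_orthogonal(c1, c2))   # True: 0*2 + 1*(-1) + (-1)*(-1) = 0
```

For a set of contrasts, the same check is applied to every pair.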

Planned Orthogonal Contrasts

[Venn diagram: SSy partitioned into the treatment SS, itself split into non-overlapping SSc1 and SSc2, plus SSerror; R²c1 = SSc1/SSy, R²c2 = SSc2/SSy, R²y = (SSc1 + SSc2)/SSy]
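The partition shown in the Venn diagram, where orthogonal contrasts split the treatment sum of squares exactly (SSc1 + SSc2 = treatment SS), can be verified numerically. This sketch uses simulated data with hypothetical means and equal group sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20                                    # per-group sample size (equal n assumed)
groups = [rng.normal(mu, 2.0, n) for mu in (24.0, 30.0, 26.0)]

grand = np.concatenate(groups).mean()
means = np.array([g.mean() for g in groups])
ss_treat = n * np.sum((means - grand) ** 2)   # between-groups (treatment) SS

def ss_contrast(means, coefs, n):
    """SS for one contrast with equal n: n * (sum c_i * ybar_i)^2 / sum c_i^2."""
    coefs = np.asarray(coefs, dtype=float)
    return n * np.dot(coefs, means) ** 2 / np.sum(coefs ** 2)

ss1 = ss_contrast(means, [0, 1, -1], n)
ss2 = ss_contrast(means, [2, -1, -1], n)
print(np.isclose(ss1 + ss2, ss_treat))    # True: SSc1 + SSc2 = treatment SS
```

With k groups, any complete set of k - 1 orthogonal contrasts reproduces the treatment SS this way; nonorthogonal contrasts do not.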

Geometry of POCs

[Figure: the contrast vectors C1 = (0, 1, -1) and C2 = (2, -1, -1) drawn as orthogonal directions in the space of the group means GP1, GP2, GP3]

PATH DIAGRAM FOR PLANNED ORTHOGONAL CONTRASTS

[Path diagram: C1 and C2 each predict y, with error e; beta1 (r_c1,y) = .085 and beta2 (r_c2,y) = .048]

Nonorthogonal Contrasts

[Venn diagram: SSy partitioned into the treatment SS and SSerror, but SSc1 and SSc2 now overlap]

PATH DIAGRAM FOR PLANNED NONORTHOGONAL CONTRASTS

[Path diagram: C1 and C2, correlated at r = .78, each predict y with error e; beta1 (r_c1,y) = .128]

Control Treatment Treatment+Drug Treatment+Placebo

C  T  TD  TP

The purpose of the placebo is to mimic the results of the drug. An even more complex design might include a control plus the placebo. The set of orthogonal contrasts follows from the hypotheses of interest:

      C    T   TD   TP
C1:   3   -1   -1   -1

This contrast assesses whether the treatments are generally more effective than the control condition.

Control Treatment Treatment+Drug Treatment+Placebo

      C    T   TD   TP
C2:   0    2   -1   -1

This contrast compares the treatment with the additions to treatment.

C3:   0    0    1   -1

And this contrast compares the effect of the drug with the placebo. There are other sets of contrasts a researcher might substitute or add. Here, we check that the contrasts are orthogonal:

C1*C2: 3x0 + (-1)x2 + (-1)x(-1) + (-1)x(-1) = -2 + 1 + 1 = 0, so C1 and C2 are orthogonal.

Control Treatment Treatment+Drug Treatment+Placebo

C1*C3: 3x0 + (-1)x0 + (-1)x1 + (-1)x(-1) = -1 + 1 = 0, so C1 and C3 are orthogonal.
C3*C2: 0x0 + 0x2 + 1x(-1) + (-1)x(-1) = -1 + 1 = 0, so C3 and C2 are orthogonal.

A second set of contrasts might be developed as follows:

      C    T   TD   TP
C1:   2   -1   -1    0

This contrasts the control with the primary drug conditions of interest. Next,

C2:   0    1   -1    0

This contrast compares the treatment with treatment plus drug, the major interest of the study. Finally,

C3:   0    0    1   -1

and this contrast compares the effect of the drug with the placebo.

C1*C2: 2x0 + (-1)x1 + (-1)x(-1) + 0x0 = -1 + 1 = 0, so C1 and C2 are orthogonal.
C1*C3: 2x0 + (-1)x0 + (-1)x1 + 0x(-1) = -1, so C1 and C3 are not orthogonal.
C3*C2: 0x0 + 0x1 + 1x(-1) + (-1)x0 = -1, so C3 and C2 are not orthogonal.
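The pairwise checks above reduce to dot products over the whole set. This sketch verifies both sets; the coefficient vectors are standard choices consistent with the verbal hypotheses described for each contrast, an assumption since the original slide's numbers are not fully legible:

```python
import numpy as np
from itertools import combinations

def mutually_orthogonal(contrasts):
    """True only if every pair of contrasts in the set has a zero dot product."""
    return all(np.isclose(np.dot(a, b), 0.0)
               for a, b in combinations(contrasts, 2))

# Group order: C, T, TD, TP
set1 = [(3, -1, -1, -1),   # treatments vs. control
        (0, 2, -1, -1),    # treatment vs. additions to treatment
        (0, 0, 1, -1)]     # drug vs. placebo

set2 = [(2, -1, -1, 0),    # control vs. primary drug conditions
        (0, 1, -1, 0),     # treatment vs. treatment + drug
        (0, 0, 1, -1)]     # drug vs. placebo

print(mutually_orthogonal(set1))   # True
print(mutually_orthogonal(set2))   # False: C1.C3 = -1 and C3.C2 = -1
```

Only the first set partitions the treatment sum of squares cleanly; the second trades orthogonality for hypotheses the researcher cares about more.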

Polynomial Trend Contrasts

When the groups represent interval data, we can conduct polynomial trend contrasts. Example: Group A receives no treatment, Group B receives 10 hours, and Group C receives 20 hours of instructional treatment. The treatment condition (time) is now interval.

Polynomial Trend Contrasts

The contrast coefficients for polynomial trends fit curves: linear, quadratic, cubic, etc. The coefficients are most easily obtained from tables in statistics texts; SPSS also offers a polynomial trend option in the Analyze > Compare Means > One-Way ANOVA procedure.
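Outside SPSS, any planned contrast, including a trend contrast, can be tested directly with the usual t statistic for a contrast, t = sum(ci * ybar_i) / sqrt(MSE * sum(ci² / ni)). A minimal sketch on simulated data for the instructional-hours example (the means and sample sizes are hypothetical):

```python
import numpy as np
from scipy import stats

def contrast_t_test(groups, coefs):
    """t test for a planned contrast across independent groups."""
    coefs = np.asarray(coefs, dtype=float)
    means = np.array([np.mean(g) for g in groups])
    ns = np.array([len(g) for g in groups])
    df = int(np.sum(ns - 1))                  # pooled error degrees of freedom
    mse = np.sum([(len(g) - 1) * np.var(g, ddof=1) for g in groups]) / df
    c_hat = np.dot(coefs, means)              # sample value of the contrast
    se = np.sqrt(mse * np.sum(coefs ** 2 / ns))
    t = c_hat / se
    p = 2 * stats.t.sf(abs(t), df)            # two-tailed p-value
    return t, p

rng = np.random.default_rng(1)
# Hours of instruction 0, 10, 20 -> linear trend coefficients (-1, 0, 1)
groups = [rng.normal(mu, 3.0, 15) for mu in (20.0, 24.0, 28.0)]
t, p = contrast_t_test(groups, [-1, 0, 1])
print(t > 0, p < 0.05)
```

Squaring t gives the 1-df F that SPSS reports for each trend component.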

Polynomial Trend Contrasts: example of drug dosages

ml dose (four equally spaced dosages):

C1 (linear):     -3   -1    1    3
C2 (quadratic):   1   -1   -1    1
C3 (cubic):      -1    3   -3    1
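The tabled coefficients for equally spaced levels can be derived by Gram-Schmidt orthogonalization of the powers of the level index. This sketch recovers the standard linear, quadratic, and cubic vectors for four levels:

```python
import numpy as np

def poly_trend_contrasts(k):
    """Orthogonal polynomial contrast vectors for k equally spaced levels,
    scaled so the smallest nonzero entry has magnitude 1 (the tabled form)."""
    x = np.arange(k, dtype=float)
    basis = [np.ones(k)]
    out = []
    for degree in range(1, k):
        v = x ** degree
        for b in basis:                      # remove all lower-order components
            v = v - (v @ b) / (b @ b) * b
        basis.append(v)
        scale = np.min(np.abs(v[~np.isclose(v, 0.0)]))
        out.append(np.round(v / scale, 6))
    return out

linear, quadratic, cubic = poly_trend_contrasts(4)
print(linear)     # -3, -1, 1, 3
print(quadratic)  #  1, -1, -1, 1
print(cubic)      # -1, 3, -3, 1
```

Each vector sums to zero and is orthogonal to the others, so the three trends partition the treatment SS just like any other complete set of orthogonal contrasts.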

C1C1 C2C C3C3 Fig. Graphs of planned orthogonal contrasts for four interval treatments

SPSS EXAMPLE

The groups represent the five quintiles of school enrollment size, beginning with 1-281.

SPSS EXAMPLE

The unweighted combination is used because each group has the same number of schools.

SPSS EXAMPLE

We might have expected a quadratic trend from this curve, but there is too much variation within the groups.