Statistical Analysis of the Nonequivalent Groups Design

Analysis Requirements
- Pre-post
- Two-group
- Treatment-control (dummy-coded)

Design notation:

N O X O
N O   O

Analysis of Covariance

y_i = β₀ + β₁X_i + β₂Z_i + e_i

where:
- y_i = outcome score for the i-th unit
- β₀ = coefficient for the intercept
- β₁ = pretest coefficient
- β₂ = mean difference for treatment
- X_i = covariate (the pretest)
- Z_i = dummy variable for treatment (0 = control, 1 = treatment)
- e_i = residual for the i-th unit
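To make the model concrete, here is a minimal sketch of fitting it by ordinary least squares in Python. The data are simulated and all names (n, x_ctrl, x_prog, etc.) are our own, not from the slides; the setup mirrors the example that follows (a 5-point pretest advantage and a true treatment effect of 10):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 250  # units per group

# Simulated NEGD data: the program group starts 5 points higher on the pretest.
x_ctrl = rng.normal(50, 10, n)
x_prog = rng.normal(55, 10, n)
X = np.concatenate([x_ctrl, x_prog])                 # pretest covariate
Z = np.concatenate([np.zeros(n), np.ones(n)])        # treatment dummy
y = 10 + 1.0 * X + 10 * Z + rng.normal(0, 5, 2 * n)  # posttest; true effect = 10

# Fit y_i = b0 + b1*X_i + b2*Z_i + e_i by ordinary least squares.
D = np.column_stack([np.ones(2 * n), X, Z])          # design matrix
b0, b1, b2 = np.linalg.lstsq(D, y, rcond=None)[0]
print(f"intercept={b0:.2f}, pretest={b1:.2f}, treatment={b2:.2f}")
```

With an error-free pretest, as here, the treatment coefficient lands near the true value of 10; the rest of the slides show what happens when the pretest is measured with error.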

The Bivariate Distribution
- The program group has a 5-point pretest advantage.
- The program group scores 15 points higher on the posttest.

Regression Results
- The result is biased!
- The true effect is β₂ = 10.
- CI.95: β̂₂ ± 2SE(β̂₂) = 11.28 ± 2(.5682) = 11.28 ± 1.14
- CI = 10.14 to 12.42; the interval does not include the true effect of 10.

Fitted model: y_i = β̂₀ + β̂₁X_i + 11.28Z_i
[Regression table with Coef, StErr, t, and p for the Constant, pretest, and Group predictors.]

The Bivariate Distribution
The regression line slopes are biased. Why?

Regression and Error
[Three scatterplots of Y against X:]
- No measurement error
- Measurement error on the posttest only
- Measurement error on the pretest only

How Regression Fits Lines
Method of least squares:
- Minimize the sum of the squares of the residuals from the regression line.
- Least squares minimizes on y, not x.
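A short sketch of the criterion on our own toy data: the closed-form least squares line minimizes squared *vertical* residuals, which is why regressing y on x and regressing x on y give different lines.

```python
import numpy as np

def least_squares_fit(x, y):
    """Minimize sum((y - (b0 + b1*x))**2): residuals are vertical (on y)."""
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()
    return b0, b1

rng = np.random.default_rng(1)
x = rng.normal(0, 1, 1000)
y = 2 * x + rng.normal(0, 1, 1000)

_, slope_yx = least_squares_fit(x, y)  # y on x: about 2
_, slope_xy = least_squares_fit(y, x)  # x on y: about 0.4, not 1/2
print(slope_yx, 1 / slope_xy)          # the two implied lines differ
```

The asymmetry matters here: noise in y just spreads points vertically around the same line, while noise in x changes which line minimizes the criterion.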

How Error Affects Slope
[Scatterplots of Y against X for each case:]
- No measurement error: no effect on the slope.
- Measurement error on the posttest only: adds variability around the regression line, but doesn't affect the slope.
- Measurement error on the pretest only: affects the slope, flattening the regression lines.

Notice that the true result in all three cases should be a null (no effect) one. But with measurement error on the pretest, we get a pseudo-effect.
[Figures: the null case vs. the pseudo-effect.]
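The pseudo-effect is easy to reproduce in simulation. This is a sketch under our own assumptions (group pretest means 50 vs. 55, a true effect of exactly zero, and pretest reliability 100/(100+25) = .8), not the slides' dataset:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000  # units per group

# True (error-free) pretest scores; the program group starts 5 points higher.
true_pre = np.concatenate([rng.normal(50, 10, n), rng.normal(55, 10, n)])
Z = np.concatenate([np.zeros(n), np.ones(n)])

# Null case: the posttest simply tracks the true pretest (no treatment effect).
post = true_pre + rng.normal(0, 3, 2 * n)

# The observed pretest adds measurement error, so its reliability is
# var(true) / var(observed) = 100 / (100 + 25) = .8.
obs_pre = true_pre + rng.normal(0, 5, 2 * n)

# Traditional ANCOVA on the observed pretest: the slope is attenuated toward
# .8, and the group dummy absorbs a pseudo-effect of roughly (1 - .8) * 5 = 1.
D = np.column_stack([np.ones(2 * n), obs_pre, Z])
_, slope, pseudo = np.linalg.lstsq(D, post, rcond=None)[0]
print(f"slope={slope:.2f} (true 1.0), pseudo-effect={pseudo:.2f} (true 0)")
```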

Where Does This Leave Us?
- Traditional ANCOVA looks like it should work on the NEGD, but it's biased.
- The bias results from the effect of pretest measurement error under the least squares criterion.
- Slopes are flattened or "attenuated".

What's the Answer?
- If it's a pretest problem, let's fix the pretest.
- If we could remove the error from the pretest, it would fix the problem.
- Can we adjust pretest scores for error?
- What do we know about error?

What's the Answer?
- We know that if we had no error, reliability = 1; if it were all error, reliability = 0.
- Reliability estimates the proportion of true score.
- Unreliability = 1 - Reliability.
- This is the proportion of error!
- Use this to adjust the pretest.

What Would a Pretest Adjustment Look Like?
[Figure: the original pretest distribution and the adjusted pretest distribution, which is narrower.]

How Would It Affect Regression?
[Figure: the regression line and the pretest distribution, plotted as Y against X.]

How Far Do We Squeeze the Pretest?
- Squeeze inward an amount proportionate to the error.
- If reliability = .8, we want to squeeze in about 20% (i.e., 1 - .8).
- Or: we want the pretest to retain 80% of its original width.

Adjusting the Pretest for Unreliability

X_adj = X̄ + r(X - X̄)

where:
- X_adj = adjusted pretest value
- X = original pretest value
- X̄ = pretest mean
- r = reliability
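A minimal sketch of the adjustment in Python (the function name is ours). Since the means table later in the slides shows each group's adjusted mean equal to its unadjusted mean, we take X̄ to be the mean of the group the score comes from, so the shrinkage is applied within group:

```python
import numpy as np

def adjust_pretest(x, reliability):
    """X_adj = X̄ + r(X - X̄): shrink each score toward the mean by r."""
    return x.mean() + reliability * (x - x.mean())

# With reliability .8, scores keep 80% of their distance from the mean:
x = np.array([40.0, 50.0, 60.0])
print(adjust_pretest(x, 0.8))  # [42. 50. 58.] -- mean unchanged, spread squeezed
```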

Reliability-Corrected Analysis of Covariance

y_i = β₀ + β₁X_adj + β₂Z_i + e_i

where:
- y_i = outcome score for the i-th unit
- β₀ = coefficient for the intercept
- β₁ = pretest coefficient
- β₂ = mean difference for treatment
- X_adj = covariate adjusted for unreliability
- Z_i = dummy variable for treatment (0 = control, 1 = treatment)
- e_i = residual for the i-th unit
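Continuing the pseudo-effect simulation sketched under "How Error Affects Slope" (this block assumes its obs_pre, post, and Z, plus the adjust_pretest helper above, are in scope; reliability there is .8 by construction), the corrected model removes the pseudo-effect:

```python
# Continuation of the earlier simulation sketch.
r = 0.8  # pretest reliability, known by construction: 100 / (100 + 25)

# Shrink the observed pretest toward its mean within each group.
x_adj = np.concatenate([adjust_pretest(obs_pre[Z == 0], r),
                        adjust_pretest(obs_pre[Z == 1], r)])

# Refit the ANCOVA with the reliability-corrected covariate.
D = np.column_stack([np.ones(len(x_adj)), x_adj, Z])
_, slope, effect = np.linalg.lstsq(D, post, rcond=None)[0]
print(f"slope={slope:.2f} (about 1.0), effect={effect:.2f} (about 0, the truth)")
```

In practice the reliability is not known and must be estimated (e.g., from an internal-consistency or test-retest estimate), so the correction is only as good as that estimate.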

Regression Results
- The result is unbiased!
- The true effect is β₂ = 10.
- CI.95: β̂₂ ± 2SE(β̂₂) = 9.3048 ± 2(.6166) = 9.3048 ± 1.2332
- CI = 8.0716 to 10.5380; the interval includes the true effect of 10.

Fitted model: y_i = β̂₀ + β̂₁X_adj + 9.3048Z_i
[Regression table with Coef, StErr, t, and p for the Constant, adjpre, and Group predictors.]

Graph of Means
[Table: pretest and posttest means and standard deviations for the Comparison (Comp) group, the Program (Prog) group, and ALL.]

Adjusted Pretest
- Note that the adjusted means are the same as the unadjusted means.
- The only thing that changes is the standard deviation (variability).
[Table: pretest, adjusted pretest (adjpre), and posttest means and standard deviations for Comp, Prog, and ALL.]

Original Regression Results
[Figure: the original regression lines; pseudo-effect = 11.28.]

Corrected Regression Results
[Figure: original vs. corrected regression lines; original pseudo-effect = 11.28, corrected effect = 9.31.]