Masked Visual Analysis (MVA)

Presentation transcript:

Masked Visual Analysis (MVA)
A method to ensure control of the Type I error rate when visually analyzing single-case studies

Type I Error Control
A Type I error is committed when an analyst concludes there was an effect when in fact there was none. In conventional statistical analyses, researchers often set the Type I error rate to .05. With traditional visual analysis, it is difficult to know how likely it is that a researcher will incorrectly conclude there was an effect.

Type I Error Studies

Estimate   Study
.24        Matyas & Greenwood, 1990
.25        Stocks & Williams, 1995
.28        Fisch, 2001
.66        Borckardt, Murphy, Nash, & Shaw, 2004
.17/.01    Carter, 2009

MVA Steps for Randomized Designs
1. Plan the study
2. Split the research team into two groups: (a) an intervention team and (b) an analysis team
3. The intervention team makes the random assignment but does not tell the analysis team
4. The intervention team conducts the study
5. The intervention team creates a masked graph
6. The analysis team analyzes the masked graph

Example Application (Thanks to Kendall DeLoatche)
Study to examine the effect of parent-child interaction therapy (PCIT) on the number of praises given by the parent during interactions with the child
Design Type: Multiple Baseline Across 4 Participants
Intervention Schedule: Baseline lengths of 3, 4, 5, and 6 observations
Randomization: Randomize the order in which participants receive the intervention
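As an illustration of the randomization step, the intervention team might draw the order with a few lines of code. This is a minimal sketch of my own (the participant labels are placeholders, not from the study), assuming one participant is assigned to each of the four pre-set baseline lengths and the result is withheld from the analysis team.

    import random

    participants = ["Child A", "Child B", "Child C", "Child D"]  # placeholder labels
    baseline_lengths = [3, 4, 5, 6]

    # Randomly order the participants; the 1st gets the baseline of 3, the last gets 6.
    order = random.sample(participants, k=len(participants))
    assignment = dict(zip(order, baseline_lengths))
    print(assignment)  # kept secret from the Analysis Team until specifications are made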

[Figure: Multiple baseline graphs of Dyadic Parent-Child Interaction Coding System (DPICS) labeled praises by session]

Compute the p-value
The p-value is computed as:
p = (# specifications) / (# possible assignments)
# possible assignments = 4 × 3 × 2 × 1 = 24
p = 1/24 = .0417
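The same count and p-value can be reproduced in a few lines. This sketch is illustrative only (the names are mine, not part of the MVA procedure), and it assumes the analysis team's first specification matched the true order, as in the PCIT example.

    import math
    from itertools import permutations

    # Four participants can be ordered for intervention in 4! ways.
    participants = ["P1", "P2", "P3", "P4"]
    possible_assignments = list(permutations(participants))
    assert len(possible_assignments) == math.factorial(4) == 24

    # MVA p-value: number of specifications made / number of possible assignments.
    num_specifications = 1  # correct on the first specification
    p_value = num_specifications / len(possible_assignments)
    print(f"p = {num_specifications}/{len(possible_assignments)} = {p_value:.4f}")  # p = 1/24 = 0.0417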

Type I Error Control
If there were no treatment effect, the data would be the same regardless of which random assignment was made. As a consequence, the Analysis Team would make the same decisions and the same specification (e.g., always say the order is 1, 2, 4, 3). Because the assignment is made randomly, the probability that it corresponds to the one the Analysis Team would pick is 1 out of the number possible (e.g., the order 1, 2, 4, 3 would be selected at random 1 time out of 24).
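This argument can also be checked by simulation. The sketch below is my own illustration under an added assumption: with no treatment effect, the masked data look identical for every assignment, so the Analysis Team's specification is fixed; the simulation estimates how often that fixed specification happens to match a randomly drawn order.

    import random

    # Under the null, the Analysis Team's specification does not depend on the
    # true assignment, e.g., they always say "1, 2, 4, 3".
    fixed_specification = (1, 2, 4, 3)

    trials = 100_000
    false_positives = 0
    for _ in range(trials):
        true_order = tuple(random.sample([1, 2, 3, 4], k=4))  # random intervention order
        if true_order == fixed_specification:
            false_positives += 1

    print(false_positives / trials)  # approximately 1/24 = .0417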

MVA Steps for Response-Guided Randomized Designs
1. Set study parameters
The research team agrees upon:
- Design type (e.g., multiple baseline)
- Minimums (e.g., a minimum of 5 observations per phase)
- Randomization (e.g., random order of participants in the multiple baseline)

2. Split into two teams
Analysis Team: visually analyzes the data and directs the Intervention Team
Intervention Team: conducts the study based on the agreed-upon parameters and the direction of the Analysis Team

3. Conduct the study
- The Intervention Team begins the study and sends the collected outcome data to the Analysis Team
- The Analysis Team analyzes the data and decides when it would be appropriate to make a random assignment
- The Intervention Team makes random assignments when directed by the Analysis Team and continues to collect and send the outcome measures to the Analysis Team, but never discloses the results of the random assignments
- The Analysis Team indicates when the study should be concluded

4. Compute the p-value
- The Analysis Team specifies what they believe are the results of the random assignments
- The Intervention Team indicates whether they are correct
- If not correct, the Analysis Team continues to make specifications until a correct specification is made
- The p-value is computed as: p = (# specifications) / (# possible assignments)
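A small helper makes the sequential nature of this p-value concrete. The sketch is my own (the function name and error check are assumptions, not part of the published procedure): each incorrect specification adds 1 to the numerator, so the attainable p-value grows with every miss.

    def mva_p_value(num_specifications, num_possible_assignments):
        """p-value for masked visual analysis: specifications made / possible assignments."""
        if not 1 <= num_specifications <= num_possible_assignments:
            raise ValueError("specifications must fall between 1 and the number of possible assignments")
        return num_specifications / num_possible_assignments

    # Correct on the first specification with 24 possible assignments:
    print(mva_p_value(1, 24))   # 0.0417 (rounded)
    # Correct on the second specification with 64 possible assignments (see Example 5):
    print(mva_p_value(2, 64))   # 0.03125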

Example 1: Multiple Baseline Design – 4 Participants
Step 1: Set study parameters
Dependent Variable? % of time on task
Design Type? Multiple Baseline Across Participants
Minimums?
- At least 5 baseline observations
- Staggers of at least 2 observations
- Treatment phases with at least 3 observations

Example 1: Multiple Baseline Design – 4 Participants
Step 1: Set study parameters
Randomization? Randomize the order of participants for intervention
How many possible assignments of participants to treatment order (who is 1st, 2nd, 3rd, and 4th)?
P = 4! = 24 possible assignments
If the treatment has no effect, the probability that a masked visual analyst could identify the correct order is p = 1/24 = .0417

Example 1: Multiple Baseline Design – 4 Participants
Step 2: Split into two teams
Step 3: Conduct the study

[Figures: Masked multiple baseline graphs for the four participants; y-axis: % Time on Task; x-axis: Session]

Example 1: Multiple Baseline Design – 4 Participants
Step 4: Compute the p-value
- Analysis Team: make a specification
- Intervention Team: are they correct?
If the treatment has no effect, the probability that a masked visual analyst could have identified the correct order is p = 1/24 = .0417

Example 2: Multiple Baseline Design – 3 Participants
Step 1: Set study parameters
Design Type? Multiple Baseline Across Participants
Minimums?
- At least 5 baseline observations
- Staggers of at least 3 observations
- Treatment phases with at least 5 observations
- If there is an outlier, at least 3 additional observations in the phase

Example 2: Multiple Baseline Design – 3 Participants
Dependent Variable? % of intervals with prosocial behavior
Randomization?
- How many possible assignments of participants to treatment order (who is 1st, 2nd, and 3rd)? P = 3! = 6 possible assignments
- What if we randomly select from Participant 1, Participant 2, Participant 3, and no one? P = 4! = 24 possible assignments; if correct, p = 1/24 = .0417
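A quick enumeration shows why the extra "no one" option matters; this sketch is illustrative only (the labels are mine). With only three participants the smallest attainable p-value is 1/6 ≈ .167, which can never fall below .05, whereas adding a fourth position yields 24 orders and a smallest p-value of .0417.

    from itertools import permutations

    three = list(permutations(["P1", "P2", "P3"]))
    four = list(permutations(["P1", "P2", "P3", "no one"]))

    print(len(three), 1 / len(three))  # 6  0.1667 -> cannot reach p < .05
    print(len(four), 1 / len(four))    # 24 0.0417 -> p < .05 is attainable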

Example 2: Multiple Baseline Design – 3 Participants
Step 2: Split into two teams
Step 3: Conduct the study

Example 2: Multiple Baseline Design – 3 Participants
Step 4: Compute the p-value
- Analysis Team: make a specification
- Intervention Team: are they correct?
If the treatment has no effect, the probability that a masked visual analyst could have identified the assignments is p = 1/24 = .0417

Example 3: Multiple Probe Design
Step 1: Set study parameters
Design Type? Multiple Probe with 5 Participants
Minimums?
- At least 5 observations in each phase
- At least 3 consecutive observations prior to intervention
- At least 3 consecutive observations after an intervention
- Temporal staggers of at least 2 observations
Randomization? Random assignment of treatment to blocks of observations, with one mystery block for each participant at the point the participant becomes eligible for intervention
2^5 = 32, so if correct with 5 blocks, p = 1/32 = .03125

Example 3: Multiple Probe Design
Step 2: Split into two teams
Step 3: Conduct the study

[Figure: Masked multiple probe graph with A and B phases and a mystery block (?) for each participant: Dave, John, Bob, Dan, and Theresa]

Example 3: Multiple Probe Design
Step 4: Compute the p-value
- Analysis Team: make a specification
- Intervention Team: are they correct?
Yes? p = 1/32 = .03125

Example 4: Reversal Design
Step 1: Set study parameters
Dependent Variable? Number of disruptive behaviors
Design Type? Reversal
Minimums?
- At least 5 observations per phase
- At least 3 phase changes (at least ABAB)
Randomization? Random assignment of treatment to blocks of observations
Because each assignment has 2 possibilities, 5 assignments are needed to obtain more than 20 possible assignments and a p-value < .05: 2^5 = 32, so if correct, p = 1/32 = .03125
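The "need 5 assignments" figure follows from each masked decision having 2 equally likely outcomes, so k assignments produce 2^k possible configurations and a best attainable p-value of 1/2^k. The short sketch below is my own illustration of that arithmetic; it finds the smallest k whose best attainable p-value falls below .05.

    # Each masked block assignment has 2 equally likely outcomes, so k assignments
    # give 2**k possible configurations and a best attainable p-value of 1/2**k.
    alpha = 0.05
    k = 1
    while 1 / 2**k >= alpha:
        k += 1
    print(k, 2**k, 1 / 2**k)  # 5 32 0.03125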

Example 4: Reversal Design
Step 2: Split into two teams
Step 3: Conduct the study

Example 4: Reversal Design
Step 4: Compute the p-value
- Analysis Team: make a specification
- Intervention Team: are they correct?
If the treatment has no effect, the probability that a masked visual analyst could have identified the assignments is p = 1/32 = .03125

Example 5: Alternating Treatments Design
Step 1: Set study parameters
Design Type? Alternating Treatments (2 treatments)
Minimums? At least 5 alternating pairs
Randomization? Random assignment of one observation in each pair to A and one to B
Because each assignment has 2 possibilities, 5 assignments are needed to obtain more than 20 possible assignments and a p-value < .05: 2^5 = 32, so if correct with 5 pairs, p = 1/32 = .03125

Example 5: Alternating Treatments Design
Step 2: Split into two teams
Step 3: Conduct the study

Example 5: Alternating Treatments Design
Step 4: Compute the p-value
- Analysis Team: make a specification
- Intervention Team: are they correct?
Yes? p = 1/64 = .015625 (the completed study had 6 pairs, so 2^6 = 64 possible assignments)
No? Make a second specification; if correct this time, p = 2/64 = .03125

Applications and Illustrations

Byun, T. M., Hitchcock, E., & Ferron, J. M. (2017). Masked visual analysis: Minimizing type I error in response-guided single-case design for communication disorders. Journal of Speech, Language, and Hearing Research, 60, 1455-1466.

DeLoatche, K. J. (2015). Parent-child interaction therapy as a treatment for ADHD in early childhood: A multiple baseline single-case design (Unpublished doctoral dissertation). University of South Florida, Tampa.

Dickerson, E. (2016). Computerized cognitive remediation therapy (CCRT): Investigating change in the psychological and cognitive function of adolescent psychiatric patients (Unpublished doctoral dissertation). Northeastern University, Boston.

Ferron, J. M., & Levin, J. R. (2014). Single-case permutation and randomization statistical tests: Present status, promising new developments. In T. R. Kratochwill & J. R. Levin (Eds.), Single-case intervention research: Statistical and methodological advances (pp. 153-183). Washington, DC: American Psychological Association.

Ferron, J., & Jones, P. K. (2006). Tests for the visual analysis of response-guided multiple-baseline data. Journal of Experimental Education, 75, 66-81.

Ferron, J. M., Joo, S.-H., & Levin, J. R. (accepted). A Monte-Carlo evaluation of masked-visual analysis in response-guided versus fixed-criteria multiple-baseline designs. Journal of Applied Behavior Analysis.

Hinojosa, S. M. (2016). Teacher child interaction therapy: An ecological approach to intervening with young children who display disruptive behaviors (Unpublished doctoral dissertation). University of South Florida, Tampa.

Hua, Y., Yuan, C., Monroe, K., Hinzman, M. L., Alqahtani, S., Abdulmohsen, A., & Kern, A. M. (2016). Effects of the reread-adapt and answer-comprehend and goal setting intervention on decoding and reading comprehension skills of young adults with intellectual disabilities. Developmental Neurorehabilitation.

Ottley, J. R., Coogle, C. G., Rahn, N. L., & Spear, C. (2017). Impact of bug-in-ear professional development on early childhood co-teachers' use of communication strategies. Topics in Early Childhood Special Education, 36, 218-229.

McKenney, E. L., Mann, K. A., Brown, D. L., & Jewell, J. D. (2017). Addressing cultural responsiveness in consultation: An empirical demonstration. Journal of Educational and Psychological Consultation.

McKeown, D., Kimball, K., & Ledford, J. (2015). Effects of asynchronous audio feedback on the story revision practices of students with emotional/behavioral disorders. Education and Treatment of Children, 38, 541-564.