Just Because They Say It’s ‘Scientifically- based’ Doesn’t Mean It Will Work!


Changing Landscape
- P.L. IDEA '03
- Decrease in experimental studies: from 61% in 1980 to 38%
- Definition of "scientifically-based" = random assignment/true experiment
- National Reading Panel report
- Institute of Education Sciences
- Configuration/role of OSEP
- Status of Part D in reauthorization

"…methodologically weak research, trivial studies, an infatuation with jargon, and a tendency toward fads." (National Research Council)

Educational Research: The Hardest Science of All!!!

Standards for Field Testing Interventions (CRL)
- Practical and doable
- Easy for both teachers and students to learn
- Yield meaningful "real world" outcomes
- Broad in reach: impacts non-SWDs as well
- Impact performance of SWDs to enable them to compete within the criterion environment

Guiding Principles (CRL)
- Deal with the complex realities of schools
- Participant input at all stages
- Use sound research methodologies/designs
- Collect many measures on interventions
- Field-test in multiple stages
- Insist on both statistical and social significance
- Translate field protocols into user manuals
- Bring interventions to scale

Designing High Quality Research in Special Education: Group Experimental Design
Gersten, Lloyd, & Baker (1999)

The BIG Issue: Trade-off between Internal & External Validity
"…the challenge to educational researchers is to sensibly negotiate a balance between those components that satisfy science (internal) and those that reflect the complexities of real classroom teaching (external)."

Good Research Is Largely Dictated by…
How well the independent variables are conceptualized and operationalized, how the sample is defined, and how the dependent variables are selected to assess the impact of the intervention.

On Independent Variables…
- Precise definitions needed
- Problems arise with PAR (flexibility)
- Syntheses/meta-analyses need precision
- Majority of the literature: incomplete or very poor description of the intervention
- Gap between conceptualization and implementation (minutes of instruction, support, teacher training, etc.)

Improving Independent Variables
- Intervention transcripts
- Replications (by others/component analysis)
- Fidelity measures throughout implementation (amount of training, lesson length, time, feedback); see the sketch below
- Good comparison groups (control for teacher effects, feedback, time) (see p. 13)
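A minimal sketch of what a fidelity measure can reduce to in practice: score each observed lesson against a checklist of required intervention components. The component names and lesson data here are hypothetical, not from the slides.

```python
# Hypothetical checklist of required intervention components; in a real
# study these would come from the intervention's user manual.
CHECKLIST = {"advance_organizer", "modeled_step", "guided_practice",
             "independent_practice", "corrective_feedback"}

def fidelity_score(observed_components: set) -> float:
    """Proportion of required components actually observed in one lesson."""
    return len(observed_components & CHECKLIST) / len(CHECKLIST)

# One observed lesson: 4 of the 5 components were delivered.
lesson = {"advance_organizer", "modeled_step", "guided_practice",
          "corrective_feedback"}
print(f"Fidelity: {fidelity_score(lesson):.0%}")  # Fidelity: 80%
```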

Improving Sample Selection & Description
- Sample size (difficult with SWDs)
- The stronger the treatment, the smaller the numbers needed
- Increase power by increasing homogeneity (see the sketch below)
- Precise sample description (ELL status, SES, achievement and cognitive levels, etc.)
- Random selection (survey); random assignment (intervention)
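A worked sketch of the power point above, using statsmodels: holding the raw group difference fixed, a more homogeneous sample (smaller within-group SD) yields a larger standardized effect size, so far fewer students are needed per group. The numbers are illustrative, not from the slides.

```python
from statsmodels.stats.power import TTestIndPower

raw_difference = 5.0          # points on the outcome measure (illustrative)
for sd in (10.0, 5.0):        # heterogeneous vs. more homogeneous sample
    d = raw_difference / sd   # Cohen's d: same raw gap, smaller SD -> larger d
    n = TTestIndPower().solve_power(effect_size=d, alpha=0.05, power=0.80)
    print(f"SD={sd:>4}: d={d:.1f}, ~{n:.0f} students per group")
# SD=10.0: d=0.5, ~64 students per group
# SD= 5.0: d=1.0, ~17 students per group
```

The same arithmetic explains "the stronger the treatment, the smaller the numbers": a stronger treatment raises the raw difference, which raises d just as reducing the SD does.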

Quasi-Experimental Designs
- Students come from intact groups
- Determine similarity with pretests; if variance exists, use procedures to adjust statistically
- Problems arise when differences on pretests exceed 1/2 SD of the two groups

Improving Quasi-Experiments
- Adequate pretesting, with measures of good technical quality
- Pretest data showing group differences of more than .5 SD shouldn't be used
- ANCOVA shouldn't be used to adjust when the pretest difference exceeds .5 SD (see the sketch below)
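A minimal sketch of the .5 SD screen, with made-up pretest scores: express the gap between the intact groups' pretest means in pooled-SD units and flag it when it exceeds half a standard deviation.

```python
import math
from statistics import mean, stdev

def pretest_gap_in_sds(a, b):
    """Absolute mean difference divided by the pooled SD of the two groups."""
    s1, s2, n1, n2 = stdev(a), stdev(b), len(a), len(b)
    sd_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return abs(mean(a) - mean(b)) / sd_pooled

treatment  = [42, 48, 51, 39, 45, 50]   # hypothetical pretest scores
comparison = [55, 60, 52, 58, 61, 57]

gap = pretest_gap_in_sds(treatment, comparison)
print(f"Pretest gap: {gap:.2f} SD")      # Pretest gap: 2.78 SD
if gap > 0.5:
    print("Groups too dissimilar: don't adjust with ANCOVA; rework the sample.")
```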

Dependent Measures
Those measures used to evaluate the impact of the intervention. The conclusions of a study depend on both the quality of the intervention and the quality of the measures used to evaluate it.

Improving Dependent Measures
- Use multiple measures (global and specific skill)
- Select measures not biased toward the intervention (i.e., teaching to the test)
- Ensure that not all measures are developed by the researcher
- Select measures with good technical adequacy (one common check is sketched below)
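One common "technical adequacy" check is internal consistency. Below is a minimal sketch computing Cronbach's alpha from a hypothetical item-response matrix (rows = students, columns = test items); the scores are illustrative.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array of shape (n_students, n_items)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

scores = np.array([[4, 5, 4, 5],
                   [2, 3, 2, 2],
                   [5, 5, 4, 5],
                   [1, 2, 2, 1],
                   [3, 3, 4, 3]])
print(f"alpha = {cronbach_alpha(scores):.2f}")  # alpha = 0.97 (illustrative data)
```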