Synthesizing the evidence on the relationship between education, health and social capital
Dan Sherman, PhD, American Institutes for Research
25 February 2010, Oslo, Norway

How to Synthesize? And for Whom?
There is obviously an enormous number of studies in most fields, even for narrow questions
What should the approach be to reviewing studies, summarizing outcomes, and then evaluating the quality of studies?
How should syntheses be prepared and distributed?
What are the problems with research syntheses?

Challenges to Synthesis
Studies can use different designs, different populations, different sample sizes
Studies of effect typically make some comparison of one group to another
– e.g., those with more education
– e.g., those enrolled in an early childhood program
Many validity problems in studies
– Selection effects, consistency/fidelity of adoption, attrition
Need to decide how to rank and present evidence
– For policy, must be accessible and useful

US What Works Clearinghouse (WWC)
Starting in 2002, the US Dept of Education set up the WWC to:
Produce user-friendly practice guides for educators that address instructional challenges with research-based recommendations for schools and classrooms
Assess the rigor of research evidence on the effectiveness of interventions (programs, products, practices, and policies), giving educators the tools to make informed decisions
Develop and implement standards for reviewing and synthesizing education research

Approach of WWC
Attempts to provide systematic evidence reviews in areas of educational practice; has worked in 8 separate topic areas (e.g., early reading, math instruction)
Has a very high standard, favoring Randomized Control Trials (RCTs) over Quasi-Experimental Designs (QEDs)
Project leads to quality-of-evidence reports and practice guides based on good evidence

WWC Approach Favors RCTs

Sample WWC Output for 25 Early (K-3) Reading Programs

Few Studies Meet WWC Standard for Inclusion
Large numbers of studies reviewed
Relatively few meet evidence standards
– Prefer RCTs
– Good matching
– Low attrition
WWC reduces the set further with substantive (vs. statistical) significance
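The screening criteria listed on this slide can be illustrated with a toy sketch in code. The field names, attrition threshold, and rating labels below are illustrative assumptions for exposition only, not the WWC's actual review rules:

```python
from dataclasses import dataclass

@dataclass
class Study:
    design: str                # "RCT" or "QED" (illustrative labels)
    overall_attrition: float   # fraction of the sample lost to follow-up
    baseline_equivalent: bool  # were groups comparable at baseline?

def screen(s: Study, max_attrition: float = 0.2) -> str:
    """Toy screen loosely inspired by WWC-style criteria (illustrative only)."""
    # A randomized trial with low attrition passes outright.
    if s.design == "RCT" and s.overall_attrition <= max_attrition:
        return "meets standards"
    # QEDs (and high-attrition RCTs) must demonstrate baseline equivalence.
    if s.baseline_equivalent:
        return "meets standards with reservations"
    return "does not meet standards"

studies = [
    Study("RCT", 0.10, True),   # low-attrition trial
    Study("RCT", 0.35, False),  # high attrition, no equivalence shown
    Study("QED", 0.05, True),   # matched quasi-experiment
]
results = [screen(s) for s in studies]
print(results)
```

Running the sketch on the three hypothetical studies yields one study in each rating category, mirroring the slide's point that many reviewed studies fall short of the top tier.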

Outcome of WWC
Very small number of studies meet the high evidence standards
– Really is an "Almost Nothing Works Clearinghouse"
– The WWC Procedures and Standards Handbook is a very useful reference for evidence review; describes/discusses the standards
Produces practice guides and Doing What Works multimedia presentations on the web, aimed at educators

Analogy with Health Research
Example of the Agency for Healthcare Research and Quality (AHRQ)
– Nearly 200 completed evidence-based practice reports, mostly on medical treatment
Summarize the literature and present estimated effects with meta-analytic techniques
Readily available to search on the AHRQ website
– Presents strength of evidence based on design, sample sizes, etc.
– Medical model makes it easier to have small RCTs and controls (e.g., placebo drugs)
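The meta-analytic pooling of estimated effects mentioned on this slide can be sketched with the standard fixed-effect, inverse-variance method, in which each study's effect estimate is weighted by the reciprocal of its variance. The effect sizes and standard errors below are made up for illustration:

```python
import math

def fixed_effect_pool(effects, std_errors):
    """Pool study-level effect estimates with inverse-variance weights."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))  # SE of the pooled estimate
    return pooled, pooled_se

# Three hypothetical studies: effect sizes and their standard errors
effects = [0.30, 0.10, 0.25]
std_errors = [0.10, 0.05, 0.15]

est, se = fixed_effect_pool(effects, std_errors)
ci = (est - 1.96 * se, est + 1.96 * se)  # approximate 95% confidence interval
print(round(est, 3), round(se, 3))
```

Note how the pooled estimate sits closest to the most precise study (the one with the smallest standard error), which is exactly what inverse-variance weighting is designed to do.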

AHRQ: Strength of Evidence Grades
High – High confidence that the evidence reflects the true effect. Further research is very unlikely to change our confidence in the estimate of effect.
Moderate – Moderate confidence that the evidence reflects the true effect. Further research may change our confidence in the estimate of effect and may change the estimate.
Low – Low confidence that the evidence reflects the true effect. Further research is likely to change our confidence in the estimate of effect and is likely to change the estimate.
Insufficient – Evidence either is unavailable or does not permit estimation of an effect.

Comparison of AHRQ and WWC
AHRQ able to draw on a larger literature
Easier design issues in medical studies
– (less concern about selection/confounding factors)
Much higher pass rate for health studies
Practice guides may be more current, in that the base literature is more quickly updated
– More acceptable studies being produced in health; they are standard in the drug treatment literature
Health studies can generally be more focused on a specific treatment, vs. education, which has many confounding components (which favors RCTs)

General Challenges in Research
Most research looks at outcomes/effects in the short term
– Does end-of-year student achievement improve for fifth graders if teachers use a specific reading method?
– Does a drug reduce blood pressure? By how much?
Key question: does a long-term, sustainable effect exist?
Do measured effects apply to the larger population?
If data were obtained for a reason other than the evaluation study, can we use them for evaluation?
– What methods to apply? What are the comparison groups? Can we match individuals?

Questions for Thought/Discussion
What is good evidence of effect to guide policy? How high a standard is needed?
– Must we have (expensive) RCTs for everything?
What data should routinely be gathered to support later studies, whatever the method?
– Important policy question: what data to collect?
How best to synthesize and report evidence to be useful to policymakers AND practitioners?
– Doing What Works and not just Knowing What Works (or thinking we know)

Working to Improve Health in the US
Boston Marathon: km done, 2 to go!