Don’t Be Fooled: Assessing the Quality of Evidence


Don't Be Fooled: Assessing the Quality of Evidence
District Review and Selection Process

Outcomes
- Considerations for judging the quality of evidence
- Describe benefits of using an effective innovation review and selection process that includes a review of evidence and other critical factors
- Identify the components of an effective innovation review and selection process
- Review an example
- Demonstrate how the review and selection process can also be used for "de-selection"

1.0 Quality "Evidence": The Good, the Bad, and the Ugly

Why Evidence-Based Practice?
- Increases the probability of success for more students
- Central feature in implementation of MTSS and RtI
- Required under the Every Student Succeeds Act (ESSA); many states require it as well

Levels of Evidence Pyramid

Major Categories
Filtered information: judges the quality of studies and makes recommendations for practice. Includes the following:
- Systematic reviews
- Critically-appraised topics
- Critically-appraised individual articles

Major Categories (cont.)
Unfiltered information: the primary literature; anyone accessing it takes on the responsibility of judging whether the information is reliable and valid. Includes the following:
- Randomized controlled trials
- Cohort studies
- Case-control studies; case series / reports
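The pyramid ordering implied by these two slides can be captured in a small lookup structure. The minimal Python sketch below uses only the categories named above; the `evidence_level` helper and the numeric levels are illustrative conventions, not part of the presentation.

```python
# Illustrative sketch only: the levels-of-evidence pyramid from the two slides
# above, ordered from most filtered (top) to least filtered (bottom).
EVIDENCE_PYRAMID = [
    ("filtered", "systematic reviews"),
    ("filtered", "critically-appraised topics"),
    ("filtered", "critically-appraised individual articles"),
    ("unfiltered", "randomized controlled trials"),
    ("unfiltered", "cohort studies"),
    ("unfiltered", "case-control studies / case series and reports"),
]

def evidence_level(study_type: str) -> int:
    """Return the pyramid level (1 = top) for a study type, or raise if unknown."""
    for level, (_, name) in enumerate(EVIDENCE_PYRAMID, start=1):
        if study_type.lower() in name:
            return level
    raise ValueError(f"Unrecognized study type: {study_type}")

print(evidence_level("cohort studies"))  # 5: unfiltered, midway down the pyramid
```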

1. Filtered Information
Systematic reviews:
- Ask a specific question
- Conduct a thorough literature review across studies
- Eliminate poorly done studies
- Attempt to make practice recommendations based on well-done studies
Critically-appraised topics:
- Evaluate and synthesize multiple research studies
- Result in short, systematic reviews about a particular topic
Critically-appraised individual articles:
- Evaluate individual research studies
- Write a short synopsis of the study

2. Unfiltered Information
Randomized controlled trials:
- Random assignment to an intervention group (receives the practice being studied to determine its effect) and a control group
- Considered superior to other study designs
Cohort studies:
- Observe a group of individuals who meet a set of criteria (selection criteria)
- Longitudinal (the cohort is followed over time)
Case-control studies; case series / reports:
- Observational studies; participants are not randomly assigned to the intervention or control group
- Lowest level of rigor among research designs
- Results may be confounded by other variables
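To make the "random assignment" idea concrete, here is a minimal sketch with entirely hypothetical data: students are randomly split into intervention and control groups, and because assignment is random, a simple difference in means (with a t-test) estimates the intervention's effect. The sample size, scores, and 5-point effect are made up for illustration only.

```python
# Illustrative sketch only (not from the slides): simulating random assignment
# to intervention and control groups and comparing outcomes. All data are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n_students = 200
students = np.arange(n_students)

# Random assignment: shuffle the roster, then split it in half.
rng.shuffle(students)
intervention = students[: n_students // 2]
control = students[n_students // 2 :]

# Hypothetical outcome scores (e.g., a reading assessment);
# assume the intervention adds about 5 points on average.
scores = rng.normal(loc=60, scale=10, size=n_students)
scores[intervention] += 5

# Because assignment was random, a simple difference in means
# (plus a t-test) is an unbiased estimate of the intervention effect.
diff = scores[intervention].mean() - scores[control].mean()
t_stat, p_value = stats.ttest_ind(scores[intervention], scores[control])
print(f"Estimated effect: {diff:.2f} points (p = {p_value:.3f})")
```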

Other Research Design Types
Quasi-experimental: looks like an experimental design but lacks the key ingredient of random assignment
- Raises concerns about internal validity because the intervention and control groups may not be comparable at the onset of the study (baseline)
- The Institute of Education Sciences (IES) has provided guidance on designing quasi-experimental studies (https://ies.ed.gov/ncee/wwc/Multimedia/23)
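Because non-randomized groups may differ at baseline, reviewers typically check baseline equivalence before trusting a quasi-experimental result. Below is a minimal sketch, using hypothetical pretest data, of one common check: the standardized mean difference on a pretest. The threshold values in the comments reflect WWC-style guidance only approximately; consult the WWC standards handbook for the authoritative rules.

```python
# Illustrative sketch only: checking baseline equivalence of non-randomized
# intervention and comparison groups, the core internal-validity concern in
# quasi-experimental designs. Data and thresholds are illustrative.
import numpy as np

def standardized_baseline_difference(treat_pre, comp_pre):
    """Standardized mean difference on a pretest, using the pooled SD."""
    treat_pre, comp_pre = np.asarray(treat_pre), np.asarray(comp_pre)
    pooled_sd = np.sqrt(
        ((len(treat_pre) - 1) * treat_pre.var(ddof=1)
         + (len(comp_pre) - 1) * comp_pre.var(ddof=1))
        / (len(treat_pre) + len(comp_pre) - 2)
    )
    return (treat_pre.mean() - comp_pre.mean()) / pooled_sd

# Hypothetical pretest scores for two intact (non-randomized) groups.
rng = np.random.default_rng(0)
treat_pre = rng.normal(62, 10, 80)   # e.g., schools that opted in
comp_pre = rng.normal(58, 10, 80)    # e.g., schools that did not

smd = standardized_baseline_difference(treat_pre, comp_pre)
# Rough reading, per WWC-style guidance: <= 0.05 SD is comparable,
# 0.05-0.25 SD needs statistical adjustment, > 0.25 SD fails equivalence.
print(f"Baseline standardized difference: {smd:.2f} SD")
```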

Other Research Design Types
Regression discontinuity design (RDD): assignment to the intervention or comparison group is based on performance relative to a cutoff value (e.g., a score on a test)
- Effects are estimated with a regression analysis
- Concerns include the integrity of the forcing variable, attrition, and strict parameters for modeling and analyses
- The Institute of Education Sciences (IES) has provided guidance on designing and evaluating RDD studies (https://ies.ed.gov/ncee/wwc/Docs/ReferenceResources/wwc_standards_handbook_v4_draft.pdf)
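A minimal sketch of the RDD logic with made-up data: group membership is determined by a cutoff on the forcing variable, and the effect is estimated as the jump in outcomes at that cutoff. The simple global linear model shown here is only illustrative; studies reviewed under WWC standards face much stricter modeling and attrition requirements.

```python
# Illustrative sketch only: a toy regression discontinuity analysis with
# made-up data and a simple linear model. Real RDD studies require careful
# choices about bandwidth, functional form, and attrition (see the WWC
# standards handbook cited above).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
score = rng.uniform(0, 100, n)          # forcing variable (e.g., a screening test)
cutoff = 40
treated = (score < cutoff).astype(int)  # students below the cutoff get the intervention

# Hypothetical outcome: improves with the screening score, plus a
# 6-point jump at the cutoff for treated students.
outcome = 50 + 0.3 * score + 6 * treated + rng.normal(0, 5, n)

df = pd.DataFrame({
    "outcome": outcome,
    "treated": treated,
    "centered": score - cutoff,          # center the forcing variable at the cutoff
})

# The effect estimate is the coefficient on `treated`: the discontinuity at the cutoff.
model = smf.ols("outcome ~ treated + centered + treated:centered", data=df).fit()
print(f"Estimated jump at the cutoff: {model.params['treated']:.2f} points")
```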

Activity 1.0
- Review the levels of evidence presented in this section, making note of critical pieces of information you believe your colleagues would benefit from hearing about after this session
- Your presenters will put the group into a partner activity after reviewing the slides

Activity 1.1
- Describe the elements of your district's process for conducting a thorough review of evidence prior to selection
- If you are unsure about your district's process, identify some information you want to know after leaving this conference
- Discuss your district's capacity (people, knowledge/skills, budget) to review evidence for programs and practices

Evidence is Only Part of the Equation
Reviewing the evidence for programs, practices, etc. is only part of the equation. Districts need to consider other critical factors to help them make good decisions about what to select for improving student outcomes.

2.0 Benefits of a Review and Selection Process

Key Terminology
Effective innovation: a set of defined practices used in schools that has been empirically proven to produce desired results. To be an effective innovation, the practices, programs, assessments, or initiatives should also be proven to be "usable": teachable, learnable, doable, and readily assessed in practice.

Benefits
Increased confidence in the following:
- The initiatives the district participates in and the programs, practices, and assessments it adopts are the best available
- The district has a full understanding of the resources needed to successfully use the selected effective innovations
- Decisions not to select an effective innovation, or to de-select an existing one, resulted from a thorough analysis of evidence and other critical factors

3.0 Effective Innovation Review and Selection Process

Components of a Review and Selection Process
- Purpose of a review and selection process
- Guidelines for when to use the process
- Decision-making protocol
- Directions for:
  - Completing the review and selection tool
  - Providing supporting documentation for specific items
  - Submission

1. Purpose
Include the following:
- A brief summary of the purpose and intended outcome of conducting a thorough review of an effective innovation
- A rationale for why the district expects a thorough review process to be completed before decisions are made to select an effective innovation

2. Guidelines for Use
List likely scenarios that would warrant the use of the review process:
- Approached to consider participation in an initiative or "pilot project," and / or approached to use a new assessment or data system
- Considering purchasing new curriculum resource materials
- Considering purchasing new assessments, data systems, or educational software
- Considering whether to continue using effective innovations that overlap or appear to be redundant with other effective innovations (de-selection)

3. Decision-Making Protocol
- List the people with the highest level of decision-making authority to determine whether the review process will result in a new selection or de-selection
- Include statements about the conditions that would warrant involvement from other groups / teams (e.g., board of education, curriculum council)
- Provide parameters for timelines to make decisions

4. Directions
The directions outline steps for:
- Initiating a review / selection process (e.g., who can do this and what needs to occur before it is started)
- Identifying the people that need to be involved in the process:
  - Specific items in the review and selection tools are to be completed by pre-determined designees
  - Parameters for seeking consultation from program or assessment developers, or from certified individuals, to adequately represent the effective innovation
- Submitting the tool with the appropriate documentation

Selection Tools
Two tools:
- Program, Practice, or Initiative Review Tool
- Assessment Review Tool
Each tool is framed around six critical factors that need to be considered during a high-quality review and selection process.

Six Critical Factors
- Effective innovation overview (e.g., name / title, description)
- Need for the innovation
- Fit with district, state, and national priorities or existing efforts
- Resources necessary to support people's successful use
- Evidence demonstrating a positive effect on outcomes
- Readiness for the effective innovation's use in a typical school / classroom setting (Is the effective innovation mature enough to use in a typical classroom setting, and do staff meet the qualifications to successfully use it?)

Hexagon Tool (Blasé, K., Kiser, L., Van Dyke, M., 2013)
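One way a review team might record its judgments against the six critical factors is sketched below. The factor names come from the slide above, but the 1-5 rating scale, the notes field, and the simple average are illustrative conventions, not the official Hexagon Tool scoring rules.

```python
# Illustrative sketch only: recording a review team's ratings against the six
# critical factors listed above. The 1-5 scale and the simple average are
# illustrative conventions, not the official Hexagon Tool scoring rules.
from dataclasses import dataclass, field

FACTORS = [
    "overview",    # name/title and description of the effective innovation
    "need",        # need for the innovation
    "fit",         # fit with district, state, and national priorities
    "resources",   # resources required for successful use
    "evidence",    # evidence of positive effect on outcomes
    "readiness",   # readiness for use in a typical school/classroom
]

@dataclass
class InnovationReview:
    innovation: str
    ratings: dict = field(default_factory=dict)  # factor -> rating (1 = low, 5 = high)
    notes: dict = field(default_factory=dict)    # factor -> supporting documentation

    def rate(self, factor: str, rating: int, note: str = "") -> None:
        if factor not in FACTORS or not 1 <= rating <= 5:
            raise ValueError(f"Unknown factor or rating out of range: {factor}, {rating}")
        self.ratings[factor] = rating
        self.notes[factor] = note

    def summary(self) -> str:
        missing = [f for f in FACTORS if f not in self.ratings]
        if missing:
            return f"{self.innovation}: incomplete review, missing {missing}"
        avg = sum(self.ratings.values()) / len(FACTORS)
        return f"{self.innovation}: average rating {avg:.1f} across {len(FACTORS)} factors"

# Hypothetical example of a review team's use:
review = InnovationReview("New K-3 reading intervention")
review.rate("need", 4, "Universal screening shows 35% of K-3 students below benchmark")
review.rate("evidence", 3, "Two quasi-experimental studies; no RCT located")
print(review.summary())
```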

Activity 3.0
Given the components of a review and selection process (slide 16) and the six critical factors (slide 22), outline the areas where you believe your district's process overlaps and the opportunities to strengthen your existing process.

4.0 Review and Selection Template

Activity 4.0
- Independently read page 1 of the document, "Effective Innovation Review and Selection Process"
- Jigsaw activity:
  - Partner 1: Read the directions and steps for the Program, Practice, or Initiative Review process (pp. 2-7)
  - Partner 2: Read the directions and steps for the Assessment Review and Selection Tool (pp. 8-14)
- Each partner will develop talking points outlining the important sections, reasons for their importance, and critical areas included in the tool that could get overlooked

5.0 Additional Resources and Examples

Additional Resources
- Core Reading Curriculum Analysis Process: used before the district completes the Effective Innovation Review and Selection Tool (handout)
- Example of a district's Effective Innovation Review and Selection Process (electronic access)
- Example of how a district engaged in a de-selection process to make room for the new effective innovation (electronic access)
- List of information on evidence-based programs and practices (handout)

Activity 5.0
- With your partner, review the additional resources
- Given the context of your district, what information do you want to share with your colleagues?

Thank You!
Brad Niebling, Ph.D., Iowa Dept. of Education, Bureau Chief, brad.niebling@iowa.gov
Kim St. Martin, Ph.D., MIBLSI Assistant Director, kstmartin@miblsimtss.org