Doing more with less: Evaluation with the Rapid Cycle Evaluation Coach

Presentation transcript:

Doing more with less: Evaluation with the Rapid Cycle Evaluation Coach
Erin Dillon, Mathematica Policy Research
Matthew Lenard, Wake County Public Schools

Session Overview
- What is rapid cycle evaluation?
- Rapid Cycle Evaluation Coach demo
- Activity to practice using the RCE Coach
- Discussion on the RCE Coach and ‘openness’
- Q & A

What does evaluation look like in your agency?

What is rapid cycle evaluation?
- Focused on measuring the impact of changes to existing program operations and services
- Uses experimental or quasi-experimental methods to identify a causal relationship
- Relies predominantly on administrative data to measure impacts
- Results can be observed quickly (within one year)
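To make the "existing administrative data, fast results" point concrete, here is a minimal, hypothetical sketch of the kind of comparison a rapid cycle evaluation often boils down to: pulling outcomes for a pilot group and a comparison group from records the agency already has and estimating the difference. The data frame, column names, and numbers are invented for illustration; they are not a format the RCE Coach requires.

```python
# A minimal sketch of a rapid-cycle impact comparison using administrative
# data the agency already holds. All names and values are hypothetical.
import pandas as pd
from scipy import stats

records = pd.DataFrame({
    "student_id": range(8),
    "group": ["pilot"] * 4 + ["comparison"] * 4,
    "post_score": [78, 85, 90, 82, 74, 80, 83, 77],
})

pilot = records.loc[records["group"] == "pilot", "post_score"]
comparison = records.loc[records["group"] == "comparison", "post_score"]

# Impact estimate: simple difference in mean outcomes between the groups.
impact = pilot.mean() - comparison.mean()

# Welch's two-sample t-test as a basic check on the difference.
t_stat, p_value = stats.ttest_ind(pilot, comparison, equal_var=False)

print(f"Estimated impact: {impact:.1f} points (p = {p_value:.2f})")
```

Because the outcome data already exist, an analysis like this can be run as soon as the pilot period ends, which is what makes the "results within one year" timeline realistic.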

A word on the word “cycle”
- An on-ramp to a continuous improvement cycle
- Short-term focus: one-time, opportunistic evaluations
- Long-term vision: an iterative, continuous improvement tool
  - Linked to data systems
  - Agencies running multiple experiments per year

Rapid-cycle v. program evaluation
Rapid-cycle evaluation | Program evaluation
Incorporated into regular practice and decision making | Done outside of regular practice
Narrow, targeted research question | Broader and/or multiple research questions
Uses existing data | Collects new data
Findings in one year or less | Multiple years until findings
Low-cost, usually in-house analysis | More costly analysis done by outside experts
Results lead to change and a new round of testing | Results are the end point

Who is doing RCE?

RCE Coach background
- Free, openly licensed, online tool created by the U.S. Department of Education Office of Educational Technology
- Created to help districts make better decisions about education technology products, but can be used for other education interventions
- Tailored to a non-technical audience
- Available at https://edtechrce.org/

RCE Coach demonstration

Bayesian v. frequentist interpretations
Bayesian interpretation | Frequentist interpretation
Assesses the probability that an intervention has the desired impact | Assesses whether results are statistically significant
Uncertainty is framed in probabilistic terms: “There is a 77% chance that the new education technology improves student achievement, and a 23% chance that it decreases achievement.” | Uncertainty is typically framed in terms of the confidence interval: “The 95% confidence interval around the impact of the new education technology includes zero, so we cannot reject the hypothesis that there was no difference.”
User determines whether results are practically significant | Results are usually presented as binary: statistically significant or not significant
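The sketch below illustrates this contrast on a single hypothetical impact estimate, using a normal approximation with a flat prior for the Bayesian side. It is an illustration of the two interpretations, not the RCE Coach's actual statistical model; the estimate and standard error are made-up numbers chosen so the output mirrors the 77%/23% example quoted above.

```python
# Contrast the frequentist and Bayesian readings of one impact estimate.
# Flat prior and normal approximation are assumptions for illustration only.
from scipy import stats

impact_estimate = 3.0   # hypothetical impact, in score points
standard_error = 4.0    # hypothetical standard error of that estimate

# Frequentist interpretation: a 95% confidence interval and a p-value.
ci_low, ci_high = stats.norm.interval(0.95, loc=impact_estimate, scale=standard_error)
p_value = 2 * (1 - stats.norm.cdf(abs(impact_estimate) / standard_error))
print(f"95% CI: [{ci_low:.1f}, {ci_high:.1f}], p = {p_value:.2f}")
# The interval includes zero, so the frequentist verdict is
# "not statistically significant at the 5% level."

# Bayesian interpretation (flat prior): the posterior for the impact is
# approximately Normal(estimate, SE), so we can state the probability
# that the impact is positive rather than a yes/no verdict.
prob_positive = 1 - stats.norm.cdf(0, loc=impact_estimate, scale=standard_error)
print(f"Chance the intervention improves the outcome: {prob_positive:.0%}")
print(f"Chance it makes the outcome worse: {1 - prob_positive:.0%}")
```

The same estimate reads as "not significant" under the frequentist framing but as roughly a 77% chance of a positive impact under the Bayesian framing, which is the kind of probability statement the Coach surfaces for practical decision making.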

Bayesian results in the RCE Coach
- There is a 67% chance the technology has a positive impact
- There is a 33% chance the technology has a negative impact
- Impact estimate and credible interval

Activity: Using the RCE Coach
- Work in groups or individually
- Use the scenarios provided or your own data
- Scenarios are for a matched comparison design (see the matching sketch below); let Erin or Matt know if you want to try the random assignment tools
- Download data for the scenarios here: https://edtechrce.org/PreviewTools (Coach workshop resources)
Activity steps:
1. Create a log-in: go to https://edtechrce.org and select “Create Account”
2. Work through the RCE Coach set-up steps
3. Analyze the data using the RCE Coach analysis tool
4. Create a findings brief and be prepared to share your results
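For context on what a matched comparison design does conceptually, here is a rough, hypothetical sketch: each student who used the product is paired with the non-user whose prior score is closest, and the impact is the average outcome difference across pairs. This is not the RCE Coach's matching algorithm, and all column names and numbers are invented for illustration.

```python
# Rough sketch of nearest-neighbor matching on a prior score.
# Hypothetical data; not the RCE Coach's actual matching procedure.
import pandas as pd

# Students who used the product.
users = pd.DataFrame({
    "student_id": [1, 2, 3],
    "prior_score": [70, 80, 90],
    "post_score": [75, 86, 93],
})
# Candidate comparison students who did not use it.
non_users = pd.DataFrame({
    "student_id": [4, 5, 6, 7],
    "prior_score": [68, 79, 88, 95],
    "post_score": [72, 82, 90, 96],
})

differences = []
for _, user in users.iterrows():
    # Nearest-neighbor match on prior score, with replacement.
    match_label = (non_users["prior_score"] - user["prior_score"]).abs().idxmin()
    match = non_users.loc[match_label]
    differences.append(user["post_score"] - match["post_score"])

# Impact estimate: average outcome difference across matched pairs.
impact = sum(differences) / len(differences)
print(f"Matched comparison impact estimate: {impact:.1f} points")
```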

Activity debrief
- What did you learn from your analysis and findings brief?
- Do you think rapid cycle evaluation can be useful in your agency?
- What obstacles will there be to using a tool like the RCE Coach in your agency?

RCE Coach lessons learned and looking ahead
Lessons learned:
- The Coach needs a champion to ensure RCEs are a priority
- The Coach can serve as a local capacity-building tool
- Practices for collecting, reporting, and interpreting usage data are still emerging
- Ed tech developers are important partners in the RCE process
Looking ahead:
- Build out non-academic achievement, teacher professional development, and staff productivity measures
- Pilot with additional districts and schools
- Add case studies based on pilot districts
- Highlight and expand the “shared evaluation” page

RCE Coach and ‘openness’
- The ‘Share Evaluation’ page allows users to share their results publicly
- Goal is to create a national bank of RCE results that describe education technology tools in different contexts
- Hope to add the ability to create meta-analyses that aggregate results across districts
- Challenges in sharing results:
  - Districts are reluctant to share negative or null results
  - Ed tech developers see risk in potentially unfavorable results
  - Need more openness to sharing “failure”

Interested in piloting the RCE Coach?
- We are identifying districts and schools to be pilot partners, ready to pilot an ed tech product in summer or fall 2017
- Ideally, schools or districts are:
  - Able to implement forward-looking evaluations, and/or
  - Interested in looking at student non-academic outcomes, teacher professional development, or staff productivity
- Pilot partners receive customized training and support from MPR or SRI
- Go to https://edtechrce.org/ and submit the form to express your interest

For more information
Erin Dillon, edillon@mathematica-mpr.com
Matthew Lenard, mlenard@wcpss.net