What can we learn from small pilots conducted by school districts?


1 What can we learn from small pilots conducted by school districts?
Lessons from the Ed Tech RCE Coach
Presentation at the SREE Spring 2018 Conference
Washington, DC, February 28, 2018
Alexandra Resch

2 Overview
What is the Coach and why did we build it?
What does the Coach do?
What have we learned so far working with districts?
Where can we go from here?

3 Problem: Low Use of Rigorous Evidence
Sources of information in technology purchasing. This slide draws on a Digital Promise report: the authors surveyed district staff and tech providers to understand how districts identify and vet tech products intended to support instruction.
Source: Improving Ed-Tech Purchasing, Digital Promise

4 Poll of Principals and District Leaders
Asking teachers is much more common than running a pilot, and looking at progress or usage data is also more common than piloting. For educational software it is important to understand usability and user experience, but it is alarming that districts often stop there.
Source: Menti poll during a session at the EdSurge Fusion conference, November 2017.

5 Why Don’t Districts Use Evidence?
Different definition of what it means to “work”
Logistics and cost are the focus; evaluation of effectiveness is often an afterthought
Limited resources, capacity, or appetite for pursuing or interpreting rigorous evaluation
Often believe that evaluation requires clinical conditions that don’t exist in schools
Sometimes hard to differentiate between marketing and research
Often find that research does not provide the information they need

6 Solution: the RCE Coach
Powerful, free, easy-to-use online platform
Offers the tools and guidance needed to make evidence-based decisions using comparison-group evaluations
Designed with input from teachers, school leaders, and district staff
Designed for users with little time and limited knowledge of statistics
Funded by the U.S. Department of Education, Office of Educational Technology
Launched in 2016 and is being used by schools, districts, and other education agencies
Today we’ll introduce the Rapid Cycle Evaluation (RCE) Coach as a tool that can help you figure out what works.

7 Types of Support the Coach Provides
Evaluation concepts
Analytic thinking
Data wrangling
Number crunching
Decision making
Goal: help districts learn as much as possible from their decisions

8 The Coach approach to generating evidence of effectiveness
What does it do?

9 Key Features
Helps the user create a valid comparison group using matching or random assignment (see the sketch after this list)
Collects information about the intervention, context, and setting
Reports results and study design in a findings brief
Facilitates sharing within and across organizations
Prompts the user to set parameters for success that feed into the conclusions:
the magnitude of change that would be meaningful (and a region of practical equivalence)
a certainty threshold
both anchored to the decisions the user will make
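To make the two comparison-group strategies concrete, here is a minimal sketch in Python. Everything in it is an assumption made for illustration: the roster, column names, and group sizes are invented, and the deck does not describe how the Coach implements matching or random assignment internally.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=42)

# Toy roster: one row per student, with a prior test score to match on.
roster = pd.DataFrame({
    "student_id": range(200),
    "prior_score": rng.normal(500, 50, size=200),
})

# Option 1: random assignment -- flip a fair coin for each student.
roster["treated"] = rng.integers(0, 2, size=len(roster)).astype(bool)

# Option 2: matching -- for each student already using the product, take
# the non-user with the closest prior score (matching without replacement).
users = roster.sample(n=50, random_state=1)  # stand-in for the actual users
pool = roster.drop(users.index).copy()

matched_ids = []
for _, user in users.iterrows():
    closest = (pool["prior_score"] - user["prior_score"]).abs().idxmin()
    matched_ids.append(closest)
    pool = pool.drop(closest)  # each comparison student is used at most once

comparison_group = roster.loc[matched_ids]
print(len(comparison_group), "matched comparison students")
```

Random assignment is the stronger design when it is feasible; nearest-neighbor matching on a prior score is the fallback when the product is already in use and a lottery is no longer possible.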

10 The RCE Coach Workflow
The Coach walks users through five steps of conducting an ed tech evaluation. Its tools also build capacity for and comfort with evaluation, increasing a district’s ability to interpret and generate evidence about which technologies work in its schools.

11 Interpreting the findings

12 Interpreting the findings
The first statement is a summary of your results, based on your data and the parameters you chose: the size of effect you intended to detect and the level of certainty you are comfortable with.

13 Interpreting the findings
The second section restates your research question and reports the parameters you set, in plain language.

14 Interpreting the findings
The third section gives the detailed findings that lead to the conclusion above.
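As an illustration of how a headline statement like those described above can follow from a posterior distribution and user-set decision regions, here is a hedged sketch. The normal posterior, the effect units, and both thresholds are assumptions made up for the example; the deck does not specify the Coach's actual statistical model.

```python
from scipy import stats

# Assumed normal posterior over the effect, in test-score points (made up).
posterior = stats.norm(loc=3.0, scale=4.0)

meaningful = 5.0                  # user-set: smallest change that matters
certainty = 0.80                  # user-set certainty threshold
rope = (-meaningful, meaningful)  # region of practical equivalence

p_positive = 1 - posterior.cdf(0)                               # P(effect > 0)
p_meaningful = 1 - posterior.cdf(meaningful)                    # P(meaningful improvement)
p_equivalent = posterior.cdf(rope[1]) - posterior.cdf(rope[0])  # P(no practical difference)

print(f"P(effect > 0)             = {p_positive:.0%}")
print(f"P(meaningful improvement) = {p_meaningful:.0%}")
print(f"P(practical equivalence)  = {p_equivalent:.0%}")

# The headline statement keys off the user's certainty threshold.
if p_meaningful >= certainty:
    print("The program likely produced a meaningful improvement.")
elif p_equivalent >= certainty:
    print("The program likely made no practical difference.")
else:
    print("The evidence is inconclusive at your certainty threshold.")
```

Statements like "a 78 percent chance that the program positively affected reading scores" (slide 18) are exactly the kind of quantity `p_positive` represents.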

15 What have we learned from working with districts?

16 Who is Using the Coach? Note: Users are prompted to enter a role when they sign up for the Coach. 117 did not enter any role and are not included in the chart.

17 How are education agencies using the Coach?
Is a program or approach working? Assess whether a program or approach already being implemented is achieving its goal and is worth the investment.
Uplift Education Charter School Network:
Evaluated a pull-out reading intervention
Learned there was only a 12 percent chance that the program positively affected student reading scores
Determined that the $40,000 investment per school year was not worth it, and reallocated the funds to hire reading specialists in most schools

18 How are education agencies using the Coach?
Is it worth expanding a pilot? Pilot and assess a new program or approach before rolling it out to all children, parents, or staff.
Clarksdale Municipal School District:
Conducted small pilot evaluations of iRead in summer and afterschool/Saturday programs
Learned there was a 78 percent chance that the program positively affected reading scores in November, but only a 22 percent chance in January
Determined it was nonetheless worth rolling out the low-cost iRead program in all K-2 classrooms

19 How are education agencies using the Coach?
What is the best way to implement a program or approach? Assess variations in how a program or approach is implemented.
A large mid-Atlantic district:
Testing variations in the type and timing of text-message reminders intended to reduce chronic absenteeism among students
Results coming soon!

20 Pilot Studies to Date
2,349 registered users
1,756 evaluations started
340 completed matching
41 completed random assignment
347 completed “Get Results”
37 evaluation briefs shared
Most focus on student achievement outcomes
Sample sizes range from 16 to 3,226 (median = 282, mean = 610)

21 Opportunities
Building comfort with evaluation
Building capacity to think more analytically: districts find value in the planning tools
Building capacity to conduct rigorous evaluations (case study: a professor at South Dakota State is using the Coach in a course)
We have a few examples where we can see the link between the questions asked, the results, and the decisions made

22 Challenges
Data availability and management
Quality of evaluations
Staff time
Evidence building is not a top priority
Districts are reluctant to share their results

23 Where can we go from here?

24 Learning from Evals in the Coach
Collect the data needed to populate the findings brief:
decision thresholds
a brief description of the intervention and its implementation
details of the matching or random assignment
summary stats for the matched sample and the analysis sample
a picture of the posterior distribution
probabilities associated with the decision regions
Goal: aggregate information from different evals. Currently only a small number of evaluations have been shared, and there are few evaluations of the same product (even among non-shared evals).

25 Plans
Current priorities:
Support users to complete strong evaluations
Increase knowledge and usage of the Coach
Longer term:
Improve the search function for shared evaluations
Aggregate results when there are multiple studies of the same intervention
Assess whether a set of results gives us useful information for setting priors (a pooling sketch follows)
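One simple way such aggregation could work is inverse-variance pooling of the shared effect estimates, with the pooled distribution serving as a prior for the next district's study. This is a sketch under stated assumptions: the study numbers are made up, and the deck does not say which aggregation method, if any, the Coach will adopt.

```python
import numpy as np

# (effect estimate, standard error) pairs from several shared briefs (made up).
studies = [(4.0, 2.5), (1.5, 3.0), (6.0, 4.0)]

estimates = np.array([est for est, _ in studies])
variances = np.array([se ** 2 for _, se in studies])

weights = 1.0 / variances  # precision (inverse-variance) weights
pooled_mean = np.sum(weights * estimates) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))

print(f"Pooled effect: {pooled_mean:.2f} (SE {pooled_se:.2f})")
# A new evaluation could adopt Normal(pooled_mean, pooled_se) as its prior.
```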

26 For More Information
Find the Coach at edtechrce.org
See the series of blog posts on the Coach
Contact me at

