Big Data, Education, and Society April 4, 2018
Assignment 3
Let's go over Assignment 3
Implementation Fidelity
NIH definition: "Implementation fidelity is the degree to which an intervention is delivered as intended…"
Questions to ask during implementation (Feng et al., 2014, p. 564)
"Is this the quality of implementation we expected as creators of the intervention?"
"What actions can we take that might bring implementation up to our desired levels?"
The peril of implementation fidelity
You design a brilliant innovation
You refine it in carefully-controlled settings
Well-designed teacher professional development
Buy-in from school administrators
Monitoring of ongoing implementation
You then throw it out into the cold, cruel world
Failure to take medicine
Carnegie Learning RAND study
Chartiers Valley Substitute Teacher
Does anyone have any personal examples of implementation fidelity failures to share?
Categories of implementation fidelity (Dane & Schneider, 1998)
Adherence – are program components delivered as prescribed?
Exposure/dosage – are program components delivered as much as intended (and in the right proportions)?
Quality – are program components delivered in the "theoretical ideal" intended fashion?
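Below is a minimal sketch of how these three categories could be operationalized from system log data. The table layout, column names, and "intended" targets are illustrative assumptions, not definitions from Dane & Schneider (1998).

```python
# Illustrative sketch: operationalizing Dane & Schneider's (1998) fidelity
# categories from hypothetical usage logs. Column names, thresholds, and the
# "intended" design targets are assumptions for illustration only.
import pandas as pd

# Hypothetical per-teacher usage log: one row per teacher per week
usage = pd.DataFrame({
    "teacher_id":     [1, 1, 2, 2],
    "components_run": [4, 5, 2, 1],     # program components delivered that week
    "minutes_used":   [90, 120, 30, 20],
})

INTENDED_COMPONENTS_PER_WEEK = 5    # assumed design target
INTENDED_MINUTES_PER_WEEK = 100     # assumed design target

per_teacher = usage.groupby("teacher_id").mean(numeric_only=True)

# Adherence: fraction of prescribed components actually delivered
per_teacher["adherence"] = (per_teacher["components_run"]
                            / INTENDED_COMPONENTS_PER_WEEK).clip(upper=1.0)

# Exposure/dosage: fraction of intended usage time actually logged
per_teacher["dosage"] = (per_teacher["minutes_used"]
                         / INTENDED_MINUTES_PER_WEEK).clip(upper=1.0)

# Quality typically requires human observation (e.g., a rubric); logs alone
# rarely capture it, so it is left as a placeholder column here.
per_teacher["quality"] = None

print(per_teacher[["adherence", "dosage", "quality"]])
```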
What attributes influence implementation fidelity? (Carroll et al.)
Intervention complexity – and prescriptiveness
Support – by developer, school, district
Participant responsiveness – how interested/willing are teachers, and how difficult the intervention is for them to adopt
How can we improve odds of good implementation fidelity? Your thoughts?
Reasoning Mind approach (Khachatryan et al., 2014)
Regional implementation coordinators (1 per 37.7 teachers)
Look at data on student engagement and performance to identify problem spots
Visit classrooms periodically and conduct observations according to a detailed rubric (average = 6 visits/year)
Meet with teachers after observations to provide professional development on classroom issues
This was rated as teachers' favorite part of adopting the curriculum
Reasoning Mind approach (Khachatryan et al., 2014): Rubric
Does teacher use analytics reports to make instructional decisions?
Does teacher plan lesson activities and student interventions before class?
Does teacher conduct varied interventions with students in need?
Proportion of class time students spend using the system
Do students use all system features?
Does teacher engage with students during class?
Reasoning Mind approach (Khachatryan et al., 2014): Rubric (continued)
Does teacher use effective classroom management procedures?
Does teacher establish clear goals and rewards for individual students and entire class?
Do students have well-organized notebooks that show student work?
Do students use recommended learning strategies?
Are students on-task?
Reasoning Mind approach (Khachatryan et al., 2014): Curriculum modification
Use data from regional coordinator visits to identify areas where implementation fidelity is generally low
(e.g., encouraging students to take notes and show written work; checking student notes and written work)
Modify the automated curriculum to better scaffold these areas for both teachers and students
Modify teacher professional development to emphasize these areas (2 days before the school year, six half-day workshops during the school year)
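Below is an illustrative sketch of the aggregation step described above: finding rubric items where fidelity is generally low across coordinator visits. The item names are paraphrased from the rubric slides, and the scores and threshold are invented; this is not Reasoning Mind's actual data or code.

```python
# Illustrative sketch: aggregating observation-rubric scores across visits to
# find items where implementation fidelity is generally low (the step that
# fed Reasoning Mind's curriculum modifications). Scores are invented.
import pandas as pd

observations = pd.DataFrame([
    # teacher_id, rubric_item, score (assume 0 = not observed, 1 = observed)
    (1, "students take notes / show written work", 0),
    (1, "teacher checks student notes",            0),
    (1, "teacher uses analytics reports",          1),
    (2, "students take notes / show written work", 0),
    (2, "teacher checks student notes",            1),
    (2, "teacher uses analytics reports",          1),
], columns=["teacher_id", "rubric_item", "score"])

# Mean score per rubric item across all observed classrooms
item_fidelity = observations.groupby("rubric_item")["score"].mean().sort_values()

# Items below an (assumed) threshold become targets for added scaffolding
LOW_FIDELITY_THRESHOLD = 0.5
targets = item_fidelity[item_fidelity < LOW_FIDELITY_THRESHOLD]
print("Low-fidelity rubric items to scaffold:\n", targets)
```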
Thoughts What do you like about this approach? What do you dislike about this approach?
Feng et al. (2014) ASSISTments approach
12-hour (2-3 day) "best practices" workshop with teachers at the beginning of the year
Counts as state professional development credit
Beginning-of-year interviews with principals
Beginning-of-year teacher survey
Analyze system log data during the year
Classroom observations of teacher practices
End-of-year teacher interviews
Feng et al. (2014) ASSISTments log data analysis
How often do teachers assign homework in ASSISTments?
What are homework completion rates?
How long do students spend on homework?
Which teachers are not opening homework reports?
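A minimal sketch of how the four log-data questions above could be computed, assuming hypothetical ASSISTments-style tables. The table and column names are invented for illustration; they are not the actual ASSISTments schema or Feng et al.'s code.

```python
# Illustrative sketch of the four log-data questions on the slide, computed
# from hypothetical ASSISTments-style tables. Table and column names are
# assumptions, not the real ASSISTments schema.
import pandas as pd

assignments = pd.DataFrame({           # one row per homework assignment
    "teacher_id":    [1, 1, 2],
    "assignment_id": [10, 11, 12],
})
attempts = pd.DataFrame({              # one row per student per assignment
    "assignment_id": [10, 10, 11, 12],
    "student_id":    [101, 102, 101, 103],
    "completed":     [True, False, True, True],
    "minutes_spent": [25, 10, 30, 40],
})
report_opens = pd.DataFrame({          # one row per report-open event
    "teacher_id": [1],
})

# 1. How often do teachers assign homework in ASSISTments?
assignments_per_teacher = assignments.groupby("teacher_id").size()

# 2. What are homework completion rates?
completion_rate = attempts.groupby("assignment_id")["completed"].mean()

# 3. How long do students spend on homework?
avg_minutes = attempts.groupby("assignment_id")["minutes_spent"].mean()

# 4. Which teachers are not opening homework reports?
non_openers = set(assignments["teacher_id"]) - set(report_opens["teacher_id"])

print(assignments_per_teacher, completion_rate, avg_minutes, non_openers, sep="\n")
```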
Feng et al. (2014): actions taken
Visit teachers who did not use the system as intended, with targeted plans for which behaviors to coach
Change the agenda of the "best practices" workshop to match general issues
Modify the design of reports given to teachers
Thoughts What do you like about this approach? What do you dislike about this approach?
Carnegie Learning Implementation Fidelity Approach (Pane et al., 2014, p. 129)
3 days of professional development/training during summer
1 visit from PD staff to a school during the year
PD staff "observe classrooms, offer recommendations, and help teachers address any problems they are having with implementations"
"Teachers also receive a set of training materials, an implementation guide, and a book of resources and assessments."
Thoughts What do you like about this approach? What do you dislike about this approach?
Karam et al. (2017)
Studied implementation fidelity of Cognitive Tutor in real-world use (self-report surveys)
Only 45% of HS teachers reported using the software for the prescribed amount of time
Only 14% of HS teachers reported working with non-software recommended practices for the prescribed amount of time
Only 30% of HS teachers reported spending as much time during software use working with students as prescribed
Effectiveness of PD
Correlation between more PD and prescribed pedagogical practices in Reasoning Mind (Miller et al., 2015)
No correlation between more PD and prescribed pedagogical practices in Cognitive Tutor (Karam et al., 2017)
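As an illustration of the kind of analysis behind these findings, the sketch below correlates hypothetical per-teacher PD hours with a prescribed-practice adherence score. The numbers are invented and do not come from either study.

```python
# Illustrative sketch: correlating hours of PD received with a
# prescribed-practice adherence score. All data below are invented.
from scipy.stats import pearsonr

pd_hours        = [4, 8, 12, 16, 20, 24]             # hypothetical PD hours per teacher
practice_scores = [0.3, 0.4, 0.5, 0.55, 0.7, 0.75]   # hypothetical adherence scores

r, p_value = pearsonr(pd_hours, practice_scores)
print(f"r = {r:.2f}, p = {p_value:.3f}")
```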
Questions? Comments?
How Does Adoption Occur?
How adoption occurs makes a big difference to how good implementation fidelity will be
State-level curriculum approval processes
Top-down decisions based on sales
Curriculum reviews by groups of teachers
Individual teachers' decision-making
Individual learners' decision-making
Questions? Comments?
Activity
You are the implementation director for Bob's Discount Math Curriculum (BDMC)
1000 teachers use your system
Each teacher's classes produce $3000 of profit after all expenses except implementation/PD/support
Costs
Assume salary is $50,000/year, plus double that for benefits, equipment, and office space
In other words:
Daily cost of an implementation team member is $300
Daily site-visit cost of an implementation team member is $500
Sample Costs
"Trouble-spotting" data analysis: 1 day, $350
Specific-teacher data analysis: 0.5 days, $150
Specific-teacher classroom observation: $500
Single-teacher PD: $250
Full-day multi-teacher PD: 2 days, $1000
Half-day multi-teacher PD: –
Multi-teacher PD prep: 3 days, $900
Books and materials: $100
Create a plan for implementation
In groups of 3-4
What forms of support will you use?
How much profit is left over after support?
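For reference during the activity, here is a minimal sketch of the profit arithmetic using the teacher count, per-teacher profit, and sample costs above. The particular support plan shown is an arbitrary example, not a recommended answer, and the "per teacher" reading of books and materials is an assumption.

```python
# Illustrative sketch of the activity's arithmetic: profit for Bob's Discount
# Math Curriculum (1000 teachers x $3000) minus an example support plan.
# The plan below is an arbitrary example, not a recommended answer.
N_TEACHERS = 1000
PROFIT_PER_TEACHER = 3000          # before implementation/PD/support costs

# (cost per unit in dollars, number of units), drawn from the sample costs above
example_plan = {
    "trouble-spotting data analysis":         (350, 10),
    "specific-teacher data analysis":         (150, 200),
    "specific-teacher classroom observation": (500, 100),
    "full-day multi-teacher PD":              (1000, 50),
    "multi-teacher PD prep":                  (900, 5),
    "books and materials (assumed per teacher)": (100, 1000),
}

support_cost = sum(cost * n for cost, n in example_plan.values())
profit = N_TEACHERS * PROFIT_PER_TEACHER - support_cost
print(f"Support cost: ${support_cost:,}   Profit after support: ${profit:,}")
```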
Each group read out your final profit
Highest-profit group and lowest-profit group:
Please read out what implementation support you offered, and why you chose this combination
Should this calculation change…
Between a non-profit like Reasoning Mind and a for-profit like Carnegie Learning?
What are the risks of too low a profit for each type of organization?
Questions? Comments?
Upcoming office hours
April 11, 9:30am-10:30am, or by appointment