Evaluating a Multi-Year, Federally Funded Educational Initiative: Lessons from a Successful School District-Evaluator Partnership

Kelly Feighan, Research for Better Schools
Rorie Harris, Memphis City Schools
Objectives
- Review characteristics of successful partnerships
- Summarize the five-year partnership forged between a K-12 school district and an evaluator
- Present how we addressed challenges that are inevitable in conducting school-based evaluations
- Provide lessons learned from the partnership to stimulate thought about improving evaluation work
"Great" Programs (Lieberman, 2009)
- Provide direct services that address needs
- Have a "life cycle" that ends
- Attract resources for expansion
- Benefit from constant monitoring

"Great" Evaluations
- Begin when the program is first conceptualized
- Measure identifiable outcomes
- Demonstrate program effectiveness and value to funders
- Inform the "field" about effective programs
- Provide immediate indicators of program decline
Potential Sources of Conflict (Lieberman, 2009)
- Data collection interferes with program activities
- Data do not show positive outcomes
- Partners do not understand each other's role/expertise
- Program staff cannot articulate the evaluation's purpose
- Evaluators cannot describe the program
Background
- A five-year Striving Readers grant supported literacy integration in eight middle schools
- MCS: a large, urban district with economically disadvantaged students, low state test outcomes, and high turnover in leadership and staff positions
- Evaluation research design: RCT for the targeted intervention and a quasi-experimental design for the whole-school intervention
Partnership Roles
- MCS provided implementation leadership, READ 180 classes, literacy coaches, PD instructors, R&E staff, and numerous other inputs
- The University of Memphis developed and delivered the four-semester teacher course and mentored coaches
- RBS collected data through surveys, daily coaching logs, individual and group interviews, classroom observations, program record review, and secondary analysis of testing data
What Each Party Needs from the Other

The evaluator needs:
- Buy-in among school personnel and respondents' time
- Entrée into the schools
- Responsive district R&E staff

The district needs:
- Unobtrusive, sensitive evaluators who understand the context and minimize respondent burden
- Timely, actionable formative information
Evaluators' Anticipated Challenges
- Maintaining random-selection compliance
- Persuading "actors" to participate in the evaluation
- Bridging cultural divides
- Logistical issues related to data collection (e.g., arranging observations)
- Obtaining parental consent for the Iowa Test of Basic Skills (ITBS)
District's Anticipated Challenges
- Translating evaluators' needs to school personnel
- Administering and scheduling time for the ITBS
- Ensuring continued school buy-in despite changes in leadership and district-wide objectives
- Balancing the needs of schools, IHE actors, and evaluators in the context of competing demands
- Communicating "up" the chain of command how operations will look at the school level
Possible Reasons for Strong Partnership
- Consulted the PI on all data-collection scheduling
- Carefully nurtured connections ("three cups of tea"; "do no harm")
- PI characteristics: 25 years of teaching in the district
- Included district R&E staff on monthly calls
Possible Reasons for Strong Partnership (continued)
- Respected the skills and knowledge of district actors
- Relied on local evaluators where possible
- Exercised "common" courtesy
- Convened sessions focused on mapping the program
- Ensured that evaluators and program staff had a shared language
Recommendations for Evaluators
- Model good "etiquette" for field staff
- Reinforce the importance of sensitivity to actors on the ground
- Schedule monthly calls with any actor charged with collecting or sending data
- Invite stakeholders from all authority levels to working sessions that result in better-articulated goals and objectives
Contact:
Kelly Feighan, Research for Better Schools
Rorie Harris, Memphis City Schools, Office of Research, Evaluation, and Assessment