A Close Examination of Policy-Relevant Education Evaluations: Criteria for Judging Quality
Matthew Linick & Diane Fuselier-Thompson
Implicit vs. Explicit Criteria for Judgments of Program Quality

- Explicit criteria for judging program quality can be clearly discerned in the text: "A successful program will display the following characteristics..."
- Implicit criteria for judging program quality can only be inferred from the research questions: "We will measure various aspects of the program..."
Methodologies of Research Reports

Types of studies:
- Impact and outcome reports (14)
- Implementation reports (12)

Methodologies:
- Mixed methods: interview and survey (17)
- Qualitative: interview and/or focus group (6); observation (5)
- Quantitative: RCT (4); quasi-experiment (6); comparative statistical analysis (4)
Explicit Statements of Criteria for Judging Program Quality

- Explicit criteria appeared primarily in implementation and outcome evaluations.
- When reports included explicit criteria, program quality was judged against methodological standards: statistical significance in quantitative studies, and a logic model, often used as a rubric, in implementation studies.
- Most evaluation reports refrain from making actual judgments of program quality: authors tend to be uncritical of the evaluated program, and evaluations tend to report findings in lieu of making judgments.
Examples of Explicit Criteria Used During Program Evaluations

Implementation evaluation: "Ending Violence in Schools: A Study of Morton North High School"
- Logic model used as a rubric: evaluators constructed a logic model based on relevant research and used it to evaluate the implementation of the violence prevention approaches used by the school.

Impact evaluation: "Start Reading: Impact Study"
- Statistical significance used as the explicit criterion for judging the program: statistically detectable differences between treatment and control schools, estimated using a regression discontinuity design, in student reading achievement, classroom reading instructional practices, and student time engaged with print.
Explicit Statements of Criteria for Judging Program Quality

- Few reports make explicit statements of criteria: 9 of 31.
- Explicit criteria are stated more often when the program is deemed successful: 6 of the 9 reports with explicit statements found the program to be successful.
Implicit Statements of Criteria for Judging Program Quality

Frequently provided as a basis for judging program quality:
- Statistical significance was often set as the goal of a research model attempting to estimate the positive impact of a program.
- Research questions were used to establish the goal of the study, but the questions often did not contain criteria for making judgments.
- Program goals were often referenced as the desired outcomes of the stakeholders or clients, but evaluators usually avoided making such statements themselves.
Example of Implicit Criteria Used During a Program Evaluation

Outcomes evaluation: "Extended School Day Program"
- Evaluators framed the evaluation questions as research questions:
  - What are the outcomes for students, teachers, and schools in this program?
  - What were the effects on test scores, attendance, teacher attitudes, etc.?
Implicit Statements of Criteria for Judging Program Quality

Implicit criteria for program quality appeared in each of the 31 reviewed reports:
- Not easily discernible: found in the discussion of results in 20 of 31 reports.
- Implied criteria tied to stakeholder expectations: 25 of 31 reports.

Many of the reports imply that stakeholder expectations are a guiding principle for program "quality."
Implicit Statements of Criteria in Reports

Examining the program, the quality criteria, and the methodology together shows a chain: implicit criteria reflect stakeholders' desired outcomes; desired outcomes influence methodological choices; and methodological choices influence the criteria used to judge program quality.

Stakeholder Expectations -> Methodology -> Program Quality
How Do Statements of Criteria Relate to the Methodology Used?

Methodology: Interview / focus groups
- Criteria ("For this program to be successful..."): Interviews and discussions with program recipients will demonstrate that these stakeholders have benefitted from the program in a certain manner.
- Stakeholder expectation: Program recipients experience the intervention in a certain way.

Methodology: Mixed methods (interview / survey)
- Criteria: The survey and interview data will show that the implementation of the program is congruent with the established goals of the program design.
- Stakeholder expectation: Large-scale implementation of the intervention occurs in a certain manner.

Methodology: RCT
- Criteria: The program will have a positive effect of at least a minimum size on the desired outcome.
- Stakeholder expectation: The program will have an effect on a certain outcome.
Questions, Comments, or Praise?

Contact information:
- Matt Linick, mlinic1@illinois.edu
- Diane R. Fuselier-Thompson, diat@illinois.edu