Evaluation Plans & Performance Indicators
Office of Research, Evaluation, and Policy Studies
Marcella M. Reca Zipp
November 30, 2010
Necessity of an evaluation plan
Types of evaluation plans
Components of an evaluation plan
Performance indicators
Reporting requirements
Sample evaluation plans
Purpose of an Evaluation Plan
Provides a cohesive approach to conducting the evaluation and using its results
Explains the what, when, how, why, and who
Documents the evaluation process
Ensures implementation fidelity
Promotes a participatory approach
Source: University of Toronto
3 Levels of Evaluation
Project-Level Evaluation (context, implementation, outcome)
Cluster Evaluation
Program and Policymaking Evaluation
Project-Level Evaluation
Context: needs, assets, and resources of the community; political atmosphere; social and environmental strengths/weaknesses
Implementation: critical components/activities of the project; aspects that are strengths and weaknesses; how the components connect to goals and outcomes
Outcome: critical outcomes you are trying to achieve; impact on clients, community, etc.; unexpected impacts
Cluster Evaluation
Determines how well the collection of projects fulfills the objective of systemic change.
Not a substitute for project-level evaluation.
Looks across a group of projects to identify common themes.
Information is reported in aggregate form to the granting organization.
[Diagram: the evaluator looks across Project A, Project B, and Project C]
Program and Policymaking Evaluation
Macro form of evaluation.
Utilizes information gathered from both project-level and cluster evaluation to make effective decisions about program funding and support.
Supports communities in creating policy change at the local, state, and federal levels.
Elements of an Evaluation Plan
Introduction
Project Objectives
Logic Model
Partnership Roles and Responsibilities
Intervention Programming
Research Methodology/Data Collection
Instrumentation Measures
Introduction
Provides background information for the evaluation, identifies its purpose and goals, and sets the course on the evaluation road map.
Evaluation purpose and goals: What does the evaluation strive to achieve?
Evaluation team: Who is the evaluation coordinator? Who are the members of the evaluation team?
Logic Model
Graphic depiction of the program description.
Links needs, objectives, activities, and measurements.
Provides the scope of the program.
Ensures systematic decisions are made about what will be measured.
Identifies and organizes indicators.
Objectives
PIMO method
The number of objectives is determined by the project's purpose (e.g., intervention, treatment, prevention).
Interrelated with your projected activities (e.g., education, service, research).
Feasible to collect and able to provide accurate results.
Partnership Roles and Responsibilities
Project partners are expected to contribute specific, unique expertise to your project activities, either in a direct service function or as advisory units.
Identify each partner's role and responsibility in terms of involvement in your project.
Partner-cited activities must be evaluated both formatively and summatively.
Intervention Programming
Identify one or more intervention strategies used to support project activities and anticipated outcomes.
Cite whether the program is on the federal evidence-based initiative (EBI) list.
For market-available programs that require training and certification of direct service providers, provide a timetable for acquiring training before the intervention can be used.
Performance Indicators
Visible, measurable signs of program performance.
Relevant, understandable, and useful.
Reflect program objectives, the logic model, and evaluation questions.
Define success: reasonable expectations of program performance.
Source: University of Toronto
Performance Indicators (cont.)
Other terms and industry jargon:
Key Performance Indicator (KPI)
Performance metric
Performance standard
Balanced Scorecard
Quality indicators
All are different words for the same thing: measuring performance.
Data Collection
What methods will be used?
How often will data be collected?
Who will collect the data?
Validity and reliability of data sources
Baseline data
Outcomes-based triangulation
Quality assurance
Design (experimental, quasi-experimental, etc.)
Instrumentation Measures
Tools for data collection
Only collect the information you need
Easy to administer and use
Pilot-test tools before using them in the evaluation
Human subjects considerations: IRB and school board approval
Data management and storage: confidentiality and data quality
Tips & Helpful Hints
Be realistic in your assessment of resources and in your timeline.
Seek help.
Use templates, tables, or guides that may be provided in the RFP, or model your plan on past funded proposals.
Reporting and Dissemination
Dissemination: How will you disseminate findings? Who is responsible? How, where, and when will findings be used?
Reporting: formative reports (quarterly or biannually); summative reports (final report at the end of the project); project deliverables
Sample RFP Evaluation Plans
Two examples of an evaluation plan within an RFP:
One with general, limited specifications
One that is complex and very detailed
Evaluation Resources
CDC: www.cdc.gov/eval
University of Toronto: www.utoronto.ca/shp/hcu
W.K. Kellogg Foundation: www.wkkf.org/Publications/evalhdbk
Connell, J.P., Kubisch, A.C., Schorr, L.B., & Weiss, C.H. (1995). New Approaches to Evaluating Community Initiatives. New York, NY: Aspen Institute.
Shadish, W.R., Cook, T.D., & Leviton, L.C. (1991). Foundations of Program Evaluation. Newbury Park, CA: Sage Publications.
Taylor-Powell, E., Steele, S., & Douglas, M. (1996). Planning a Program Evaluation. Madison, WI: University of Wisconsin Cooperative Extension.