Evaluation Capacity Building: Identifying and Addressing the Field's Needs
Learning Objectives
– Understanding of the field's current evaluation capacity
– Understanding of CNCS's strategy for building the field's evaluation capacity
Assessing Current Capacity
– Discussions with State Service Commissions and National Direct Grantees
– Reviews of AmeriCorps grantee evaluation plans and reports
– Focus group with AmeriCorps Program Officers
– Inventory of CNCS evaluation resources and tools
Discussions with the Field (respondents by grant size)

Respondent Type                            Large   Small   Total
National Direct Grantees – Continuing        4       2       6
National Direct Grantees – Re-compete        2       0       2
State Competitive Grantees – Continuing      2       3       5
State Competitive Grantees – Re-compete      1       0       1
State Service Commissions                                    5
Total                                                       19
Key Discussion Questions
– What materials, tools, and technical assistance do grantees need in order to fulfill their CNCS evaluation requirements?
– What is the grantees' current capacity to conduct evaluations?
– What is the State Commissions' current capacity to support and advise their sub-grantees on evaluation?
– What materials do the State Service Commissions need in order to set their own evaluation standards and assist their sub-grantees in fulfilling their evaluation requirements?
Findings from the Field
– Grantees who participated in this assessment have limited knowledge of basic evaluation methodology and concepts
– Both grantees and state service commission staff are unclear about CNCS evaluation requirements, specifically when products (e.g., evaluation plans and reports) are due
– Both grantees and state service commission staff need guidance to determine which evaluation designs will allow them to demonstrate project impact and remain in compliance with the requirements
Findings from the Field
– Grantees are not allocating sufficient funds to conduct experimental and quasi-experimental evaluations
– Grantees do not readily distinguish between performance measurement and program evaluation
– Grantees want access to resources other than their local evaluator for advice on their evaluations
Reviews of Evaluation Plans and Reports
– The CNCS Research and Evaluation office reviewed 33 small (less than $500,000) applicants to assess evaluation plans and reports
– The NORC team reviewed 23 large ($500,000 or more) applicants to assess evaluation plans and reports
Assessment Forms: Evaluation Plans & Reports
Description of the intervention/program
– Problem/issue statement
– Program impact theory
Evaluation questions and design
– Evaluation objectives and research questions
– Evaluation methodology/design
– Outcomes
Assessment Forms: Evaluation Plans & Reports
Data collection methodology and procedures
– Types and sources of data collected
– Population/sample
– Data analysis
Evaluation results and conclusions
– Intended use of evaluation results / evaluation results
– Conclusions and potential next steps (reports only)
– Limitations of the evaluation (reports only)
Assessment Findings: Small Grantees
– Most evaluation reports did not meet CNCS evaluation requirements as defined in the CFR
– Evaluation plans did not describe evaluations capable of determining program impact in accordance with the CFR
– Many grantees seem to equate performance measurement and monitoring with program evaluation
Assessment Findings: Small Grantees
Capacity for evaluation is promising:
– Most applications described implicit theories of change connecting community needs to program resources, activities, outputs, and outcomes
– Some applicants described program models informed by evidence such as prior program performance data, peer-reviewed research on similar practices or activities, and research on national models
Assessment Findings: Small Grantees
Evaluation capacity, continued:
– Many applicants described reasonable pre/post program outcomes and had quality data sources; some applicants also gathered data post-program participation
– All of the evaluation reports or summaries reviewed (with one exception) were process or formative evaluations; this is an important program practice to ensure quality programming
Assessment Findings: Large Grantees
– Only two of the six evaluation plans reviewed included sufficient detail about the evaluation approach to assess particular aspects of the design
– A majority of grantees are not designing and implementing evaluations that address questions about program impact in accordance with the CFR
Assessment Findings: Large Grantees
Capacity for evaluation is promising:
– A few grantees are moving in the right direction by conducting evaluations that gather pre and post data
– A few grantees are implementing experimental or quasi-experimental designs
Focus Group Discussion Themes
– What feedback have you received on the quality and usefulness of the CNCS evaluation resources and tools that are available to applicants and grantees?
– What questions do you typically receive from AmeriCorps grantees about their evaluations?
– Are there specific types of technical assistance that AmeriCorps grantees request on evaluation design and implementation?
– Are there additional materials, tools, or assistance that AmeriCorps grantees need in order to successfully design and execute evaluations of their projects?
– What training and support do you need to assist your grantees in fulfilling the evaluation requirements?
Focus Group Findings
– Both Program Officers and grantees expressed an interest in learning more about evaluation and evidence
– There is a need to increase awareness and understanding of CNCS's evaluation requirements
Inventory of CNCS Evaluation Resources and Tools
The CNCS evaluation resources and tools available to grantees via the Knowledge Network were reviewed and inventoried. The review followed a systematic method: mapping the contents of all evaluation material and developing an assessment form to standardize the review.
Inventory Findings
– Evaluation resources were not easy to access via the Knowledge Network website
– Many resources were dated and did not reflect the most current thinking in evaluation methods
– Informational gaps include guidance on:
  – evaluation planning
  – evaluation management
  – data analysis
  – use of existing data
Strengthening Capacity
Findings from these activities have informed our capacity-building strategy for the coming year.
Evaluation Capacity Building: Summer 2013
– Disseminate Evaluation FAQs
– Conduct webinars:
  – Evaluation Capacity Building
  – CNCS Evaluation Requirements and FAQs
– Create a central location for evaluation materials on the website
Evaluation Capacity Building: Fall 2013
– Grantee Symposium, Performance Measurement & Evaluation Track
  – Performance Measurement Pre-Conference Workshop
  – Developing evaluation plans and reports
  – Evaluation 101
– Initiate individualized technical assistance
  – 1:1 coaching to help finalize evaluation plans, use performance monitoring and evaluation for program improvement, and develop evaluation reports
Evaluation Capacity Building: 2013–2014
– Individualized technical assistance
– Evaluation Core Curriculum (webinars, online courses, downloadable resources)
  – Developing evaluation plans and reports
  – Evaluation 101
  – Logic models
  – How to report evaluation findings
  – How to manage an external evaluation
  – Budgeting for evaluation
Evaluation Capacity Building: Your Technical Assistance Team
– AmeriCorps Program Officers: primary point of contact
– CNCS Office of Research and Evaluation staff: partnered with Program Officers to provide evaluation expertise
– NORC team: partnered with Program Officers and research/evaluation staff to increase CNCS's evaluation technical assistance capacity
Q & A