Agenda: Evaluation Progress Update


0 Customize this slide with agency or workgroup information as needed.
This PowerPoint file is a template for the Evaluation Workgroup to use when reporting on its progress and findings. The local data partner is likely to be the presenter or a co-presenter of this update. A guided discussion follows, during which priorities and programs are reviewed and program-specific evaluation results are shared. The group ends the meeting by discussing next steps for reporting recommendations back to the programs to address identified issues. A 2.5-hour block of time should be scheduled for each meeting.

To customize this deck, the evaluation workgroup will need to fill in a substantial amount of information about local data collection efforts and the community board's prioritized outcomes, risk factors and protective factors, along with customized information about program implementation and outcome results. Work with your local data partner on all of these slides.

The notes section of most slides contains talking points, which can be adapted as appropriate. Italicized text is intended to be delivered as scripted, either to make important points or to help make smooth transitions. Walk through the presentation with your local data partner as part of your preparation.

Note: Since updates are provided to the community board regularly, it may not be necessary to use all the slides every time. Introductory slides might be needed only the first time or two. Evidence-Based Practice Group

1 Agenda: Evaluation Progress Update
Purpose: Provide the Community Board with an update on evaluation progress
- Context
- Review of outcomes, risk factors and protective factors
- Review of prioritized programs
- Key evaluation findings
- Next steps: address any identified implementation issues or gaps in the data

Welcome the members of your community board and do brief introductions for any new members. After introductions, review the agenda for the meeting. Share the location of restrooms and any other logistical information participants need. Post a parking lot on a flipchart sheet, and refer participants to it. Different communities use different risk- and protective-factor-focused surveys; this template assumes your community uses the Youth Experience Survey (YES). If your community uses SHARP, replace mentions of YES with SHARP throughout this presentation.

As we are getting started on Evidence2Success, I know there are a variety of questions about aspects of the work we are putting in place. With our time today, we want to stay focused on sharing evaluation progress and key findings. If a topic is raised that concerns broader questions about the Evidence2Success work as a whole and it feels like we are losing our focus, I will ask that we put those issues on the parking lot. At the end of the meeting, we will come back to them and determine how to address them.

2 Evaluation: An Integral Part of Evidence2Success
1. Get Started: Engage city, public system, school, neighborhood and community leaders; designate a coordinator; engage community board; conduct Youth Experience Survey
2. Get Organized: Build all partners' capacity to engage in joint decision making; orient community board; educate and inform community and systems; begin fund mapping; expand community board
3. Develop Community Profile: Produce a "big picture" view of child well-being; work to interpret data; develop broad strategy focused on priority outcomes, risk factors and protective factors; identify opportunities to shift funding
4. Create a Plan: Select tested, effective programs to advance broad strategy; develop short- and long-term action plans; develop financing strategies to support tested, effective programs; share action plan; identify service providers
5. Implement and Evaluate: Implement tested, effective programs and monitor performance; finalize plans and train providers; engage the community in recruitment; celebrate successes; track progress and make changes as needed

Handout: Roadmap

We are now in Phase 5. Point out the highlights that have been completed in Phases 1-4. We are now in the final phase of our roadmap, where we are implementing and evaluating tested, effective programs. This is an exciting phase of the work because we can begin to track the progress we've made, make changes when needed, and celebrate our successes in the community. Give participants a few minutes to read the rest of the handout. Presentation to the Community Board

3 Evidence2Success Implementation Progress Update
Outcome-Focused Planning

Here's another way to look at where we are in the Evidence2Success process. We've done our outcome-focused planning for community-level outcomes (point to the community-level outcomes part of the graphic). Then we selected programs and policies to implement in order to achieve those outcomes. We drafted participant outcome goals and implementation goals for implementing those programs and policies in a way that will get the impact we want for our children, youth and families. Then we evaluated those programs to see whether they measured up to our implementation and outcome goals. We will be sharing those results with you today.

Graphic from Communities That Care PLUS. Presentation to the Community Board

4 Evaluation Workgroup Charge
Our mandate is to oversee the monitoring and evaluation of the programs being implemented under Evidence2Success:
- Ensure implementation goals are met
- Measure improvement in participant outcomes
- Measure change in community-level risk, protection and outcomes
- Update the community action plan, as needed

As mentioned, today we are here to focus on one piece of the Evidence2Success work: the evaluation of selected evidence-based practices and programs. The mandate of the Evaluation Workgroup is to ensure implementation goals are met; measure improvement in participant outcomes; measure change in community-level risk, protection and outcomes; and update the community action plan, as needed. Presentation to the Community Board

5 Workgroup Members
Add the workgroup members' names to this slide. In the presentation, introduce the members of the workgroup, especially any new members. Also introduce the local data partner and recap the data partner's role. Presentation to the Community Board

6 Work of the Data Partner
- Consult on data resources and provide subject matter expertise: useful resources (local, state, federal) and gaps
- Manage community-level data collection (YES/ChES/SHARP) and track Evidence2Success progress
- Collect program data for monitoring and evaluation
- Warehouse collected data
- Communicate with stakeholders
- Co-lead the evaluation workgroup and build capacity
- Assist the finance workgroup with sustainability

The evaluation workgroup is assisted by the local data partner, who is a co-leader and active participant of this workgroup. (Introduce him or her now.) This is a high-level summary of the major buckets of work for the data partner; it is broken out in greater detail in the one-page factsheet and in the document Local Data Partner Role and Responsibilities. In the second bullet, list only the survey(s) in use in your community.

7 Evaluation Timeline and Progress
Task                                            Date       Progress
Create a plan to monitor Action Plan programs   XX/XX/XX   Completed
Provide plans to appropriate workgroup(s)
Complete evaluation schedule
Establish partnership with data partner
Collect pre- and post-test data
Re-administer Youth Experience Survey                      In progress

Provide a brief overview of the workgroup's timeline and progress to date. Customize this slide as needed (including the name of the survey in the last row).

8 YES DATA AND SHARED PRIORITIES
Customize the slide text as appropriate: if your community uses SHARP, substitute SHARP for YES; if your community uses both YES and ChES (the Evidence2Success Childhood Experience Survey), you can use both. Make sure it is clear that the data from these surveys are community-level data, whose purpose is to provide a community-level assessment of health and behavior outcomes, risk factors and protective factors. These were used to identify shared priorities in Phase 3.

9 Where to Look for a Local Data Partner
- Individual researcher from a nearby university's social or health science departments (social work, psychology, sociology, education, health promotion or education, public health, epidemiology, etc.)
- Research center at a local/regional university (e.g., the Institute for Innovation and Implementation at the University of Maryland School of Social Work)
- Independent evaluation studies consultant
- Private evaluation research firm with expertise in community-based research

It is possible to share duties between two or more data partners. The list of responsibilities (Data Partner Role and Responsibilities) is lengthy and very comprehensive, and one person might not have all the skills needed.

10 Sharing Proposed Priorities
Identified Risk and Protective Factors and Outcomes
Outcomes:
Risk Factors:
Protective Factors:

Recall that our school district administered the Youth Experience Survey [or SHARP] to [identify the specific population that completed the survey in your locality], which enabled us to better understand how children and youth in our community are faring on well-being outcomes. In addition to these outcomes, the survey measures risk factors, which are predictors of problem outcomes, along with protective factors, which buffer against risk factors. The risk and protective factors and outcomes shown here were identified as the key focus areas to be addressed by the selected tested, effective programs.

Highlight key findings from the community-level outcomes and risk and protective factor data in this slide. There are likely slides that were created and shared as part of the community board priority-setting process that can be included here. Presentation to the Community Board

11 Evidence2Success Implementation Progress Update
Example Shared Priorities and Programs Identified by the Community Board

Elevated Risk Factors:
- Low Perceived Risk of Drug Use
- Antisocial Peers
- Favorable Attitudes Toward Antisocial Behavior
- [Add identified priorities here]
- [Add identified priorities here]

Tested, Effective Programs:*
- Strengthening Families
- [ADD Selected Program HERE]

Insert in the table, and review, the priorities and programs the community board identified, with a brief explanation of why each was chosen based on the community-level survey findings. The community board in Evidence2Success used the data to select priorities and a portfolio of tested, effective programs that the partner agencies committed to funding and implementing collaboratively. Evidence2Success used the Blueprints database to select proven programs that best meet children's strengths and needs, as well as the community's unique combination of risk and protective factors.

* Random-assignment evaluation or multiple comparison-group evaluations. Presentation to the Community Board

12 Example Data Sources The evaluation workgroup collects data on:
- Program fidelity (i.e., adherence, dosage, quality of delivery and participant involvement), using program-specific checklists, observation forms, registration records, participant attendance logs and training logs
- Participant outcomes (i.e., knowledge, attitudes, skills, behaviors), using pre-post surveys and administrative data
- Community-level outcomes, using data from the Youth Experience Survey (YES), along with national and state comparison data, as well as administrative data

Bring sample documents to give the community board a glimpse of some of the data sources.

Four types of data are used to assess program fidelity. Program-specific checklists are typically developed by program purveyors. Observation forms are completed by volunteers and staff and, like program-specific checklists, allow for the monitoring of the four fidelity factors. Participant attendance and registration records allow us to monitor saturation and participant involvement. Finally, training logs are another way to assess adherence to the program model.

Program participants complete surveys before and after the program, and responses are used to assess changes in outcomes. In some cases, administrative data (e.g., grades) can be linked to individuals and may serve as another form of outcome data.

Three types of data are used to assess the community. The main source is the community-level survey; survey reports provide data on outcomes, risk factors and protective factors for the children in the school district as a whole. (Hold up a copy of the data report. Tell them you will give highlights of the reports as you present the workgroup's priorities.) It can also be useful to compare our kids' outcomes data to national or state samples. Finally, administrative data can be used as a source for outcomes, risk factors and protective factors that are difficult for surveys to measure, and these data are also a good source of information of interest to participating public systems.

The data partner provides support to the evaluation workgroup for all of these functions. Ask if there are questions about any of these data sources. Also be prepared for questions about who the reports are shared with.

13 KEY EVALUATION FINDINGS

14 Example Program Overview: Strengthening Families
Strengthening Families is a family program designed for parents and children.
Program aims:
- Strengthen family relationships
- Improve parenting skills
- Increase youth's social and life skills
- Reduce risk factors for delinquency and substance abuse

The following set of slides (15-20) provides an example of what you might present for each of your selected programs: the program overview and aims, implementation and outcome findings, a summary of findings, issues identified, and next steps. We recommend that you work with your data partner to create these slides.

15 Example Implementation Fidelity Results
COMPONENT          GOAL                                         RESULTS                                  ACHIEVED GOAL
ADHERENCE (how much of the content was covered and major modifications)
  Content Covered  70%-100% of the objectives covered           96%
  Modifications    No or few major modifications to the model   Average of 2.3 modifications per cycle
DOSAGE (the number, length and frequency of sessions)
  Sessions Held    Seven two-hour weekly sessions

Create one of these slides for each tested, effective program you are implementing. Name the program in the slide's title bar.

16 Example Implementation Fidelity Results
COMPONENT       GOAL                                                   RESULTS                    ACHIEVED GOAL
PARTICIPANT INVOLVEMENT (attendance and saturation)
  Attendance    70%-100% of families attend at least 4 of 7 sessions   79% (85 of 107 families)
  Saturation    100 families                                           107 families
QUALITY OF DELIVERY (location and special circumstances, etc.)
  Room Set-up   Good set-up (3.0 or higher on a 4-point scale)         3.5 in overall set-up

Create one of these slides for each tested, effective program you are implementing. Name the program in the slide's title bar.
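As a side note for the data partner preparing these fidelity slides, the attendance and saturation arithmetic (79% = 85 of 107 families against the 70% goal) can be double-checked with a short script. This is an illustrative sketch only: the function names and the 70% threshold as a default parameter are our assumptions, not part of the Evidence2Success materials.

```python
# Sketch: verify participant-involvement fidelity figures.
# All names and the default goal are hypothetical conveniences.

def attendance_rate(n_met_threshold: int, n_enrolled: int) -> float:
    """Share of enrolled families meeting the attendance threshold
    (e.g., attended at least 4 of 7 sessions)."""
    return n_met_threshold / n_enrolled

def meets_goal(rate: float, goal: float = 0.70) -> bool:
    """True if the observed rate reaches the fidelity goal."""
    return rate >= goal

# Figures from the example slide: 85 of 107 families met the threshold.
rate = attendance_rate(85, 107)
print(f"Attendance: {rate:.0%}, goal met: {meets_goal(rate)}")
# Saturation is a simple count comparison: 107 families served vs. a goal of 100.
print(f"Saturation goal met: {107 >= 100}")
```

Running this reproduces the slide's 79% figure and confirms both goals were met.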

17 Example Participant Outcome Findings
GOAL                                 PRE-SURVEY MEAN SCORE   POST-SURVEY MEAN SCORE   ACHIEVED GOAL
Improve peer pressure skills         3.05                    3.36
Increase stress management skills    2.99                    3.24
Improve relationships with parents   3.38                    3.53
Improve family communication         2.78

Create one of these slides for each tested, effective program you are implementing. Name the program in the slide's title bar.
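For the data partner preparing a slide like this, a minimal sketch of a paired pre/post comparison is shown below, using only the Python standard library. The scores are made up for illustration; a real analysis would use each participant's matched pre- and post-survey responses (the slide reports only group means), and many teams would use a statistics package rather than this hand-rolled paired t statistic.

```python
# Sketch: paired pre/post comparison of participant outcome scores.
# The data here are hypothetical, not from any actual program cycle.
import math
import statistics

def paired_t(pre: list[float], post: list[float]) -> float:
    """Paired t statistic for the post - pre differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)  # sample standard deviation of differences
    return mean_d / (sd_d / math.sqrt(n))

# Hypothetical matched scores for a handful of participants on one goal.
pre  = [3.0, 2.8, 3.2, 3.1, 2.9, 3.0]
post = [3.4, 3.1, 3.5, 3.3, 3.2, 3.4]

mean_change = statistics.mean(post) - statistics.mean(pre)
print(f"mean change = {mean_change:.2f}, paired t = {paired_t(pre, post):.2f}")
```

The t statistic would then be compared against a t distribution with n - 1 degrees of freedom to judge significance, which is what the "significant changes" language on the summary slide refers to.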

18 Example Summary of Results
Results suggest that Strengthening Families was implemented with high fidelity:
- High adherence to the model
- High retention of families
- Surpassed the saturation goal
- High quality of delivery

YES findings indicate participants are experiencing significant changes in these areas:
- Peer pressure skills
- Stress management skills
- Relationships with parents
A significant change was not indicated for family communication.

Create one of these slides for each tested, effective program you are implementing. Name the program in the slide's title bar. If your community uses SHARP instead of YES, replace YES with SHARP in the slide. Do the same with ChES (the Childhood Experience Survey) if you are referring to ChES data.

19 Issues Identified
A few implementation challenges arose:
- Shortage of time
- Occasional participant misbehavior
- Lack of participant response
- Problems with the location
- Some modifications to the curriculum across the cycles, such as activities skipped due to lack of time
Evaluation challenge:
- Only 62% of youth (72 out of 116) completed both a pre- and a post-test survey

20 Next Steps
Report recommendations back to the program to address identified issues:
- Allow more time for sessions, specifically the first and last, when participants are asked to complete surveys.
- Provide opportunities for participants to move around; give praise and offer incentives for following rules and participating.
- Create an environment where all participants feel comfortable communicating. Allow for alternative ways of participating, such as writing, speaking in small groups and talking to facilitators during breaks.
- Talk to the program coordinator about possible location changes for future programming.
- Invest time and resources in improving survey follow-up efforts.

21 QUESTIONS AND DISCUSSION
Evidence2Success Implementation Progress Update
Use this slide as a placeholder to field questions from meeting participants, and from the parking lot, if time permits. Presentation to the Community Board


