1
Getting Results with Evidence-Based Programs: The Importance of Effective Implementation
Michelle Hughes, MA, MSW, Project Director, Benchmarks
Robin Testerman, Executive Director, Children's Center of Surry and Yadkin
October 12, 2011
2
Today's Presentation
Highlights and Clarifications from Webinar #1
The Importance of Implementation
– Defining the Implementation Gap
– Core Drivers for Implementation
– Stages of Implementation
Case Study: A Non-Profit's Story of Moving Toward Evidence-Based Practice and Programs
Resources and Questions
3
Evidence-based programs…. What’s all the fuss?
4
Observations from Webinar #1
The Changing Landscape in Which We Practice…
– Good intentions are not enough.
– Increased focus on accountability in multiple sectors.
– Stewardship of public/philanthropic dollars: limited resources need to be used strategically.
– Increased scientific knowledge about effective practices/programs.
– Policy and practice are changing across all levels of government, the philanthropic community, and fields.
5
Observations continued…
The importance of rigorous evaluation
– Different evaluation designs contribute different kinds of knowledge about interventions with children and families.
– Randomized controlled trials are best at determining cause and effect (did the intervention produce the outcomes?).
– Conventional wisdom is often wrong (e.g., hormone replacement therapy, the DARE program).
6
Observations continued…
BUT evidence alone is not enough!
Implementation, implementation, implementation… and IMPLEMENTATION!
7
Implementation MATTERS!
Successfully replicating evidence-based programs requires:
Proven practice + fidelity/quality implementation = Better outcomes
Fidelity: adherence to the core elements that contribute to effectiveness.
A poorly implemented practice/program will yield poor outcomes (and result in a poor investment).
8
Assessing Evidence-Based Programs and Practices
Rate each factor High, Medium, or Low on a 5-point scale: High = 5, Medium = 3, Low = 1; midpoints can be used and scored as a 2 or 4.
– Need (in agency/setting): socially significant issues; parent and community perceptions of need; data indicating need
– Fit: fit with current initiatives, state and local priorities, organizational structures, and community values
– Resource availability: IT, staffing, training, data systems, coaching and supervision, and the administrative and system supports needed
– Evidence: outcomes (is it worth it?); fidelity data; cost-effectiveness data; number of studies; population similarities; diverse cultural groups; efficacy or effectiveness evidence
– Readiness for replication: qualified purveyor; expert or TA available; mature sites to observe; number of replications; how well is the intervention operationalized? are the implementation drivers operationalized?
– Capacity to implement: staff meet minimum qualifications; able to sustain the implementation drivers financially and structurally; buy-in process operationalized (practitioners, families, agency)
Total score: the sum of the six factor ratings.
© National Implementation Research Network 2009. Adapted from work by Laurel J. Kiser, Michelle Zabel, Albert A. Zachik, and Joan Smith at the University of Maryland.
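The scoring itself is simple arithmetic: one rating from 1 to 5 per factor, summed across the six factors. Here is a minimal sketch of that calculation in Python; the factor names mirror the tool above, but the example ratings are hypothetical, and in practice the ratings come from a planning team's discussion, not from software.

```python
# Minimal sketch of the 5-point rating scale above.
# The ratings below are hypothetical examples, not scores from a real assessment.

FACTORS = [
    "Need", "Fit", "Resource Availability",
    "Evidence", "Readiness for Replication", "Capacity to Implement",
]

def total_score(ratings: dict) -> int:
    """Sum one rating per factor: High = 5, Medium = 3, Low = 1 (2 and 4 are midpoints)."""
    for factor in FACTORS:
        if not 1 <= ratings[factor] <= 5:
            raise ValueError(f"{factor}: rating must be between 1 and 5")
    return sum(ratings[factor] for factor in FACTORS)

# Hypothetical program: strong local need and fit, but a thin evidence base.
example = {
    "Need": 5, "Fit": 4, "Resource Availability": 3,
    "Evidence": 2, "Readiness for Replication": 3, "Capacity to Implement": 3,
}
print(total_score(example))  # prints 20 (out of a possible 30)
```

A higher total suggests a better match, but the discussion the team has while rating each factor matters at least as much as the final number.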
9
Observations continued…
This is a learning process that should be undertaken collaboratively; changing systems and practice is a long-term (and challenging) endeavor.
Everyone is working to figure this out!
Resources are available: websites, experts in NC, your peers in Smart Start, and your peers in other disciplines (public health, child welfare, etc.).
10
Changes in Smart Start Legislation
Use of evidence-based and evidence-informed programs and practices in early care and education, health, and family support.
– Programs vs. practices
– Thinking about EBPs in early care and education, health, and family support
– Don't reinvent the wheel: use each other!
11
Changes in Smart Start Legislation
Goal: What is your need, and why do you think the strategies/practices/programs you have chosen will produce the intended outcomes to address those needs?
– Other local partnerships
– Community-based agencies with whom you partner
– Child Care and Early Education Research Connections: www.childcareresearch.org
– Smart Net > Program & Evaluation > Evidence-Based & Evidence-Informed Information > Resources
12
Let's talk implementation…
Many thanks to Melissa Van Dyke of the National Implementation Research Network (NIRN) at Frank Porter Graham, UNC-Chapel Hill, for allowing me to use these slides.
NIRN is a FABULOUS resource for those interested in learning more about implementation. Please see their website at http://www.fpg.unc.edu/~nirn/
13
The Challenge Before Us…
"It is one thing to say with the prophet Amos, 'Let justice roll down like mighty waters' … and quite another to work out the irrigation system."
William Sloane Coffin, social activist and clergyman
14
How: The Implementation Gap
[Diagram: IMPLEMENTATION bridges the gap between RESEARCH and PRACTICE.]
Implementation is defined as a specified set of activities designed to put into practice an activity or program of known dimensions.
15
Why Focus on Implementation?
[Diagram: IMPLEMENTATION bridges the gap between RESEARCH and PRACTICE.]
"Children and families cannot benefit from interventions they do not experience."
16
The Science-to-Service Gap / Implementation Gap
– What is adopted is not used with fidelity and, therefore, does not achieve good outcomes.
– What is used with fidelity is not sustained for a useful period of time.
– What is used with fidelity is not used on a scale sufficient to impact social problems.
17
Implementation
A review and synthesis of the implementation research and evaluation literature (1970-2004):
– Multi-disciplinary
– Multi-sector
– Multi-national
18
Insufficient Methods
– Implementation by laws/compliance by itself does not work.
– Implementation by "following the money" by itself does not work.
– Implementation without changing supporting roles and functions does not work.
– Diffusion/dissemination of information by itself does not lead to successful implementation.
– Training alone, no matter how well done, does not lead to successful implementation.
(Fixsen, Naoom, Blase, Friedman, & Wallace, 2005)
19
What Works (© Dean L. Fixsen and Karen A. Blase, 2010)
Crossing intervention effectiveness with implementation effectiveness:
– Effective intervention, effective implementation: actual benefits
– Effective intervention, NOT effective implementation: inconsistent; not sustainable; poor outcomes
– NOT effective intervention, effective implementation: unpredictable or poor outcomes
– NOT effective intervention, NOT effective implementation: poor outcomes; sometimes harmful
(Institute of Medicine, 2000, 2001, 2009; New Freedom Commission on Mental Health, 2003; National Commission on Excellence in Education, 1983; Department of Health and Human Services, 1999)
From Mark Lipsey's 2009 meta-analytic overview of the primary factors that characterize effective juvenile offender interventions: "... in some analyses, the quality with which the intervention is implemented has been as strongly related to recidivism effects as the type of program, so much so that a well-implemented intervention of an inherently less efficacious type can outperform a more efficacious one that is poorly implemented."
20
Implementation Drivers
Common features of successful supports that help make full and effective use of a wide variety of innovations:
– Staff Competency
– Organizational Supports
– Leadership
21
[Implementation Drivers diagram (© Fixsen & Blase, 2008): Competency Drivers (Selection, Training, Coaching, Performance Assessment/fidelity measurement) and Organization Drivers (Decision Support Data System, Facilitative Administration, Systems Intervention) are integrated and compensatory, guided by Leadership (technical and adaptive), and drive toward improved outcomes for children and families.]
22
SELECTION
– Be clear about the skills required to implement the practice/program
– Select for the "unteachables"
– Provide situations in which candidates can demonstrate skills
– Select for "coachability"
– Allow for mutual selection
– Improve the likelihood of retention after the "investment"
23
TRAINING
– Acquire knowledge of the program/practice "theory of change" (an understanding of why this works)
– Skills-based, adult learning
– Practice and feedback
– Much of the focus goes to pre-service training, but ongoing, intentional skills development is needed for professional growth and expertise
24
COACHING & SUPERVISION
– Often overlooked, but necessary to ensure implementation
– Solidifies newly emerging skills and competencies
– An opportunity to present challenges and receive feedback and support
– Ensures fidelity
– Not a "luxury": critical to good practice
25
Training and Coaching OUTCOMES
Percentage of participants who demonstrate knowledge, demonstrate new skills in a training setting, and use new skills in the classroom (Joyce and Showers, 2002):
– Theory and discussion: 10% knowledge, 5% skill demonstration, 0% use in the classroom
– + Demonstration in training: 30%, 20%, 0%
– + Practice and feedback in training: 60%, 60%, 5%
– + Coaching in the classroom: 95%, 95%, 95%
26
FIDELITY/PERFORMANCE ASSESSMENT
– Measure fidelity to the core components of the program
– Are there tools already provided by the purveyor?
– Who administers the fidelity tools, and with what kind of training?
– How and when are the tools administered?
– Fidelity is linked to outcomes and is part of program improvement plans
27
DATA SYSTEMS
Measure outcomes AND fidelity. Why?
– Low fidelity and poor outcomes = an implementation issue
– High fidelity and poor outcomes = an effectiveness issue
Track output data, outcomes, fidelity, staff performance assessment, and client satisfaction.
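The fidelity-versus-outcomes rule of thumb above can be read as a small decision table. Here is a minimal sketch, assuming simplified yes/no judgments of "fidelity acceptable" and "outcomes acceptable"; a real data system would apply program-specific thresholds to continuous measures.

```python
# Minimal sketch of the fidelity-versus-outcomes rule of thumb above.
# The boolean inputs are a hypothetical simplification of continuous measures.

def interpret(high_fidelity: bool, good_outcomes: bool) -> str:
    """Point to where a program should look first, per the rule on this slide."""
    if good_outcomes:
        return "On track: keep monitoring both fidelity and outcomes."
    if high_fidelity:
        # Delivered as designed but not working: question the intervention itself.
        return "Effectiveness issue: high fidelity, poor outcomes."
    # Not delivered as designed: fix implementation before judging the program.
    return "Implementation issue: low fidelity, poor outcomes."

print(interpret(high_fidelity=False, good_outcomes=False))  # implementation issue
print(interpret(high_fidelity=True, good_outcomes=False))   # effectiveness issue
```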
28
FACILITATIVE ADMINISTRATION
– Helps ensure implementation actually occurs with fidelity and "with ease"
– Align policies and procedures with the required elements of the program
– Build policy/practice feedback loops within the agency
29
SYSTEMS INTERVENTION
– Aligning policy, practice, funding, and the expectations of the program/agency within the larger system
– Changing the system so that the program may be effectively implemented VERSUS changing the program to fit into the system
30
[The Implementation Drivers diagram (© Fixsen & Blase, 2008) is repeated here; see slide 21.]
31
Implementation is…
An ongoing process, not a one-time event: a continuing series of actions, decisions, assessments, and corrective actions that bring a program or practice to full implementation (with fidelity and quality, sustained by agency policy, funding, and environment).
Full implementation of a major program/initiative takes anywhere from 2 to 4 years.
READINESS, READINESS, READINESS, READINESS!!
32
Assessing Capacity for Implementation
A readiness assessment allows you to analyze your capacity across the implementation drivers:
– Do we have the right staff to implement this program?
– Do we have the capacity to provide effective coaching/supervision to improve program delivery? If not, can we get support?
– Do we have a data system to support decision-making? Do we know how to use it effectively?
33
Plug Into Existing Implementation Supports
– Supports internal to the agency itself (agencies that have experience with program delivery, CQI systems, experience developing coaching/supervision capacity, etc.)
– The national purveyor (national office)
– State-level "intermediaries":
  – Prevent Child Abuse North Carolina: Incredible Years Pre-School; Circle of Parents
  – Child Treatment Program: TF-CBT and other trauma treatments for children
  – NC Division of Public Health: Home Visiting Infrastructure
34
Keeping Your Eye on the Drivers…
– Some local partnerships deliver practices and programs themselves.
– Other local partnerships contract with local community-based agencies to deliver the practices and programs.
– Either way, someone needs to "own" the drivers. How can you assure good program implementation?
– Data systems: outcomes and fidelity
– Strong contractual relationships, open communication, and transparency
35
A Key to Successful Implementation
"The way in which a change process is conceptualized is far more fateful for success or failure than the content one seeks to implement. You can have the most creative, compellingly valid, productive idea in the world, but whether it can become embedded and sustained in a socially complex setting will be primarily a function of how you conceptualize the implementation-change process." (Sarason, 1996, p. 78)
36
On the Ground One Agency’s Story of its Work with Evidence-Based Programs Children’s Center of Surry & Yadkin
37
How we started
– Child Maltreatment Prevention Training Institute in 2006 with Prevent Child Abuse NC
– Moved from 100% intervention and treatment, adding prevention in 2005, to 75% intervention and prevention and 25% treatment
– Data and outcomes: we used to focus on the number served, not the outcomes; now we are outcome-driven
38
What we learned
It takes time.
Assess, assess, assess:
– The community's needs
– Organizational readiness
– Infrastructure
– Staff
– Board
– Referring agencies
– Families
39
Building a Strong Foundation for Success
– Technical support
– Coaching
– Supervision of staff and program
– Continuous Quality Improvement: Program Committee and COA (accreditation)
– It has to become part of your everyday conversation with staff, board, and community partners
– Policies and procedures
40
What we hope our community has learned
– Think big! It's more than "how many people did you serve?"
– Ask the hard questions
– Follow the outcomes
– Do your homework: know your EBPs; visit programs
– Provide funding for technical support and coaching from an outside source
41
To Summarize…
Getting good outcomes for kids and families is a two-part formula: a proven program plus quality implementation.
– Building your capacity to effectively implement a specific EBP will build your organization's capacity to be effective in multiple areas (the learning translates).
– An important component of program selection is assessing your agency's capacity to implement: go through an analysis of the implementation drivers and how well each is operationalized.
42
To Summarize…
– Implementation is a process of growth and development, not a one-time event: a continuous learning cycle.
– Implementation takes time. Start with one or two programs, not all at once.
– Focusing on quality implementation of evidence-based programs can positively transform your agency, your practice, and your relationships with your community, funders, and stakeholders.
43
NCPC Resources
Program Staff
– Ann Ward, 919-821-9556, award@ncsmartstart.org
– Cynthia Turner, 919-821-9565, cturner@ncsmartstart.org
– Gale Wilson, 919-821-9563, gwilson@ncsmartstart.org
– Lois Slade, 919-821-9577, lslade@ncsmartstart.org
Evaluation Staff
– Leigh Poole, 919-821-9580, lpoole@ncsmartstart.org
– Meshell Reynolds (counts & reporting), 919-821-9567, mreynolds@ncsmartstart.org
Smart Net Resources
– Program & Evaluation tab > folders for Evidence-Based/Informed programs, and for Family Support, Health, & Early Care & Education
44
Presenters' Contact Information
Michelle Hughes, MA, MSW
Project Director, Benchmarks
mhughes@benchmarksnc.org, (919) 357-7361, www.benchmarksnc.org
Robin Testerman
Executive Director, Children's Center of Surry and Yadkin
robin@surrychildren.com, 336-386-9144, www.surrychildren.com