Getting Results with Evidence-Based Programs: The Importance of Effective Implementation
Michelle Hughes, MA, MSW, Project Director, Benchmarks
Robin Testerman, Executive Director, Children’s Center of Surry and Yadkin
October 12, 2011

Today’s Presentation
Highlights and Clarifications from Webinar #1
The Importance of Implementation
– Defining the Implementation Gap
– Core Drivers for Implementation
– Stages of Implementation
Case Study: A Non-Profit’s Story of Moving Toward Evidence-Based Practice and Programs
Resources and Questions

Evidence-based programs… what’s all the fuss?

Observations from Webinar #1: The Changing Landscape in Which We Practice
Good intentions are not enough.
Increased focus on accountability in multiple sectors.
Stewardship of public/philanthropic dollars: limited resources need to be used strategically.
Increased scientific knowledge about effective practices/programs.
Policy and practice are changing across all levels of government, the philanthropic community, and fields of practice.

Observations continued… The Importance of Rigorous Evaluation
Different evaluation designs contribute different kinds of knowledge about interventions with children and families.
Randomized controlled trials are best at determining cause and effect (did the intervention produce the outcomes?).
Conventional wisdom is often wrong (e.g., hormone replacement therapy, the DARE program).

Observations continued… BUT evidence alone is not enough! Implementation, implementation, implementation… and IMPLEMENTATION!

Implementation MATTERS! Successfully replicating evidence-based programs requires:
Proven practice + fidelity/quality of implementation = better outcomes
Fidelity: adherence to the core elements that contribute to effectiveness.
A poorly implemented practice/program will yield poor outcomes (and result in a poor investment).

Assessing Evidence-Based Programs and Practices
EBP 5-point rating scale: High = 5; Medium = 3; Low = 1. Midpoints can be used and scored as a 2 or 4.
Rate each dimension High, Medium, or Low:
– Need in agency/setting: socially significant issues; parent and community perceptions of need; data indicating need.
– Fit: fit with current initiatives, state and local priorities, organizational structures, and community values.
– Resource Availability: IT, staffing, training, data systems, coaching and supervision, and the administrative and system supports needed.
– Evidence: outcomes (is it worth it?); fidelity data; cost-effectiveness data; number of studies; population similarities; diverse cultural groups; efficacy or effectiveness.
– Intervention Readiness for Replication: qualified purveyor; expert or TA available; mature sites to observe; number of replications; how well the intervention and Implementation Drivers are operationalized.
– Capacity to Implement: staff meet minimum qualifications; able to sustain Implementation Drivers financially and structurally; buy-in process operationalized (practitioners, families, agency).
Total Score: the sum of the six dimension ratings.
© National Implementation Research Network 2009. Adapted from work by Laurel J. Kiser, Michelle Zabel, Albert A. Zachik, and Joan Smith at the University of Maryland.

Observations continued… This is a learning process that should be undertaken collaboratively: changing systems and practice is a long-term (and challenging) endeavor, and everyone is working to figure this out!
Resources are available: websites, experts in NC, your peers in Smart Start, and your peers in other disciplines (public health, child welfare, etc.).

Changes in Smart Start Legislation
Use of evidence-based and evidence-informed programs and practices in early care and education, health, and family support.
– Programs vs. practices
– Thinking about EBPs in early care and education, health, and family support
– Don’t recreate the wheel: use each other!

Changes in Smart Start Legislation
Goal: What is your need, and why do you think the strategies/practices/programs you have chosen will produce the intended outcomes to address those needs? Resources to draw on:
– Other local partnerships
– Community-based agencies with whom you partner
– Child Care and Early Education Research Connections
– Smart Net > Program & Evaluation > Evidence-Based & Evidence-Informed Information > Resources

Let’s talk implementation… Many thanks to Melissa Van Dyke of the National Implementation Research Network (NIRN) at the Frank Porter Graham Child Development Institute, UNC-Chapel Hill, for allowing me to use these slides. NIRN is a FABULOUS resource for those interested in learning more about implementation; please see their website.

The Challenge Before Us… “It is one thing to say with the prophet Amos, ‘Let justice roll down like mighty waters’… and quite another to work out the irrigation system.” (William Sloane Coffin, social activist and clergyman)

How: The Implementation Gap
RESEARCH → IMPLEMENTATION → PRACTICE
Implementation is defined as a specified set of activities designed to put into practice an activity or program of known dimensions.

Why Focus on Implementation? The Implementation Gap
RESEARCH → IMPLEMENTATION → PRACTICE
“Children and families cannot benefit from interventions they do not experience.”

Science-to-Service Gap / Implementation Gap
– What is adopted is not used with fidelity and, therefore, does not achieve good outcomes.
– What is used with fidelity is not sustained for a useful period of time.
– What is used with fidelity is not used on a scale sufficient to impact social problems.

Implementation: a review and synthesis of the implementation research and evaluation literature (1970-2004): multi-disciplinary, multi-sector, multi-national.

Insufficient Methods
– Implementation by laws/compliance by itself does not work.
– Implementation by “following the money” by itself does not work.
– Implementation without changing supporting roles and functions does not work.
– Diffusion/dissemination of information by itself does not lead to successful implementation.
– Training alone, no matter how well done, does not lead to successful implementation.
(Fixsen, Naoom, Blase, Friedman, & Wallace, 2005)

What Works (© Dean L. Fixsen and Karen A. Blase, 2010)
INTERVENTION × IMPLEMENTATION → actual benefits:
– Effective intervention, effective implementation: actual benefits.
– Effective intervention, NOT effective implementation: inconsistent; not sustainable; poor outcomes.
– NOT effective intervention, effective implementation: poor outcomes; sometimes harmful.
– NOT effective intervention, NOT effective implementation: unpredictable or poor outcomes.
(Institute of Medicine, 2000, 2001, 2009; New Freedom Commission on Mental Health, 2003; National Commission on Excellence in Education, 1983; Department of Health and Human Services, 1999)
From Mark Lipsey’s 2009 meta-analytic overview of the primary factors that characterize effective juvenile offender interventions: “…in some analyses, the quality with which the intervention is implemented has been as strongly related to recidivism effects as the type of program, so much so that a well-implemented intervention of an inherently less efficacious type can outperform a more efficacious one that is poorly implemented.”

Implementation Drivers: common features of successful supports that help make full and effective use of a wide variety of innovations:
– Staff Competency
– Organizational Supports
– Leadership

Implementation Drivers (© Fixsen & Blase, 2008)
– Competency Drivers: Selection, Training, Coaching, Performance Assessment (fidelity measurement)
– Organization Drivers: Decision Support Data System, Facilitative Administration, Systems Intervention
– Leadership: Technical and Adaptive
The drivers are integrated and compensatory, working together toward improved outcomes for children and families.

Implementation Drivers: SELECTION
– Be clear about the skills required to implement the practice/program.
– Select for the “unteachables.”
– Provide situations in which candidates can demonstrate skills.
– Select for “coachability.”
– Allow for mutual selection.
– Improve the likelihood of retention after the “investment.”

Implementation Drivers: TRAINING
– Acquire knowledge of the program/practice “theory of change” (an understanding of why this works).
– Skills-based, adult learning.
– Practice and feedback.
– Much focus is spent on pre-service training, but on-going, intentional skills development is needed for professional growth and expertise.

Implementation Drivers: COACHING & SUPERVISION
– Often overlooked, but necessary to ensure implementation.
– Solidify newly emerging skills and competencies.
– An opportunity to present challenges and receive feedback and support.
– Ensure fidelity.
– Not a “luxury”: critical to good practice.

Training and Coaching OUTCOMES (Joyce and Showers, 2002)
Percentage of participants who demonstrate knowledge, demonstrate new skills in a training setting, and use new skills in the classroom, by cumulative training component:

Training component | Knowledge | Skill demonstration | Use in the classroom
Theory and discussion | 10% | 5% | 0%
+ Demonstration in training | 30% | 20% | 0%
+ Practice & feedback in training | 60% | 60% | 5%
+ Coaching in classroom | 95% | 95% | 95%

Implementation Drivers: FIDELITY/PERFORMANCE ASSESSMENT
– Measuring fidelity to the core components of the program.
– Are there already tools provided by the purveyor?
– Who administers fidelity tools? With what kind of training?
– How are tools administered, and when?
– Fidelity is linked to outcomes and is part of program improvement plans.

Implementation Drivers: DATA SYSTEMS
Measure outcomes AND fidelity. Why?
– Low fidelity and poor outcomes = an implementation issue.
– High fidelity and poor outcomes = an effectiveness issue.
Collect output data, outcomes, fidelity, staff performance assessment, and client satisfaction.

Implementation Drivers: FACILITATIVE ADMINISTRATION
– Helps ensure implementation actually occurs with fidelity and “with ease.”
– Align policies and procedures with the required elements of the program.
– Maintain policy/practice feedback loops within the agency.

Implementation Drivers: SYSTEMS INTERVENTION
– Aligning policy, practice, funding, and expectations of the program/agency within the larger system.
– Changing the system so that the program may be effectively implemented VERSUS changing the program to fit into the system.

Implementation is… an ongoing process, not a one-time event. It is an ongoing series of actions, decisions, assessments, and corrective actions to bring a program or practice to full implementation (with fidelity and quality, sustained by agency policy, funding, and environment).
Full implementation of a major program/initiative takes anywhere from 2 to 4 years.
READINESS, READINESS, READINESS, READINESS!

Assessing Capacity for Implementation
A readiness assessment allows you to analyze your capacity across the implementation drivers:
– Do we have the right staff to implement this program?
– Do we have capacity to provide effective coaching/supervision to improve program delivery? If not, can we get support?
– Do we have a data system to support decision-making? Do we know how to use it effectively?

Plug Into Existing Implementation Supports
– Supports internal to the agency itself (experience with program delivery, CQI systems, experience developing coaching/supervision capacity, etc.)
– The national purveyor (national office)
– State-level “intermediaries”:
  – Prevent Child Abuse North Carolina: Incredible Years Pre-School; Circle of Parents
  – Child Treatment Program: TF-CBT and other trauma treatments for children
  – NC Division of Public Health: Home Visiting Infrastructure

Keeping Your Eye on the Drivers…
– Some local partnerships deliver practices and programs themselves.
– Other local partnerships contract with local community-based agencies to deliver the practices and programs.
Someone needs to “own” the drivers. How can you assure good program implementation?
– Data systems: outcomes, fidelity
– Establishing strong contractual relationships, open communication, transparency

A Key to Successful Implementation
“The way in which a change process is conceptualized is far more fateful for success or failure than the content one seeks to implement. You can have the most creative, compellingly valid, productive idea in the world, but whether it can become embedded and sustained in a socially complex setting will be primarily a function of how you conceptualize the implementation-change process.” (Sarason, 1996, p. 78)

On the Ground: One Agency’s Story of Its Work with Evidence-Based Programs
Children’s Center of Surry & Yadkin

How We Started
– Child Maltreatment Prevention Training Institute in 2006 with Prevent Child Abuse NC.
– Moved from 100% intervention and treatment to adding prevention (2005), reaching 75% intervention and prevention and 25% treatment.
– Data and outcomes: we used to focus on the number served, not the outcomes; now we are outcome-driven.

What We Learned
It takes time. Assess, assess, assess:
– Community’s needs
– Organizational readiness
– Infrastructure
– Staff
– Board
– Referring agencies
– Families

Building a Strong Foundation for Success
– Technical support
– Coaching
– Supervision of staff and program
– Continuous Quality Improvement: Program Committee and COA (accreditation)
– Has to become a part of your everyday conversation with staff, board, and community partners
– Policies and procedures

What We Hope Our Community Has Learned
– Think big! It’s more than “how many people did you serve?”
– Ask the hard questions.
– Follow the outcomes.
– Do your homework: know your EBPs, visit programs.
– Provide funding for technical support and coaching from an outside source.

To Summarize…
Getting good outcomes for kids and families is a two-part formula: a proven program plus quality implementation.
Building your capacity to effectively implement a specific EBP will build your organization’s capacity to be effective in multiple areas (the learning translates).
An important component of program selection is assessing your agency’s capacity to implement: go through an analysis of the implementation drivers and how well they are operationalized.

To Summarize…
Implementation is a process of growth and development, not a one-time event: a continuous learning cycle.
Implementation takes time. Start with one or two programs, not all at once.
Focusing on quality implementation of evidence-based programs can positively transform your agency, your practice, and your relationships with your community, funders, and stakeholders.

NCPC Resources
Program Staff: Ann Ward; Cynthia Turner; Gale Wilson; Lois Slade
Evaluation Staff: Leigh Poole; Meshell Reynolds (Counts & reporting)
Smart Net Resources: Program & Evaluation tab > folders for Evidence-Based/Informed programs, and for Family Support, Health, & Early Care & Education

Presenters’ Contact Information
Michelle Hughes, MA, MSW, Project Director, Benchmarks, (919)
Robin Testerman, Executive Director, Children’s Center of Surry and Yadkin