Presentation transcript:

Developing, Measuring, and Improving Program Fidelity: Achieving positive outcomes through high-fidelity implementation
SPDG National Conference, Washington, DC, March 5, 2013
Allison Metz, PhD, Associate Director, NIRN
Frank Porter Graham Child Development Institute, University of North Carolina

Goals for Today
– Define fidelity and its link to outcomes
– Identify strategies for developing fidelity measures
– Discuss fidelity within a stage-based context
– Describe the use of Implementation Drivers to promote high fidelity
– Provide a case example

“PROGRAM FIDELITY”
“The degree to which the program or practice is implemented ‘as intended’ by the program developers and researchers.”
“Fidelity measures detect the presence and strength of an intervention in practice.”

Definition of Fidelity: Context, Compliance, and Competence
Three components:
– Context: Structural aspects that encompass the framework for service delivery
– Compliance: The extent to which the practitioner uses the core program components
– Competence: Process aspects that encompass the level of skill shown by the practitioner and the “way in which the service is delivered”

Fidelity: Purpose and Importance
– Interpret outcomes: is this an implementation challenge or an intervention challenge?
– Detect variations in implementation
– Replicate consistently
– Ensure compliance and competence
– Develop and refine interventions in the context of practice
– Identify the “active ingredients” of the program

Formula for Success
Effective Interventions (a well-operationalized “What”) × Effective Implementation Methods × Enabling Contexts = Socially Significant Outcomes

Usable Intervention Criteria
– Clear description of the program
– Clear essential functions that define the program
– Operational definitions of essential functions (practice profiles; do, say)
– Practical performance assessment

Developing Fidelity Measures: Practice Profiles
Practice Profiles operationalize the work. They:
– Describe the essential functions that allow a model to be teachable, learnable, and doable in typical human service settings
– Promote consistency across practitioners at the level of actual service delivery
– Consist of measurable and/or observable, behaviorally based indicators for each essential function
Source: Gene Hall and Shirley Hord (2011), Implementing Change: Patterns, Principles, and Potholes (3rd edition)

Measuring Competency: Practice Profiles
For each Essential Function, the profile:
– Identifies “expected” activities
– Identifies “developmental” variation(s) in practice
– Identifies “unacceptable,” incompatible, or undesirable practices

Sample Template: Practice Profiles

Case Example: Differential Response (Implementation Science)
Functions: Engagement, Assessment, Partnership, Goal Planning, Implementation, Communication, Evaluation, Advocacy, Culturally Competent Service Delivery

Case Example: Differential Response Practice Profiles

Multiple Purposes for Implementation: Practice Profiles
If you know what “it” is, then:
– You know the practice to be implemented
– You can improve “it”
– Increased ability to effectively develop the Drivers
– Increased ability to replicate “it”
– More likely to deliver high-quality services
– Outcomes can be accurately interpreted
– Common language and deeper understanding

Stages: When are we ready to assess fidelity?
Practice profiles are part of stage-based work. During program development, practice profiles operationalize the intervention so that installation activities can be effective and fidelity can be measured during initial implementation.

Drivers: When are they developed?
In order to create the necessary conditions for creating practitioner competence and confidence, and for changing organizations and systems, we need to define our program and practice adequately so that we can install the Drivers necessary to promote consistent implementation of the specific activities associated with the essential functions of the new service(s).

Fidelity Measures: Practice Profiles
– Start with the Expected/Proficient column
– Develop an indicator for each Expected/Proficient activity
– Identify “evidence” that this activity has taken place
– Identify “evidence” that this activity has taken place with high quality
– Identify potential data source(s)

Performance Assessment: Practice Profiles (Engagement)
Indicator: Occurrence of visits (initial, assessment, goal planning)
Data source: Database
Evidence that engagement is happening: X visits in Y months; time spent in those visits
Evidence that engagement is happening WELL: Which engagement interventions were used during the visits? What was the outcome of the visits / what happened?
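To make the mapping from indicator to evidence concrete, here is a minimal sketch in Python of how such an indicator could be recorded alongside its two kinds of evidence and its data sources. The class and field names are hypothetical illustrations, not part of the original materials:

    from dataclasses import dataclass, field

    @dataclass
    class FidelityIndicator:
        # One indicator derived from an Expected/Proficient activity
        essential_function: str        # e.g., "Engagement"
        indicator: str                 # the activity being assessed
        evidence_occurred: str         # evidence the activity took place
        evidence_quality: str          # evidence it took place WELL
        data_sources: list[str] = field(default_factory=list)

    visits = FidelityIndicator(
        essential_function="Engagement",
        indicator="Occurrence of visits (initial, assessment, goal planning)",
        evidence_occurred="X visits in Y months; time spent in those visits",
        evidence_quality="Engagement interventions used; outcome of visits",
        data_sources=["Database"],
    )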

Establishing Fidelity: Practice Profiles
II. Engagement: The ongoing ability to establish and sustain a genuinely supportive relationship with the family while developing a partnership, establishing healthy boundaries, and maintaining contact as mutually negotiated; the ability to identify risk and protective factors, including the family’s positive supports, and to assess the best way to engage the family and/or who in the family to engage first.
A. Initial Engagement (activity and data source):
– SC explains the service, ensures the family’s understanding that the service is voluntary, and describes the scope of service (Engagement Survey, ES)
– SC clearly establishes the purpose of involvement with the family (ES)
– Family signs the service agreement (DB)
– Family signs consents (DB)
– Health status forms completed by SC with the family (DB)
– Where appropriate, SC alerts the Educational Advocate (EA) (Educational Advocate & DB)
– An engagement activity is conducted within 60 days of the service agreement, recording (a) who was there and (b) what activity (DB)
– SC demonstrates respect, genuineness, and empathy for all family members, as defined by the family (ES, Q:1, 3 & 18)

Establishing Fidelity: Practice Profiles
B. Ongoing Engagement (SC creates trust and buy-in):
– Provides services that clients view as relevant and helpful (ES, Q:3, 10, 12, 13 & 14)
– Listens carefully to parents/caregivers, obtains information, and begins to develop trust (ES, Q:1 & 25)
– SC is consistent, reliable, and honest with families (ES, Q:35)
– SC maintains contact as negotiated with the family (DB; add to DB)
– Respects the culture, racial, ethnic, linguistic, and religious/spiritual backgrounds, and sexual orientation of children, youth, and families, and uses these as protective factors (ES, Q:14)
– Uses Motivational Interviewing to elicit change (DB)
– Family-centered case planning and management: includes all family members in the process, reflected in the SP (DB/SP; SCS review, checkbox/tab)
– Uses family satisfaction measure feedback to inform work (SCS)

Fidelity Data Collection: 5 Steps (new or established criteria)
1. Assure fidelity assessors are available, understand the program or innovation, and are well versed in the education setting
2. Develop a schedule for conducting fidelity assessments
3. Assure adequate preparation for the teachers/practitioners being assessed
4. Report results of the fidelity assessment promptly
5. Enter results into a decision-support data system
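As one illustration of steps 4 and 5, a minimal sketch in Python of entering a fidelity result into a decision-support data system. The SQLite schema, the field names, and the 1.5 fidelity threshold are all hypothetical assumptions for the example, not details from the presentation:

    import sqlite3
    from datetime import date

    conn = sqlite3.connect("dsds.db")  # hypothetical decision-support data system
    conn.execute("""CREATE TABLE IF NOT EXISTS fidelity_assessments (
        practitioner TEXT, assessor TEXT, assessed_on TEXT,
        composite_score REAL, met_fidelity INTEGER)""")

    def record_assessment(practitioner, assessor, score, threshold=1.5):
        # Steps 4-5: report the result promptly and store it for decision support
        conn.execute(
            "INSERT INTO fidelity_assessments VALUES (?, ?, ?, ?, ?)",
            (practitioner, assessor, date.today().isoformat(),
             score, int(score >= threshold)))
        conn.commit()

    record_assessment("SC-001", "trained rater", 1.83)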

Implementation Supports Promote High Fidelity
Fidelity is an implementation outcome. How can we create an implementation infrastructure that supports high-fidelity implementation?

Implementation Drivers: integrated and compensatory, supporting IMPROVED OUTCOMES FOR CHILDREN AND FAMILIES
– Competency Drivers: Selection, Training, Coaching, Performance Assessment (fidelity)
– Organization Drivers: Decision Support Data System, Facilitative Administration, Systems Intervention
– Leadership Drivers: Technical, Adaptive

Building the Infrastructure: Practice Profiles
Differential Response essential functions: Engagement, Assessment, Partnership, Communication, Evaluation
Selection:
– What prerequisites (skills, values, knowledge) would practitioners ideally have when hired or redeployed to implement DR?
– What features of DR practice would be helpful to assess through behavioral rehearsals during the selection process?
– What aspects of DR practice would be important to include in the caseworker job descriptions?

Building the Infrastructure: Practice Profiles
Differential Response essential functions: Engagement, Assessment, Partnership, Communication, Evaluation
Facilitative Administration:
– Will new policies or procedures need to be developed by the State or County to support DR essential functions?
– What role does leadership need to play at State and County levels to reduce administrative barriers to DR practice?
– How can State leadership institute policy-practice feedback loops?

Function × Driver Example: Practice Profiles (Success Coach Example)
Engagement: The ongoing ability to establish and sustain a genuinely supportive relationship with the family while developing a partnership, establishing healthy boundaries, and maintaining contact as mutually negotiated; the ability to identify the family’s positive supports and assess the best way to engage the family and/or who in the family to engage first.
KSAs:
– Develop rapport and build relationships with family members
– Listen actively and openly to families’ perspectives, needs, and stories
– Complete life circles, genograms, eco maps, case mapping, and other engagement/assessment tools
– Use basic Motivational Interviewing techniques with families to overcome resistance
Training:
– Motivational Interviewing (MI)
– Family Preservation Training
– Cultural Competency Training

Improvement Cycles: Practice Profiles
Shewhart (1924); Deming & Juran (1948); Six Sigma (1990)
– Plan: decide what to do
– Do: do it (be sure)
– Study: look at the results
– Act: make adjustments
– Cycle: do this over and over again until the intended benefits are realized
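Read as pseudocode, the cycle is a simple loop. A minimal sketch in Python, where the four step functions and the benefit check are placeholders to be supplied by the improvement team, not part of the original model:

    def pdsa_loop(plan, do, study, act, benefits_realized, max_cycles=20):
        # Repeat Plan-Do-Study-Act until the intended benefits are realized
        for cycle in range(1, max_cycles + 1):
            decision = plan()            # Plan: decide what to do
            do(decision)                 # Do: do it (be sure)
            results = study()            # Study: look at the results
            act(results)                 # Act: make adjustments
            if benefits_realized(results):
                return cycle             # benefits achieved after this many cycles
        return None                      # not yet realized; rethink the plan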

Improvement Cycles: Practice Profiles
PDSA cycles apply to:
– Competency Drivers
– Organization Drivers
– Leadership
– Essential Functions of the Profile
– Data collection activities
Practice Profiles and the accompanying implementation supports will change multiple times during initial implementation.

Program Improvement: Using Fidelity Data
Program review process:
– Process and outcome data
– Detection systems for barriers
– Communication protocols
Questions to ask:
– What formal and informal data have we reviewed?
– What are the data telling us?
– What barriers have we encountered?
– Would improving the functioning of any Implementation Driver help address the barrier?

Case Example: Results from the Child Wellbeing Project
The case management model involved intense program development of core intervention components and accompanying implementation drivers: a clinical case management and home visiting model for families post-care.
Implementation driver scores at T1:
– Selection: 1.44
– Training: 1.33
– Coaching: 1.27
– Perf. Assessment: 0.78
– DSDS: 0.18
– Fac. Administration: 1.38
– Systems Intervention: 1.29
– Average Composite Score: 1.1
– Fidelity (% of cases): 18%
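The reported Average Composite Score is consistent with a simple mean of the seven driver scores, as this quick check in Python shows (the averaging method is an inference from the numbers, not stated in the slides):

    # T1 driver scores from the table above
    t1_scores = {
        "Selection": 1.44, "Training": 1.33, "Coaching": 1.27,
        "Perf. Assessment": 0.78, "DSDS": 0.18,
        "Fac. Administration": 1.38, "Systems Intervention": 1.29,
    }
    composite = sum(t1_scores.values()) / len(t1_scores)
    print(round(composite, 1))  # 1.1, matching the reported composite score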

Case Example: Using Data to Improve Fidelity
How did Implementation Teams improve fidelity?
– Intentional action planning based on implementation drivers assessment data and program data
– Improved coaching, administrative support, and use of data to drive decision-making; adapted the model
– Diagnosed adaptive challenges, engaged stakeholders, inspired change

Case Example: Results from the Child Wellbeing Project
The Success Coach model involved intense program development of core intervention components and accompanying implementation drivers.
Implementation driver scores improved by T3 (* as in the source table):
– Selection: 1.89*
– Training: 1.10
– Coaching: 1.83*
– Perf. Assessment: *
– DSDS: *
– Fac. Administration: 2.0*
– Systems Intervention: 2.0*
– Average Composite Score: 1.83*
– Fidelity (% of cases): 18% at T1 → 83% at T3

Did High-Fidelity Implementation Lead to Improved Outcomes?
Early outcomes include:
– Stabilized families
– Prevented re-entry of children into out-of-home placements

Fidelity Data Collection: Methods, Resources, and Feasibility
If fidelity criteria are already developed:
1. Understand the reliability and validity of the instruments
   a. Are we measuring what we thought we were?
   b. Is fidelity predictive of outcomes?
   c. Does the fidelity assessment discriminate between programs?
2. Work with program developers or purveyors to understand the detailed protocols for data collection
   a. Who collects the data (expert raters, teachers)?
   b. How often are data collected?
   c. How are data scored and analyzed?
3. Understand the issues (reliability, feasibility, cost) in collecting different kinds of fidelity data
   a. Process data vs. structural data
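For item 1, a common starting point on reliability is inter-rater agreement between fidelity assessors. A minimal sketch in Python computing percent agreement and Cohen's kappa; the ratings below are made up for the example, not data from the project:

    from collections import Counter

    def percent_agreement(r1, r2):
        # Share of items on which the two raters gave the same score
        return sum(a == b for a, b in zip(r1, r2)) / len(r1)

    def cohens_kappa(r1, r2):
        # Agreement corrected for the agreement expected by chance
        n = len(r1)
        observed = percent_agreement(r1, r2)
        c1, c2 = Counter(r1), Counter(r2)
        expected = sum((c1[k] / n) * (c2[k] / n) for k in set(r1) | set(r2))
        return (observed - expected) / (1 - expected)

    # Illustrative ratings on ten items (1 = component present, 0 = absent)
    rater_a = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]
    rater_b = [1, 1, 0, 1, 0, 0, 1, 1, 1, 1]
    print(percent_agreement(rater_a, rater_b))       # 0.8
    print(round(cohens_kappa(rater_a, rater_b), 2))  # 0.52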

Summary: Program Fidelity
– Fidelity has multiple facets and is critical to achieving outcomes
– Fully operationalized programs are prerequisites for developing fidelity criteria
– Valid and reliable fidelity data need to be collected carefully, with guidance from program developers or purveyors
– Fidelity is an implementation outcome; effective use of Implementation Drivers can increase our chances of high-fidelity implementation
– Fidelity data can and should be used for program improvement

Resources: Program Fidelity
Examples of fidelity instruments:
– Teaching Pyramid Observation Tool for Preschool Classrooms (TPOT), Research Edition, by Mary Louise Hemmeter and Lise Fox
– The PBIS fidelity measure (the SET), described at IS_ResourceID=222
Articles:
– Sanetti, L., & Kratochwill, T. (2009). Toward developing a science of treatment integrity: Introduction to the special series. School Psychology Review, 38(4), 445–459.
– Mowbray, C.T., Holter, M.C., Teague, G.B., & Bybee, D. (2003). Fidelity criteria: Development, measurement and validation. American Journal of Evaluation, 24(3).
– Hall, G.E., & Hord, S.M. (2011). Implementing Change: Patterns, principles and potholes (3rd ed.). Boston: Allyn and Bacon.

Stay Connected! nirn.fpg.unc.edu