
IMPLEMENTATION QUALITY RESEARCH OF PREVENTION PROGRAMS IN CROATIA
MIRANDA NOVAK, University of Zagreb, Faculty of Education and Rehabilitation Sciences
Josipa Mihic, Clemens Hosman, Celene Domitrovich
Health Promotion Research Centre, 18th Annual Summer Conference, National University of Ireland Galway

MOTIVATION FOR RESEARCH
o Although the efficacy of various evidence-based interventions has been established through carefully designed trials under controlled conditions, there is a lack of evidence for their utilization in natural community conditions (Kam, Greenberg and Weiss, 2003; Greenberg et al., 2005; Fixsen et al., 2005; Proctor and Rosen, 2008)
o A more systematic process is warranted to translate efficacy results into positive participant outcomes
[Diagram: SCIENCE TO PRACTICE; CONTEXT FACTORS AND SUPPORT SYSTEM; CHARACTERISTICS OF PROGRAM; IMPLEMENTATION RESEARCH]

DEFINITION OF IMPLEMENTATION
o Implementation refers to what a program consists of when it is delivered in a particular setting (Durlak and DuPre, 2008)
o It covers characteristics of the intervention itself and characteristics of the intervention support system
o It is important to differentiate between factors affecting implementation quality (Fixsen et al., 2005, 2009) and implementation aspects (Durlak and DuPre, 2008)

WHY WE MONITOR IMPLEMENTATION
o To know what actually happened
o To provide feedback for quality
o To strengthen the conclusions made about outcomes
o To examine the change process
o To understand the internal dynamics of the program
o To advance knowledge for replication
o To strengthen the quality of evaluation

[Framework diagram]
IMPLEMENTATION FACTORS (CAPACITY FOR PROGRAM IMPLEMENTATION):
o Organizational capacity
o Attitudes towards the intervention
o Training and knowledge
o Support for implementer
o Monitoring system
o Characteristics of a program
o Implementer's skill
o Program standardization
INDICATORS OF IMPLEMENTATION QUALITY:
o Fidelity
o Dosage
o Quality of program delivery
o Participant responsiveness
o Perceived program impact
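For readers who work with such data, a minimal sketch of how the two sides of this framework could be represented for analysis, in Python; the class and field names are illustrative assumptions, not the labels used by the study's instruments.

```python
from dataclasses import dataclass

# Hypothetical record of the capacity-side constructs (implementation
# factors) for one program; field names are illustrative only.
@dataclass
class ImplementationFactors:
    organizational_capacity: float          # mean scale score, e.g. on a 1-4 range
    attitudes_toward_intervention: float
    training_and_knowledge: float
    support_for_implementer: float
    monitoring_system: float
    program_characteristics: float
    implementer_skill: float
    program_standardization: float

# Hypothetical record of the outcome-side constructs (indicators of
# implementation quality) for one program.
@dataclass
class ImplementationQuality:
    fidelity: float
    dosage: float
    quality_of_delivery: float
    participant_responsiveness: float
    perceived_program_impact: float
```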

Research project »Preffi – Quality assurance in the Region of Istria«
Department of Health and Social Care, Region of Istria, and Faculty of Education and Rehabilitation Sciences, University of Zagreb
Incorporation of science-based principles into prevention practice in Istria through collaboration between scientists and practitioners

TRAINING PROGRAM
Topics: principles of science-based practice, logic modelling, implementation, evaluation, advocacy
Format: interactive group education and individual consultation
Target groups: organization managers, program authors, program deliverers

The general aim of this research was to study implementation processes and their outcomes in prevention programs in Croatia.

1. To construct valid and reliable measures of implementation quality
2. To explore the level and variation of implementation quality
3. To explore the differences in perception of implementation quality between program managers, implementers and participants
4. To explore the relationships between implementation factors and indicators of implementation quality
5. To test the impact of the Training for Prevention on the level of implementation quality in the experimental group
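Aim 1 concerns reliable measurement. One standard check of a new scale's internal consistency is Cronbach's alpha; the sketch below implements the textbook formula on fabricated ratings (the data and the 50 x 6 item matrix are assumptions, not the study's).

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the scale total
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative use with fabricated ratings on a four-point scale.
rng = np.random.default_rng(0)
ratings = rng.integers(1, 5, size=(50, 6)).astype(float)
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```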

SAMPLE: 24 mental health promotion and prevention programs (parenting; positive development and SEL; substance abuse)

IMPLEMENTATION FACTORS
o Rated by managers: standardization, implementers' skills, attitudes, training, support, monitoring
o Rated by implementers: standardization, attitudes, training, support, monitoring

INDICATORS OF IMPLEMENTATION QUALITY
o Rated by implementers: fidelity, quality of delivery, responsiveness, perceived program impact
o Rated by participants: dosage, quality of delivery, responsiveness, perceived program impact

MEASURES o Implementation Factors Questionnaire for Program Managers o Implementation Factors Questionnaire for Program Implementers o Indicators of Implementation Quality Questionnaire for Program Implementers o Indicators of Implementation Quality Questionnaire for Program Participants o The Preffi 2.0 instrument

CONSTRUCTION OF SCALES FOR IMPLEMENTATION QUALITY ASSESSMENT
[Study timeline]
o Assessors independently assess the 24 written program proposals
o Division of the 24 programs into experimental (N = 12) and control (N = 12) conditions
o Training for Prevention delivered to the experimental condition
o Implementation of the 24 programs
o Focus groups and evaluation of the initial set of items
o Mid-assessment and post-assessment of implementation quality
Samples: 19 program managers, 50 program deliverers, 434 program participants
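The division into conditions can be illustrated with a short sketch; the slide does not state the assignment mechanism, so simple randomization (and the placeholder program IDs) are assumptions.

```python
import random

programs = [f"program_{i:02d}" for i in range(1, 25)]  # 24 program IDs (placeholders)

random.seed(2010)             # fixed seed only so the example is reproducible
random.shuffle(programs)
experimental = programs[:12]  # N = 12, receives the Training for Prevention
control = programs[12:]       # N = 12, no training

print(len(experimental), len(control))
```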

STUDY OF IMPLEMENTATION QUALITY
[Same study timeline as above]
Samples: 19 program managers, 50 program deliverers, 434 program participants

STUDY ON THE TRAINING FOR PREVENTION IMPACT
[Same study timeline as above]
Samples: 22 program managers, 55 program deliverers, 744 program participants

RESULTS

STUDY OF IMPLEMENTATION QUALITY

Reports on implementation factors across the 24 programs show that the views of managers and program implementers are rather similar
o Managers: lowest ratings for standardization, training and monitoring
o Implementers: lowest ratings for standardization and training

Indicators of implementation quality are rather high: all group means are above a value of 3
Overall, program implementers give similar, although somewhat higher, ratings than participants

STUDY OF THE TRAINING FOR PREVENTION IMPACT

Levels of implementation factors reported by managers and the effect of the Training for Prevention at post-intervention

                       POST-INTERVENTION         INTERVENTION EFFECT
Factor                 CONT M (SD)   INT M       Beta (SE)   p         Effect size
Standardization        2.89 (0.43)   2.05        (0.28)      <.001**
Implementers' skills   3.61 (0.47)   3.76        (0.18)
Attitudes              2.97 (0.37)   2.74        (0.18)
Training               2.95 (0.72)   2.39        (0.27)      .05*      -0.92
Support                3.47 (0.63)   3.25        (0.22)
Monitoring             3.14 (0.70)   2.35        (0.22)      .01**
N = 11

Managers from the experimental condition give more critical answers about implementation factors than those from the control condition, which suggests a CHANGE OF CRITERIA.
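One plausible reading of the Beta (SE)/p columns is an ordinary regression of the post-intervention score on condition, with Cohen's d as the effect size; the sketch below follows that assumption with fabricated data, so the numbers it prints are not the study's.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
# Fabricated post-intervention standardization scores for 11 control and
# 11 experimental managers; the real data are not reproduced here.
df = pd.DataFrame({
    "score": np.concatenate([rng.normal(2.9, 0.4, 11), rng.normal(2.1, 0.5, 11)]),
    "condition": [0] * 11 + [1] * 11,  # 0 = control, 1 = intervention
})

# OLS of the post score on condition: the condition coefficient is the
# intervention effect, reported with its SE and p value.
model = smf.ols("score ~ condition", data=df).fit()
print(model.summary().tables[1])

# Cohen's d with a pooled standard deviation, as one common definition
# of the effect-size column.
g = df.groupby("condition")["score"]
m, s, n = g.mean(), g.std(ddof=1), g.size()
pooled_sd = np.sqrt(((n[0] - 1) * s[0]**2 + (n[1] - 1) * s[1]**2) / (n[0] + n[1] - 2))
print("Cohen's d =", (m[1] - m[0]) / pooled_sd)
```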

Levels of implementation factors reported by implementers and the effect of the Training for Prevention at post-intervention

                  POST-INTERVENTION         INTERVENTION EFFECT
Factor            CONT M (SD)   INT M       Beta (SE)   p   Effect size
Standardization   3.10 (0.54)   2.85        (0.23)
Attitudes         3.04 (0.41)   2.83        (0.13)
Training          3.07 (0.66)   2.85        (0.17)
Support           3.44 (0.47)   3.49        (0.25)
Monitoring        2.97 (0.57)   2.80        (0.25)
N = 33 (CONT), 22 (INT)

Implementers from the experimental condition give more critical answers about implementation factors than those from the control condition, which suggests a CHANGE OF CRITERIA.

Levels of implementers' ratings on indicators of implementation quality and the effect of the Training for Prevention at post-intervention

                           POST-INTERVENTION         INTERVENTION EFFECT
Indicator                  CONT M (SD)   INT M       Beta (SE)   p   Effect size
Fidelity                   3.29 (0.37)   3.22        (0.13)
Quality of delivery        3.55 (0.40)   3.49        (0.14)
Responsiveness             3.36 (0.41)   3.49        (0.15)
Perceived program impact   3.34 (0.48)   3.25        (0.19)
N = 33 (CONT), 22 (INT)

At post-test, program implementers from the experimental group did not report improved indicators of implementation quality in comparison with the control group.

Levels of participants' ratings on indicators of implementation quality and the effect of the Training for Prevention at post-intervention

                           POST-INTERVENTION           INTERVENTION EFFECT
Indicator                  CONT M (SD)     INT M       Beta (SE)   p      Effect size
Dosage                     92.33 (18.00)   99.45       (4.41)
Quality of delivery        3.43 (0.52)     3.64        (0.07)      .02*   0.49
Responsiveness             3.09 (0.67)     3.50        (0.12)      .04*   0.76
Perceived program impact   2.94 (0.74)     3.23        (0.12)

At post-test, participants from the experimental group report higher levels of implementation quality, two indicators being significant: quality of delivery and responsiveness.

Moderator analyses have shown that the Training for Prevention is more effective for SHORT PROGRAMS

Moderator analyses have shown that the Training for Prevention is more effective for PROGRAMS WHERE THE MANAGER IS NOT ACTIVE
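Moderator findings like these are commonly tested with a condition x moderator interaction term in the same regression framework; a minimal sketch under that assumption follows, with fabricated data and a hypothetical binary indicator for short programs.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 48
df = pd.DataFrame({
    "condition": rng.integers(0, 2, n),      # 0 = control, 1 = training
    "short_program": rng.integers(0, 2, n),  # hypothetical binary moderator
})
# Fabricated outcome in which the training helps mainly in short programs.
df["quality"] = (3.0 + 0.1 * df["condition"]
                 + 0.4 * df["condition"] * df["short_program"]
                 + rng.normal(0, 0.3, n))

# A significant interaction coefficient would indicate moderation.
model = smf.ols("quality ~ condition * short_program", data=df).fit()
print(model.params)
```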

CRUCIAL CONDITIONS FOR SUCCESSFUL IMPLEMENTATION

Investment in implementation factors:
o Organizational capacity
o Attitudes towards the intervention
o Training and knowledge
o Support for implementer
o Monitoring system
o Characteristics of a program
o Implementer's skill
o Program standardization

THANK YOU FOR YOUR ATTENTION