CAPP Evaluation: Implementing Evidence-Based Programs in NYS Jane Powers ACT for Youth Center of Excellence 2011 A presentation for Comprehensive Adolescent Pregnancy Prevention (CAPP) providers in New York State

Overview
- Review basic concepts of evaluation
- The science of implementation
- CAPP evaluation: implementing EBPs
- Evaluation partnership with COE

1) You care about youth
2) You want to make a difference in their lives
3) You want to know whether you have made a difference in their lives

My question is, “Are we making a difference?”

Program Evaluation can help us answer the question: Are we making a difference???

What is Program Evaluation? “Program evaluation is the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming.” Michael Quinn Patton (1997)

Evaluation Terminology

Types of Program Evaluation
- Process
- Outcome

Process Evaluation
Focuses on:
- What happened in the program
  - Who got how much of what?
  - Was the program implemented as planned?
- Participant reactions to the program

Examples of Process Questions
- Which youth are participating in our program? (neighborhood, RHY, LGBTQ, FC)
- Who are we not reaching?
- How many sessions were offered? What % of participants attended all of the sessions?
- What program activities were conducted?
- Were there any adaptations made to the EBP?
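Questions like these can be answered directly from routine attendance data. The sketch below is illustrative only — the record layout and field names are hypothetical, not the actual CAPP Attendance Record:

```python
# Illustrative sketch: summarizing process data (reach and dosage) from
# attendance records. Field names and values are hypothetical examples,
# not the actual CAPP data collection tools.

records = [
    {"participant": "P01", "sessions_attended": 8},
    {"participant": "P02", "sessions_attended": 5},
    {"participant": "P03", "sessions_attended": 8},
]
SESSIONS_OFFERED = 8

enrolled = len(records)
full_dose = sum(1 for r in records if r["sessions_attended"] == SESSIONS_OFFERED)
avg_dose = sum(r["sessions_attended"] for r in records) / enrolled

print(f"Sessions offered: {SESSIONS_OFFERED}")
print(f"Participants enrolled: {enrolled}")
print(f"Attended all sessions: {full_dose} ({100 * full_dose / enrolled:.0f}%)")
print(f"Average dosage: {avg_dose:.1f} sessions")
```

The same tallies extend naturally to the reach questions (which neighborhoods or groups are represented, and who is missing) once demographic fields are recorded alongside attendance.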

Outcome Evaluation
- Focuses on whether the program made a difference
- Answers the question: SO WHAT? What difference does the program make for participants, individuals, groups, families, and the community?

Examples of Outcome Questions
- Have adolescents increased knowledge about different types of birth control?
- Have adolescents learned how to use a condom?
- Have attitudes toward condom use changed?
- Are parents more knowledgeable about adolescent sexuality?
- Do parents talk to their kids about contraception?
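Outcome questions like the first are typically answered by comparing measures taken before and after the program. A minimal sketch, assuming a paired pre/post knowledge score per participant (the scores here are invented for illustration):

```python
# Minimal pre/post comparison for an outcome question such as
# "Have adolescents increased knowledge about birth control?"
# Scores below are invented for illustration only.

pre_scores  = [4, 6, 5, 3, 7]   # knowledge-quiz scores before the program
post_scores = [7, 8, 6, 6, 9]   # scores for the same participants afterward

changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
mean_change = sum(changes) / len(changes)
improved = sum(1 for c in changes if c > 0)

print(f"Mean change in knowledge score: {mean_change:+.1f}")
print(f"Participants who improved: {improved} of {len(changes)}")
```

A real outcome evaluation would add a significance test or a comparison group; the point of the sketch is only that each outcome question maps to a concrete before/after measure.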

Process Data Are Foundational
To get good outcome data, you must first obtain good process data.

CAPP Initiative
Goal 1: Promote healthy sexual behaviors and reduce the practice of risky sexual behaviors among adolescents
Core Strategy 1: Provide comprehensive, age-appropriate, evidence-based, and medically accurate sexuality education to promote healthy sexual behaviors

We know a lot about what works to prevent teen pregnancy. EBPs:
- Increase age of first intercourse
- Increase use of condoms
- Decrease number of sexual partners
- Decrease frequency of sex
Together, these decrease teen pregnancy and promote adolescent sexual health.

The Prevention Research Cycle
1) Identify problem or disorder and determine its extent
2) Identify risk and protective factors associated with the problem
3) Develop intervention and conduct efficacy trials
4) Conduct large-scale effectiveness trials of the intervention
5) Implement the program in the community and conduct ongoing evaluation
(Feedback loop back to step 1)
Reproduced from Fig. 1, "The interactive systems framework for dissemination and implementation" (p. 174), published in Wandersman et al.

Just DO IT!

What do we know about implementing EBPs in communities?

Taking EBPs to scale
- Very little is known about the processes required to effectively implement EBPs on a national scale (Fixsen et al., 2005)
- Research to support the implementation activities that are being used is even rarer
- While many EBPs have yielded positive outcomes in research settings, the record at the local level of "practice" is mixed (Wandersman, 2009; Lesesne et al., 2008)

What do we know about Implementation? Durlak and DuPre, 2008:
- Level of implementation influences program outcomes
- If EBPs are not implemented with fidelity and quality, they are not likely to produce the outcomes observed in research
- Achieving good implementation increases the chances of program success and stronger benefits for participants

Factors Affecting Implementation
- Community level
- Facilitator characteristics
- Program characteristics
- Organizational capacity
- Training and TA

Need to Document Implementation
- Assessment of implementation is critical in program evaluation
- Evaluations that lack carefully collected implementation data are incomplete
- Our understanding of program outcomes rests on knowing how the intervention was delivered

The Fidelity Tension
- Program developers and prevention researchers are concerned that changes in the implementation of an EBP will dilute its effectiveness
- Community leaders and practitioners are concerned that "one size does not fit all"
US Department of Health and Human Services, 2002

HELP NEEDED!!!

Data Collection Tools for CAPP Evaluation of Implementation
- Fidelity Checklist: individualized; keeps track of what you did, including successes and challenges
- Attendance Record: who you reached, where, and dosage
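To make the two tools concrete, here is a sketch of the kind of per-cycle record they capture and the fidelity summary it supports. The field names and values are hypothetical illustrations, not the actual CAPP forms:

```python
# Illustrative sketch of a per-cycle implementation record combining the
# ideas behind a fidelity checklist and an attendance record.
# All field names and values are hypothetical, not the actual CAPP tools.

cycle = {
    "ebp_name": "Example EBP",                       # hypothetical program name
    "sessions_planned": 8,
    "sessions_delivered": 7,
    "adaptations": ["Session 3 role-play shortened"],  # changes to document
}

fidelity = cycle["sessions_delivered"] / cycle["sessions_planned"]
print(f"Sessions delivered: {fidelity:.0%} of planned")
print(f"Adaptations recorded: {len(cycle['adaptations'])}")
```

Recording adaptations alongside delivery counts is what lets the evaluation connect implementation quality back to outcomes, as the preceding slides argue.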

Fidelity Checklist

Demographic Survey

Attendance Record

After you have completed an entire cycle of the EBP (i.e., ALL of the EBP sessions or modules):
1) Send all the completed evaluation tools (except the Brief Demo Survey!) to the Center of Excellence
2) Make sure that you clip all completed documents together so that we can keep track of individual EBP cycles. This includes:
- Fidelity Checklist (one per EBP cycle)
- Attendance Record (one per EBP cycle, with all names removed)
3) Mail these documents to:
Amy Breese
Cornell University
ACT for Youth Center of Excellence
Beebe Hall
Ithaca, NY 14853

Questions? Amanda Purington:

Comments? Jane Powers: