Presentation transcript:

Copyright © 2006 Healthy Teen Network. All rights reserved.
Promoting Evidence-Based Approaches to Teen Pregnancy Prevention
CityMatCH Audio-Conference, Nov. 30, 2006
Mary Martha Wilson, Training Director, Healthy Teen Network

Defining “Evidence-Based”
 In 2002, the CDC Division of Reproductive Health funded a national project to promote science-based approaches in teen pregnancy, HIV and STI prevention.
 Project goal: to decrease teen pregnancy, STI and HIV rates by increasing the use of research-proven practices and programs, or what we call “science-based approaches.”
 Three national and five state organizations were funded and, together with CDC, agreed on definitions.

Six Approaches
1. Using local and state data to select and assess the priority populations for programs;
2. Identifying the risk and protective factors of the priority populations to be served;
3. Using health behavior or health education theory to guide the selection of risk and protective factors that will be addressed by the program, and also to guide the selection of program strategies;

Six Approaches
4. Using a logic model to link risk and protective factors with program strategies and outcomes;
5. Conducting process and outcome evaluation of the implemented program, and using evaluation data to make modifications; and
6. Selecting, adapting if necessary, and implementing programs that are either science-based or promising.

Evidence-Based Programs
Evidence-based programs are those proven through rigorous evaluation to be effective in changing sexual risk-taking behavior (e.g., delaying onset of sexual intercourse, using condoms and contraception, reducing frequency of sex, reducing number of sexual partners).
Criteria:
–Used a comparison design
–Sample size over 100
–Changed sexual risk-taking behavior at 3–6 months after the program
–Published in a peer-reviewed journal

What Are They?
Lists of evidence-based programs:
–Advocates for Youth, Science and Success
Slightly different criteria:
–National Campaign to Prevent Teen Pregnancy, What Works
–PASHA at Sociometrics

Examples
–Safer Choices
–Reducing the Risk
–Making Proud Choices
–SiHLE: Health Workshops for Young Black Women
–Teen Outreach Program (TOP)
–HIV Risk Reduction for African American and Latina Adolescent Women
–Aban Aya Youth Project for African American Boys

Promising Programs
Not everyone will choose an evidence-based program and then replicate that program faithfully! Promising programs are those that have not been rigorously evaluated but have most of the characteristics of effective programs. The Tool to Assess for Characteristics of Effective Teen Pregnancy, STD and HIV Programs (the TAC) is available at:

No Matter What …
… programs we are implementing, we all need to conduct some kind of program evaluation to know if our programs are meeting objectives.
 Basic program evaluation doesn’t have to be difficult, time-consuming, or expensive!

Why Don’t We Evaluate?
What keeps people from conducting evaluations of their programs?
–Lack of knowledge
–Lack of time, skills, supports, funds
–Funder’s requirements for external evaluator
–Fear and anxiety
–Other program and organizational priorities

It’s Essential!
Who wants to know about the effectiveness of your programs?
–Board and staff
–Funders
–Potential funders
–Partners
–Potential partners
–Community

Purposes of Evaluation
–Process: think numbers
–Outcome: think changes in the program participant
–Impact: think long-term changes in a population or community

Process Evaluation
 Used to assess whether or not the program is being implemented the way it is described on paper.
 Conducted throughout the implementation of the final program design.
 Can also be used to revise and strengthen a program.

Outcome Evaluation
 Focuses on changes that occur in the participants shortly after the completion of the program.
 Changes are generally categorized into knowledge, attitude and/or behavior changes.
 Requires pre- and post-test data.
 Can use a control or comparison group.
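As an illustration of what analyzing pre- and post-test data can look like, here is a minimal Python sketch; the scores are invented, and the paired t-test is just one common way to check whether the same participants changed.

```python
# Minimal sketch: summarizing one-group pre/post outcome data.
# All scores are hypothetical knowledge-test results (0-100 scale).
from scipy import stats

pre_scores = [55, 60, 48, 72, 66, 59, 63, 70, 52, 61]
post_scores = [68, 71, 60, 80, 75, 70, 69, 78, 64, 73]

# Average change from pre-test to post-test
mean_change = (sum(post_scores) - sum(pre_scores)) / len(pre_scores)

# Paired t-test: the same participants are measured twice
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)

print(f"Mean change: {mean_change:.1f} points")
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```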

Examples of Outcome Objectives
Knowledge (information, facts, stats)
–List 5 contraceptive methods and describe how they work.
–List 3 common symptoms associated with STDs.
Attitude (values, opinions, feelings)
–How comfortable are you in talking to your parents about sex?
–When do you believe it’s OK for someone to have sex?
Behaviors (skills, actions)
–Can refuse a sexual advance assertively.
–Can demonstrate the correct use of a condom.

Impact Evaluation
 Similar to outcome evaluation, except changes are tracked over a longer period of time.
 Follow-up data is generally tracked 12 months or more after the completion of the intervention.

BDI Logic Models and Evaluation
Interventions → Determinants → Behaviors → Health Goal
–Interventions help to develop process objectives
–Determinants help to develop outcome objectives (short term)
–Behaviors and the health goal help to develop impact objectives (long term)
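As an illustration only, the BDI chain can be written down as a simple data structure; the program content below is hypothetical, and the comments note which type of objective each column of the logic model helps develop.

```python
# Hypothetical BDI logic model as a simple data structure.
# Each column of the model feeds a different type of objective.
logic_model = {
    "interventions": ["8-session curriculum", "Condom demonstration"],    # -> process objectives
    "determinants": ["Knowledge of contraception", "Refusal skills"],     # -> outcome objectives (short term)
    "behaviors": ["Delay onset of sex", "Use condoms consistently"],      # -> impact objectives (long term)
    "health_goal": "Reduce teen pregnancy in the priority population",    # ultimate long-term aim
}

for column, content in logic_model.items():
    print(f"{column}: {content}")
```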

Writing Objectives
 Be specific: clarify who, what, how much, and by when
 Make it measurable: Are there ways to measure your success? Do you have the resources to do so?
 Be realistic: Don’t over-promise!
 Focus on funder requirements and your interests

Evaluation Designs
1. Post-only design
2. One-group pretest-posttest design
3. Non-equivalent control or comparison group design
4. Randomized pretest-posttest control or comparison group design

Post-Only Design
Program → Measurement
+ Easy
- Without a pretest, don’t really know the magnitude of change

One-Group Pre-Post Design
Measurement → Program → Measurement
+ Relatively easy
- Without a comparison group, limited ability to attribute changes to the program

Comparison Group Design
Experiment group: Measurement → Program → Measurement
Comparison group: Measurement → NO Program → Measurement
+ Measurement before and after the program
+ Includes a comparison group that is similar to the program group (or the same for a randomized design)
- Comparison group may be biased
- More expensive and difficult to manage
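A minimal sketch, using invented group means, of the logic behind this design: the comparison group’s change estimates what would have happened without the program, so subtracting it out isolates the program effect.

```python
# Minimal sketch: comparison-group (difference-in-differences) logic.
# All means are hypothetical outcome scores.
program_pre, program_post = 52.0, 70.0        # experiment group
comparison_pre, comparison_post = 53.0, 58.0  # comparison group (no program)

program_change = program_post - program_pre           # 18.0
comparison_change = comparison_post - comparison_pre  # 5.0

# The comparison group's change approximates change without the program,
# so the difference of the two changes estimates the program's effect.
program_effect = program_change - comparison_change   # 13.0
print(f"Estimated program effect: {program_effect:.1f} points")
```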

Quantitative & Qualitative Methods
Qualitative data collection methods emphasize deep and detailed understandings of human experience. Issues are explored openly without the constraint of predetermined categories.
Quantitative data collection methods emphasize precise, objective and generalizable findings. These methods require the use of standardized measures so that varying perspectives and experiences can fit into a limited number of predetermined response categories.

Qualitative Measurement Methods
 Interviews
 Focus Groups
 Open-ended responses on a survey
 Portfolios
 Essays
 Meeting minutes
 Observations/Field notes
 Photo Voice

Qualitative Measurement Methods
Advantages
+ Captures more depth
+ Provides insight into why and how
Disadvantages
- Time consuming to capture and analyze data
- More difficult to summarize results
- Typically yields smaller samples

Quantitative Measurement Methods
 Surveys - closed-ended questions (e.g., True/False, multiple choice, matching, Likert scale)
 Implementation/activity logs
 Performance tests
 Clinical tests (e.g., urine and blood tests for STDs)

Quantitative Measurement Methods
Advantages
+ Easy to administer
+ Can include a relatively large number of questions
+ Can yield large samples
+ Easier to summarize data
+ More widely accepted as a form of evidence regarding program effectiveness

Quantitative Measurement Methods
Disadvantages
- Data may not be as rich or detailed as qualitative data
- Survey taking is difficult for some participants
- Large amounts of data require more sophisticated analysis approaches
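As a small illustration of how easily closed-ended data can be summarized, here is a sketch tallying hypothetical Likert-scale responses; the survey item and the ratings are invented.

```python
# Minimal sketch: summarizing closed-ended (Likert-scale) survey responses.
# Hypothetical answers to "I feel comfortable talking to my parents about
# sex," rated 1 (strongly disagree) to 5 (strongly agree).
from collections import Counter

responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5, 1, 4, 3, 5, 4]
counts = Counter(responses)
n = len(responses)

for rating in range(1, 6):
    pct = 100 * counts.get(rating, 0) / n
    print(f"Rating {rating}: {counts.get(rating, 0):2d} responses ({pct:.0f}%)")
print(f"Mean rating: {sum(responses) / n:.2f} (n = {n})")
```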

Evaluation Reporting
Possible Methods:
–Written Report (e.g., annual report)
–Oral Presentation
–Newsletter/Journal Article
–Poster Board Presentation
–Grant Proposal

What to Include in an Evaluation Report
1. Executive Summary (summary of evaluation findings)
2. Project Background & Description
3. Evaluation Methods
4. Description of Tools Used to Collect Data
5. Report on Data (tables, charts, etc.)
6. Changes Made to Program as a Result of Evaluation

Where Can You Find:
Program Evaluation Info:
–Sage Publications: sagepub.com
–Sociometrics: socio.com
–Philliber Research: philliberresearch.com
–American Evaluation Association: eval.org
–Management Assistance Program for Nonprofits: mapnp.org
Info on State Coalition and Health Dept contacts for each state:
–Healthy Teen Network Coalition Directory
–Available on the HTN website

Resources
Proven Programs
–Advocates For Youth’s Science and Success
–NCPTP’s What Works, Emerging Answers
Research
–Science Says Research Briefs
–Kirby’s Risk and Protective Factor Paper
–Kirby’s 17 Characteristics of Effective Programs

Resources
Healthy Teen Network’s Trainings on Science-Based Approaches
–Introduction to Science-Based Approaches, BDI Logic Model, Program Evaluation Basics, Assessing for Program Characteristics, HIV and TPP Integration, Getting to Outcomes, etc.
–Coming Up: Adaptation Guidelines