Process Evaluation – Intermediate Injury Prevention Course, August 23-26, 2011, Billings, MT

Session Goal
To provide participants with the information needed to design a process evaluation for an injury prevention project.

Session Objectives
– Define process evaluation
– Describe why and when to use process evaluation
– Recognize the type of data collected during process evaluation
– Describe process evaluation methods
– Design a process evaluation for an injury prevention project

The 4 Stages of Evaluation
[Diagram: a project timeline running from baseline data collection through project implementation, showing the four stages – Formative Evaluation (review, pre-testing), Process Evaluation (timely feedback, monitoring), Impact Evaluation (short-term: knowledge, attitudes & practice changes in the target audience), and Outcome Evaluation (long-term: health outcomes).]

Process Evaluation Defined
The type of evaluation used to determine if your project is:
– Being implemented as planned, and
– Reaching its target audience.

Process Evaluation Defined
“Did I do what I set out to do, and did it make a difference?”
– “Did I do what I set out to do?” – Process Evaluation
– “Did it make a difference?” – Impact/Outcome Evaluation

Process Evaluation Defined
Process evaluation asks:
– What was actually done?
– Where and when was it done?
– How often was it done?
– Who did it, and who did they do it for?

Process Evaluation: When It's Used
– Planning for process evaluation takes place during project planning and before formative evaluation starts.
– Process evaluation begins immediately after your project is implemented.
– It continues throughout the life of your project.

Process Evaluation: Why It's Used
– Allows you to make adjustments in a timely manner.
– The needs of the target population might change, and the project may need to adapt.
– Identifies any problems that occur in reaching the target population.

Process Evaluation: Why It's Used
– Can be used to show funding agencies the project's level of activity.
– Tells you how well your project is being implemented.
– Tells other interested programs the “how” and “why” of your program.

Process Evaluation: Limitations
– Is necessary but not sufficient in evaluating a program's effectiveness.
– Depends upon accurate record keeping and effective communication.

Process Evaluation Measures
Process evaluation:
– Measures and evaluates project implementation
– Measures whether your project is reaching its target audience

Process Evaluation: Implementation
Implementation activities:
– Developing the project's goals and objectives
– Creating an implementation protocol
– Monitoring daily operations
– Data collection
– Coalition building
– Ensuring staffing is at the proper level to meet program needs
– Ensuring that staff are sufficiently trained

Process Evaluation: Implementation Questions
– What was actually done?
– When and where was it done?
– Who did it? Who did they do it for?
– Were any materials distributed?
– What barriers or challenges were discovered?
– What was the cost?

Process Evaluation: Target Audience
Activities for reaching the target audience:
– Ensuring the target population is being reached
– Measuring program participation by the target population
– Making materials and resources available to, and understood by, the target population
– Determining if the program is relevant to the target population
– Tracking the distribution of materials

Process Evaluation: Target Audience Questions
– Is the target population being reached?
– What was the nature of this contact?
– How often and for how long was the target population involved?
– Are the project's messages and materials appropriate for the target population?

Process Evaluation: Data Collection Techniques
A. Project Exposure
B. Progress Review
C. Internal Audit
D. Target Population Survey
E. Project Site Survey

Process Evaluation: Data Collection Techniques
A. Project Exposure
– Monitors all project contacts (telephone, e-mail, classes, etc.) and materials distributed (brochures, products, etc.)
– Monitors who utilizes project information (e.g., sign-in sheets)
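
For illustration only (not part of the course materials): a minimal Python sketch of how project-exposure data from a sign-in log might be tallied. The log entries, field names, and counts are hypothetical.

```python
# Hypothetical sign-in log: each entry records one project contact.
from collections import Counter

sign_in_log = [
    {"date": "2011-09-01", "contact_type": "class",     "materials_given": 12},
    {"date": "2011-09-03", "contact_type": "telephone", "materials_given": 0},
    {"date": "2011-09-07", "contact_type": "class",     "materials_given": 9},
]

# Tally contacts by type and total materials distributed.
contacts_by_type = Counter(entry["contact_type"] for entry in sign_in_log)
materials_distributed = sum(entry["materials_given"] for entry in sign_in_log)

print("Contacts by type:", dict(contacts_by_type))       # {'class': 2, 'telephone': 1}
print("Materials distributed:", materials_distributed)   # 21
```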

Process Evaluation: Data Collection Techniques
B. Progress Review
– Reviews program activities to determine if goals and objectives are being met (e.g., meetings with coalition, tribal members, courses provided)
– Conducts interviews with staff and coalition members
– Conducted by project staff and interested parties

Process Evaluation: Data Collection Techniques
C. Internal Audit
– Compares the implementation plan to actual activities
– Documents staff efforts, resources, amount of time devoted to each task, and date of task completion
– E.g., Sleep Safe Coordinator's quarterly reports
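
A minimal sketch, for illustration only, of the plan-versus-actual comparison an internal audit performs; the activity names are invented and this is not the Sleep Safe reporting format.

```python
# Hypothetical planned vs. completed activities for one reporting period.
planned = {"recruit coalition members", "develop implementation protocol", "hold first clinic"}
completed = {"recruit coalition members", "develop implementation protocol"}

print("Completed as planned:", sorted(planned & completed))
print("Planned but not yet done:", sorted(planned - completed))
print("Done but not in the plan:", sorted(completed - planned))
```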

Process Evaluation: Data Collection Techniques
D. Target Population Survey
– Measures whether the program is reaching the target audience
– Describes the target population's awareness of the project, level of interest in the project, and number who utilize the project
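
A minimal sketch, with invented survey counts, of how target population survey results might be summarized as awareness, interest, and utilization percentages.

```python
# Hypothetical counts from a target population survey.
surveyed = 200          # respondents from the target population
aware = 150             # had heard of the project
interested = 90         # expressed interest in the project
used_project = 60       # reported using the project

for label, count in [("Aware of project", aware),
                     ("Interested in project", interested),
                     ("Used project", used_project)]:
    print(f"{label}: {count / surveyed:.0%}")   # 75%, 45%, 30%
```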

Process Evaluation: Data Collection Techniques
E. Project Site Survey
– Determines if materials are being distributed effectively
– Monitors the number of materials used over a certain period of time
– E.g., Sleep Safe smoke alarm follow-up data collection form

Process Evaluation: Interpreting the Data
The 5 process evaluation methods are now producing results…
– Results of the internal audit are used to inform project staff
– No problems discovered: status quo
– Problems discovered: timely revisions to the project
– May require formative evaluation to determine the problem's cause

Process Evaluation: Interpreting the Data
[Diagram: the 4 Stages of Evaluation timeline repeated – formative (review, pre-testing), process (timely feedback, monitoring), impact (short-term: knowledge, attitudes & practice changes in the target audience), and outcome (long-term: health outcomes).]

Process Evaluation: Conclusion
– Process evaluation is a management tool used to make sure that your project is implemented as planned and on schedule.
– Information collected during process evaluation can be used to make adjustments to your project.
– It tells you how well your project is being implemented.

“The only difference between stumbling blocks and stepping stones is how you use them.” (And your ability to recognize them.)

Process Evaluation Exercise (time allowed: 15 minutes)
Using the provided goal & objectives and the Process Evaluation worksheet, your group should:
– Select an objective.
– Describe one way you will measure how well your project is implemented and how it will reach the target audience.
– Describe how you will measure the objective using one of the process evaluation data collection techniques.

Child Passenger Safety
Goal: To increase the use of child passenger safety seats and the correct use of seats.
Objectives:
– Coalition to increase the number of CPSTs and instructors by the end of this year.
– Media advertisement of clinics, checkpoints, and the importance of car seats by March.
– Conduct a clinic & checkpoint per community yearly, starting next year.

The 4 Stages of Evaluation
[Diagram repeated: formative (review, pre-testing), process (timely feedback, monitoring), impact (short-term changes in knowledge, attitudes & practice), and outcome (long-term health outcomes) evaluation along the project timeline, from baseline data collection through project implementation.]

Impact Evaluation
– Collects baseline information on people's knowledge, attitudes, beliefs, or behaviors.
– Long-term data collection (3 to 5 years) to measure and reduce morbidity and mortality.

Impact Evaluation
– Changes in elders' use of walkers (observed, self-reported).
– Changes in community members' behavior in using occupant restraints (seat belt wearing for adults or car seats for children).

Impact Evaluation
– Children's bicycle helmet use (observed, self-reported) before/after a bicycle campaign.
– Changes in elders' use of walkers (observed, self-reported).
– Changes in the number of operable smoke detectors installed/maintained in homes after a comprehensive fire prevention campaign.

Impact Evaluation Example: Alaska PFD Promotion Project
– The drowning rate in Alaska's YK Delta was more than 3 times the state average.
– A PFD promotion project began in the YK Delta.
– Observational surveys of PFD use were conducted after baseline data was collected.
– Timely changes were made to increase PFD sales and use.
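
A minimal sketch, using invented counts, of how an observed PFD usage rate could be computed from one observational survey; these are not the Alaska project's figures.

```python
# Hypothetical single-survey observation counts.
observed_boaters = 120
wearing_pfd = 54

usage_rate = wearing_pfd / observed_boaters
print(f"Observed PFD usage rate: {usage_rate:.0%}")   # 45%
```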

Impact Evaluation Example: Alaska PFD Promotion Project
[Chart: PFD usage rates (%) and number of PFDs sold over time.]

The 4 Stages of Evaluation
[Diagram repeated: formative, process, impact (short-term), and outcome (long-term) evaluation along the project timeline.]

Outcome Evaluation
– Usually requires significant resources, long periods of time, and ongoing data monitoring.
– Used less frequently than impact evaluation in NA injury prevention programs.
– Focuses on the program's long-term effect on its target audience.
– Conducted after a program has been completed.

Outcome Evaluation Example: Drowning Rates, YK Delta vs. All Alaska
[Chart: drowning rate per 100,000, YK Delta vs. all Alaska, over time; after the YK Delta starts the PFD program, the chart shows a 43% reduction.]
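
A worked sketch of the rate and percent-reduction arithmetic behind a chart like this, using invented numbers rather than the actual YK Delta data.

```python
# Rate per 100,000 = deaths / population * 100,000
def rate_per_100k(deaths, population):
    return deaths / population * 100_000

baseline_rate = rate_per_100k(deaths=14, population=25_000)   # hypothetical pre-program period
followup_rate = rate_per_100k(deaths=8, population=25_000)    # hypothetical post-program period

reduction = (baseline_rate - followup_rate) / baseline_rate
print(f"Baseline rate:  {baseline_rate:.1f} per 100,000")     # 56.0
print(f"Follow-up rate: {followup_rate:.1f} per 100,000")     # 32.0
print(f"Relative reduction: {reduction:.0%}")                 # 43%
```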

Evaluation Summary: Identify the evaluation type
– Number of PFDs distributed
– Decrease in the number of drinking and driving violations
– Review of IP materials describing storage of poisons
– Number of meetings with the Tribal council to discuss a possible speed limit ordinance
– Number of completed suicides

Additional Resources