How to Plan a Local Evaluation and Lessons Learned Philip Rodgers, Ph.D. Evaluation Scientist American Foundation for Suicide Prevention 2013 Garrett Lee Smith Combined Annual Grantee Meeting June 11-13, 2013, Washington DC

Acknowledgements U.S. Department of Health & Human Services, Substance Abuse and Mental Health Services Administration Howard Sudak, MD (AFSP) Katrina Bledsoe, PhD (SPRC)

Defining Evaluation

5 Why is evaluation important? Required by funding agencies. Improves performance. Demonstrates effectiveness. Advances knowledge. “As not everything can be done, there must be a basis for deciding which things are worth doing. Enter evaluation.” (M. Q. Patton) Source for Patton quote: U.S. Department of Health and Human Services, P. H. S. (2001). National Strategy for Suicide Prevention: Goals and Objectives for Action. Rockville, MD: U.S. Department of Health and Human Services, Public Health Service.

6 What are types of evaluation? Participatory evaluation Formative evaluation Summative evaluation Responsive evaluation Goal-free evaluation Empowerment evaluation Advisory evaluation Accreditation evaluation Adversary evaluation Utilization evaluation Consumer evaluation Theory-driven evaluation We will address: Process, Outcome, and Impact Evaluations

7 What is process evaluation? Process “evaluation assesses the extent to which a program is operating as it was intended.” Source: U.S. Government Accountability Office. (2005). Performance Measurement and Evaluation: Definitions and Relationships (Vol. GAO SP). Washington, DC: U.S. Government Accountability Office.

8 What is outcome evaluation? Outcome “evaluation assesses the extent to which a program achieves its outcome-oriented objectives.” Source: U.S. Government Accountability Office. (2005). Performance Measurement and Evaluation: Definitions and Relationships (Vol. GAO SP). Washington, DC: U.S. Government Accountability Office.

9 What is impact evaluation? “Impact evaluation…assesses the net effect of a program by comparing program outcomes with an estimate of what would have happened in the absence of the program.” Source: U.S. Government Accountability Office. (2005). Performance Measurement and Evaluation: Definitions and Relationships (Vol. GAO SP). Washington, DC: U.S. Government Accountability Office.
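As a toy illustration of this definition, the net effect is simply the observed program outcome minus the counterfactual estimate. Every number below is invented for the example:

```python
# Hypothetical numbers only: illustrating "net effect" from the GAO definition.
program_outcome = 0.62          # e.g., observed rate of some desired outcome among participants
counterfactual_estimate = 0.45  # estimated rate had the program not run (e.g., from a control group)

# Net effect = program outcome minus the estimate of what would have happened anyway.
net_effect = program_outcome - counterfactual_estimate
print(f"Estimated net effect: {net_effect:.2f}")
```

The hard part of impact evaluation is not this subtraction but producing a credible counterfactual estimate, which is what the designs later in this presentation address.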

I. Logic Models & Evaluation

11 Logic models can drive evaluations. Generic Gatekeeper Training Logic Model:
Inputs: funds, trainers, materials, trainees
Activities: gatekeeper training
Outputs (process evaluation): people trained; satisfaction with training; fidelity of training
Outcomes (outcome evaluation): short-term (a. knowledge, b. attitudes); intermediate (c. identification of those at risk, d. referral); long-term (f. treatment, g. suicide deaths)
Impact evaluation: the same outcomes compared against a control group
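The logic model above can be sketched as a plain data structure. The element names come from the slide; the encoding itself is only an illustrative assumption:

```python
# A minimal sketch of the gatekeeper-training logic model as a plain dict.
logic_model = {
    "inputs": ["funds", "trainers", "materials", "trainees"],
    "activities": ["gatekeeper training"],
    "outputs": ["people trained", "satisfaction with training", "fidelity of training"],
    "outcomes": {
        "short_term": ["knowledge", "attitudes"],
        "intermediate": ["identification of those at risk", "referral"],
        "long_term": ["treatment", "suicide deaths"],
    },
}

# Process evaluation targets come from the outputs column;
# outcome evaluation targets come from the outcomes column.
process_targets = logic_model["outputs"]
outcome_targets = [o for stage in logic_model["outcomes"].values() for o in stage]
print("Process evaluation measures:", process_targets)
print("Outcome evaluation measures:", outcome_targets)
```

Writing the model down this explicitly makes the next step, deriving evaluation questions from it, almost mechanical.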

II. How to Develop Evaluation Questions

13 Where do evaluation questions come from? Generally, from the goals listed in a logic model. More specifically, defined by stakeholders through a collaborative process. Depending upon circumstances, stakeholders can be funders, participants, trainers, evaluators, and others, or a combination of these.

14 Divergent, convergent process… Stakeholders meet with relevant materials (grant application, logic model, etc.). After a review of the materials, they engage in a divergent process: a free association of evaluation questions. After the divergent process, stakeholders collectively narrow the list of questions to manageable proportions through a convergent process.

III. How to Answer Evaluation Questions

16 What is measurement? Measurement is the means you use to collect data. It includes what data you collect, how you collect it, and how well you collect it.

17 How will you collect data? Questionnaires (in-person, mail, email, phone) Psychological Tests Interviews Health Records Health Statistics Observations Logs

18 Where do you find measures? Create your own (pilot test!) Borrow from other evaluations (with permission!) Search the literature (see Additional Resources) Use standardized measures (may cost) –Brown (adult) and Goldston (adolescent) reviews Use existing data sources and records

IV. How to Develop an Evaluation Plan

20 What evaluation design do you need? What is your purpose? –Performance assessment? –Evidence of concept? –Evidence of effectiveness?

21 There are four basic evaluation designs (X = program, O = observation; each row is one group, in time order):
Posttest Only: X O
Pre- and Posttest: O X O
Posttest Only w/ control: X O (control group: O)
Pre- and Posttest w/ control: O X O (control group: O O)
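The X/O shorthand above (X = program delivered, O = observation taken) can be encoded literally; this representation is illustrative only:

```python
# The four basic designs in X/O shorthand.
# Each design is a list of groups; each group is its sequence of events in time order.
designs = {
    "posttest_only":         [["X", "O"]],
    "pre_and_posttest":      [["O", "X", "O"]],
    "posttest_only_control": [["X", "O"], ["O"]],
    "pre_posttest_control":  [["O", "X", "O"], ["O", "O"]],
}

for name, groups in designs.items():
    has_pretest = groups[0][0] == "O"   # does the program group get observed before X?
    has_control = len(groups) > 1       # is there a second group that never receives X?
    print(f"{name}: pretest={has_pretest}, control={has_control}")
```

Reading the encoding this way makes clear what each design buys you: a pretest lets you measure change, and a control group gives you a counterfactual.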

22 Additional data collection points can be added to the basic designs:
Posttest Only: X O O
Pre- and Posttest: O X O O
Posttest Only w/ control: X O O (control group: O O)
Pre- and Posttest w/ control: O X O O (control group: O O O)

23 Best evidence comes when subjects are randomly assigned to groups. [Diagram: a pool of subjects or groups is randomly divided into an experimental group and a control group.] Random assignment increases the likelihood that subjects in both groups are equivalent with regard to factors that may be related to outcomes.
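A minimal sketch of random assignment from a pool of subjects, using made-up subject IDs and a fixed seed so the split is reproducible:

```python
import random

# Hypothetical pool of 20 subjects (IDs are invented for the example).
subjects = [f"S{i}" for i in range(1, 21)]

rng = random.Random(42)   # fixed seed: reproducible assignment
shuffled = subjects[:]    # copy, so the original roster is untouched
rng.shuffle(shuffled)

# Split the shuffled pool in half: first half experimental, second half control.
half = len(shuffled) // 2
experimental = shuffled[:half]
control = shuffled[half:]
print(len(experimental), "experimental /", len(control), "control")
```

Because the split follows a shuffle rather than any subject characteristic, known and unknown confounders are expected to balance out across the two groups on average.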

24 If random assignment is not possible… Compare groups that are similar. Use a pretest so that group differences—to some extent—are accounted for.
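The value of a pretest when groups are not randomly assigned can be shown with invented scores: comparing gains, rather than raw posttest scores, partially adjusts for baseline differences between non-equivalent groups:

```python
# Hypothetical pretest/posttest means for two non-randomized groups (all numbers invented).
program_pre, program_post = 55.0, 70.0
comparison_pre, comparison_post = 60.0, 66.0

# Raw posttest comparison ignores the fact that the groups started at different levels.
raw_difference = program_post - comparison_post       # 4.0: understates the effect here

# Comparing gains accounts (to some extent) for baseline group differences.
program_gain = program_post - program_pre             # 15.0
comparison_gain = comparison_post - comparison_pre    # 6.0
adjusted_effect = program_gain - comparison_gain
print(f"raw posttest difference: {raw_difference}, gain difference: {adjusted_effect}")
```

This adjustment is only partial: a pretest cannot rule out group differences that affect how much subjects change, which is why randomized designs remain the stronger evidence.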

25 Evaluation Planning Forms

Philip Rodgers, PhD American Foundation for Suicide Prevention