PEPH Evaluation Metrics: American Evaluation Association Meeting

PEPH Evaluation Metrics
American Evaluation Association Meeting, November 13, 2010
Christie Drew, Ph.D., Program Analysis Branch, National Institute of Environmental Health Sciences
drewc@niehs.nih.gov / 919-541-3319

Partnerships for Environmental Public Health (PEPH)
Environmental public health is the science of conducting and translating research into action to address environmental exposures and health risks of concern to the public.
PEPH program objectives:
- Prevent, reduce, or eliminate environmental exposures that may lead to adverse health outcomes in communities
- Increase the impact of environmental public health research at the local, regional, and national levels
Partners include scientists, community members, educators, health care providers, public health officials, and policy makers.
Participatory development: Request for Information, Winter 2007-8; Workshop, June 2008.

The PEPH “Umbrella” includes:
- Breast Cancer and the Environment Research Centers Program
- Centers for Children’s Environmental Health and Disease Prevention Research
- Centers for Population Health and Health Disparities
- Environmental Health Sciences Core Centers
- Community Outreach and Education Cores
- Environmental Justice Program
- Obesity and the Built Environment
- Research to Action
- Superfund Basic Research Program
- Worker Education and Training Program

Why create a PEPH Evaluation Metrics Manual?
- PEPH stakeholders identified evaluation metrics as a clear need (RFI & Workshop, 2008)
- Logic models, approaches, and tangible metrics are useful in both planning and evaluation
- Helpful to establish a common language around activities, outputs, and impacts among those involved in PEPH projects
http://www.niehs.nih.gov/research/supported/programs/peph/materials/index.cfm

Quotes from the Request for Information
- “Web-based evaluation tools would help, especially for scientists who are not accustomed to evaluating endpoints such as behavior change.”
- “NIEHS has not defined metrics for evaluation of ‘research translation’ and ‘outreach’. What constitutes success?”

Purpose:
- Show how laying out program activities, outputs, and desired impacts can help lead to program metrics
- The content of the manual is not intended to be prescriptive
Audience:
- PEPH grantees and partners
- NIEHS and other agency program staff working with PEPH
- Other groups and organizations interested in measuring PEPH-like activities

What do we mean by “metrics”?
- A metric is a measure of magnitude (or another characteristic). An inch is a metric for length; length is a characteristic of an object, e.g., a projection screen.
- Not all metrics are equal; some are much easier to understand and apply than others. It is more challenging to think about how to measure the magnitude of a partnership or an education program.
- A key step in defining your metrics is to define the characteristics of what you are trying to measure (“indicators”).
- Philosophy: a metrics-based logic model.
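To make the characteristic/measure distinction above concrete, here is a minimal sketch (our illustration, not something from the PEPH manual) of a metric as a pairing of a characteristic with a unit and an observed value; all names and values are illustrative assumptions.

```python
# Minimal sketch: a metric pairs a characteristic of interest with a way to
# express its magnitude. Names are illustrative, not from the PEPH manual.
from dataclasses import dataclass

@dataclass
class Metric:
    characteristic: str  # what we are trying to measure, e.g. "length"
    unit: str            # how magnitude is expressed, e.g. "inches"
    value: float         # the observed measurement

# Easy case: a physical characteristic with an obvious unit.
screen_width = Metric(characteristic="length", unit="inches", value=96.0)

# Harder case: a program characteristic where the unit must first be
# defined -- the "define the characteristics" step described above.
partnership_reach = Metric(
    characteristic="partnership reach",
    unit="active community partners",
    value=14.0,
)
```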

Example Logic Model
[Diagram: logic model with rows for activities (process measures), outputs, and impacts (outcome measures), connected by two-way arrows]
For the sake of simplicity, our manual limits logic model components to activities, outputs, and impacts. These are defined as follows:
- Activities are actions that use available inputs to “do” something
- Outputs are direct products of activities
- Impacts are benefits or changes resulting from activities and outputs
Activities, outputs, and impacts are shown in different rows in the diagram. Two-way arrows indicate that there are relationships among the various elements and that the relationships may move in either direction. Though not shown, relationships may exist between any boxes in the diagram, since, in theory, any action may lead to any output and result in any impact. The logic models are presented as linear frameworks, but in practice PEPH programs are often far from linear. In general, the logic models in our project are presented in increasing levels of maturity from left to right and from top to bottom.
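As a concrete rendering of the activities-outputs-impacts structure just described, here is a minimal sketch (ours, not the manual’s) of the three components and their many-to-many, two-way relationships; the example elements are hypothetical.

```python
# Minimal sketch of logic model elements and their relationships.
# Element names below are hypothetical examples, not from the PEPH manual.
from dataclasses import dataclass, field

@dataclass
class Element:
    name: str
    kind: str  # "activity", "output", or "impact"
    leads_to: list["Element"] = field(default_factory=list)

training = Element("hold community training sessions", "activity")
curriculum = Element("training curriculum", "output")
awareness = Element("increased awareness of exposures", "impact")

# In theory, any activity may lead to any output and result in any impact.
training.leads_to.append(curriculum)
curriculum.leads_to.append(awareness)

# Two-way arrows: relationships may also run in the other direction,
# e.g. an observed impact can motivate a new round of activities.
awareness.leads_to.append(training)
```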

Priority PEPH Program Activity Areas (for metrics)
Manual chapters: Introduction; Partnerships; Leveraging; Products and Dissemination; Education and Training; Capacity Building; Evaluation. Broad areas: Partnerships, Communication, Capacity Building.
As we began conceptualizing the program, the team felt strongly that we did not want to develop the model on a program-by-program basis. As Liam discussed this morning, PEPH is an umbrella program meant to provide opportunities for outreach and participatory research specialists such as yourselves to reach beyond program boundaries and focus on common program activities. So the NIEHS team members prioritized these nine cross-cutting themes for metrics development in the initial draft manual. We organized them into three broad areas (partnerships, communication, and capacity building, all major themes of the PEPH program) and envision corresponding chapters in the manual. I’m very excited to announce today that two of these chapters are ready for circulation and public comment (green arrows next to Partnerships and Education and Training). Drafts have been included on the flash drive cards that you received when you registered for the meeting and will also be posted to the PEPH website. Two other chapters should follow in the next few months (yellow arrows), and the remaining chapters will follow thereafter. One other point regarding the communication category: we explicitly did not include bibliometrics as a key priority area for our metrics development. We heard from grantees that bibliometrics may be less applicable to the PEPH community because grantees don’t always publish materials in traditional peer-reviewed academic journals. When available, these may provide appropriate metrics, but we wanted to develop metrics that went beyond those appropriate for academic journals.

Panelists/Discussants
- Partnerships Chapter – Ashley Brenner, STPI
- Education and Training Chapter – Helena Davis,
- Partnerships in the Real World – Johnnye Lewis, University of New Mexico
- Evaluating Worker Training Programs – Tom McQuiston, United Steelworkers
- Discussion – Gretchen Jordan, Sandia National Laboratories