PEPH Evaluation Metrics
American Evaluation Association Meeting, November 13, 2010
Christie Drew, Ph.D., Program Analysis Branch, National Institute of Environmental Health Sciences
drewc@niehs.nih.gov / 919-541-3319
Partnerships for Environmental Public Health (PEPH)
Environmental public health is the science of conducting and translating research into action to address environmental exposures and health risks of concern to the public.
PEPH Program Objectives:
Prevent, reduce, or eliminate environmental exposures that may lead to adverse health outcomes in communities
Increase the impact of environmental public health research at local, regional, and national levels
Partners include scientists, community members, educators, health care providers, public health officials, and policy makers.
Participatory development: Request for Information, Winter 2007-08; Workshop, June 2008
The PEPH “Umbrella” Includes:
Breast Cancer and the Environment Research Centers Program
Centers for Children’s Environmental Health and Disease Prevention Research
Centers for Population Health and Health Disparities
Environmental Health Sciences Core Centers
Community Outreach and Education Cores
Environmental Justice Program
Obesity and the Built Environment
Research to Action
Superfund Basic Research Program
Worker Education and Training Program
Why create a PEPH Evaluation Metrics Manual?
PEPH stakeholders identified evaluation metrics as a clear need: RFI & Workshop, 2008
Logic models, approaches, and tangible metrics are useful in both planning and evaluation
Helpful to establish a common language around activities, outputs, and impacts among those involved in PEPH projects
http://www.niehs.nih.gov/research/supported/programs/peph/materials/index.cfm
Quotes from the Request for Information
“Web-based evaluation tools would help, especially for scientists who are not accustomed to evaluating endpoints such as behavior change.”
“NIEHS has not defined metrics for evaluation of ‘research translation’ and ‘outreach’. What constitutes success?”
Purpose:
Show how laying out program activities, outputs, and desired impacts can help lead to program metrics
Content of the manual is not intended to be prescriptive
Audience:
PEPH grantees and partners
NIEHS and other agency program staff working with PEPH
Other groups and organizations interested in measuring PEPH-like activities
What do we mean by “metrics”?
A METRIC is a measure of magnitude (or another characteristic): an inch is a metric for length, and length is a characteristic of an object, e.g., a projection screen.
Not all metrics are equal; some are much easier to understand and apply than others. It is more challenging to think about how to measure the magnitude of a partnership or an education program.
A key step in defining your metrics is to define the characteristics of what you are trying to measure (“indicators”).
Philosophy: metrics based on a logic model
Example Logic Model
[Diagram: activities and outputs are tracked with process measures; impacts are tracked with outcome measures.]
For the sake of simplicity, our manual limits logic model components to activities, outputs, and impacts. These are defined as follows: activities are actions that use available inputs to “do” something; outputs are direct products of activities; impacts are benefits or changes resulting from the activities and outputs. Activities, outputs, and impacts are shown in different rows in the diagram. Two-way arrows indicate that there are relationships among the various elements, and that the relationships may move in either direction. Though not shown, relationships may exist between any boxes in the diagram, since, in theory, any action may lead to any output and result in any impact. The logic models are presented as linear frameworks, but in practice, PEPH programs are often far from linear. In general, the logic models in our project are presented in increasing levels of maturity from left to right and from top to bottom.
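To make the activities-outputs-impacts framing concrete, the following is a minimal illustrative sketch (in Python) of how one logic-model row can be paired with candidate process and outcome metrics. The partnership activity, outputs, impacts, and metrics shown are hypothetical examples for illustration only; they are not drawn from the PEPH Evaluation Metrics Manual.

```python
# Minimal sketch: pairing logic-model components with candidate metrics.
# All entries below are hypothetical illustrations, not metrics from the manual.
logic_model_row = {
    "activity": "Convene community-academic partnership meetings",
    "outputs": ["meeting agendas and minutes", "shared research agenda"],
    "impacts": ["sustained partnership", "community-informed study design"],
}

candidate_metrics = {
    "process": [  # measures of activities and outputs
        "number of partnership meetings held per year",
        "proportion of partners attending each meeting",
    ],
    "outcome": [  # measures of impacts
        "partners reporting influence on study design (survey)",
        "partnership still active two years after funding ends",
    ],
}

# Print the metrics grouped by type.
for kind, metrics in candidate_metrics.items():
    print(f"{kind} metrics:")
    for metric in metrics:
        print(f"  - {metric}")
```

The point of the sketch is simply that each metric is anchored to a named component of the logic model, which is the philosophy described on the previous slide.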
Priority PEPH Program Activity Areas (for metrics)
Manual outline: Introduction, Partnerships, Leveraging, Products and Dissemination, Education and Training, Capacity Building, Evaluation
Broad areas: Partnerships, Communication, Capacity Building
As we began conceptualizing the program, the team felt very strongly that we did NOT want to develop the model on a program-by-program basis. As Liam discussed this morning, PEPH is an umbrella program that is meant to provide opportunities for outreach and participatory research specialists such as yourselves to reach beyond program boundaries and focus on common program activities. So the NIEHS team members prioritized these nine cross-cutting themes for metrics development in the initial draft manual. We organized them into three broad areas (partnerships, communication, and capacity building, all major themes of the PEPH program) and envision corresponding chapters in the manual. I’m very excited to announce today that two of these chapters are ready for circulation and public comment (green arrows next to Partnerships and Education and Training). Drafts are included on the flash drives you received when you registered for the meeting and will also be posted to the PEPH website. Two other chapters should follow in the next few months (yellow arrows), and the remaining will follow thereafter. One other point regarding the communication category: we explicitly did not include bibliometrics as a key priority area for our metrics development. We heard from grantees that bibliometrics may be less applicable to the PEPH community because grantees don’t always publish materials in traditional peer-reviewed academic journals. When available, these may provide appropriate metrics, but we wanted to develop metrics that go beyond those appropriate for academic journals.
Panelists/Discussants
Partnerships Chapter – Ashley Brenner, STPI
Education and Training Chapter – Helena Davis
Partnerships in the Real World – Johnnye Lewis, University of New Mexico
Evaluating Worker Training Programs – Tom McQuiston, United Steelworkers
Discussion – Gretchen Jordan, Sandia National Laboratories