
1 PEPH Evaluation Metrics American Evaluation Association Meeting
November 13, 2010
Christie Drew, Ph.D., Program Analysis Branch, National Institute of Environmental Health Sciences

2 Partnerships for Environmental Public Health (PEPH)
Environmental Public Health is the science of conducting and translating research into action to address environmental exposures and health risks of concern to the public.
PEPH Program Objectives:
Prevent, reduce, or eliminate environmental exposures that may lead to adverse health outcomes in communities
Increase the impact of environmental public health research at local, regional, and national levels
Partners include scientists, community members, educators, health care providers, public health officials, and policy makers.
Participatory development: Request for Information (Winter 2008); Workshop (June 2008)

3 The PEPH “Umbrella” Includes
Breast Cancer and the Environment Research Centers Program
Centers for Children’s Environmental Health and Disease Prevention Research
Centers for Population Health and Health Disparities
Environmental Health Sciences Core Centers
Community Outreach and Education Cores
Environmental Justice Program
Obesity and the Built Environment
Research to Action
Superfund Basic Research Program
Worker Education and Training Program

4 Why create a PEPH Evaluation Metrics Manual?
PEPH stakeholders identified evaluation metrics as a clear need: RFI & Workshop, 2008
Logic models, approaches, and tangible metrics are useful in both planning and evaluation
Helpful to establish a common language around activities, outputs, and impacts among those involved in PEPH projects
ms/peph/materials/index.cfm

5 Quotes from the Request for Information
Web-based evaluation tools would help, especially for scientists who are not accustomed to evaluating endpoints such as behavior change.
NIEHS has not defined metrics for evaluation of “research translation” and “outreach”. What constitutes success?

6 Purpose
Show how laying out program activities, outputs, and desired impacts can help lead to program metrics
Content of the manual is not intended to be prescriptive
Audience
PEPH grantees and partners
NIEHS and other agency program staff working with PEPH
Other groups and organizations interested in measuring PEPH-like activities

7 What do we mean by “metrics”?
METRIC = a measure of magnitude (or another characteristic)
An inch is a metric for length; length is a characteristic of an object, e.g., a projection screen.
Not all metrics are equal; some are much easier to understand and apply than others. It is more challenging to think about how to measure the magnitude of a partnership or an education program.
A key step in defining your metrics is to define the characteristics of what you are trying to measure (“indicators”).
Philosophy: a metrics-based logic model

8 Example Logic Model
[Diagram: logic model with boxes for activities, outputs, and impacts, grouped into process measures and outcome measures.]
For the sake of simplicity, our manual limits logic model components to activities, outputs, and impacts. These are defined as follows: activities are actions that use available inputs to “do” something; outputs are direct products of activities; impacts are benefits or changes resulting from activities and outputs.
Activities, outputs, and impacts are shown in different rows in the diagram. Two-way arrows indicate that there are relationships among the various elements, and that the relationships may move in either direction. Though not shown, relationships may exist between any boxes in the diagram, since, in theory, any action may lead to any output and result in any impact. The logic models are presented as linear frameworks, but in practice, PEPH programs are often far from linear. In general, the logic models in our project are presented in increasing levels of maturity from left to right and from top to bottom.
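To make the activities-outputs-impacts structure concrete, the sketch below shows one way such a logic model and its metrics could be represented in code. This is an illustrative example only, not part of the manual or the slides; the class names, the partnership scenario, and every metric listed are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModelElement:
    """One box in the logic model: an activity, an output, or an impact."""
    name: str
    metrics: List[str] = field(default_factory=list)  # measurable indicators for this element

@dataclass
class LogicModel:
    """Activities use available inputs to 'do' something; outputs are their
    direct products; impacts are the benefits or changes that result."""
    activities: List[LogicModelElement] = field(default_factory=list)
    outputs: List[LogicModelElement] = field(default_factory=list)
    impacts: List[LogicModelElement] = field(default_factory=list)

# Hypothetical example for a partnership activity area (all entries invented).
partnership_model = LogicModel(
    activities=[LogicModelElement(
        "Convene community-researcher meetings",
        metrics=["number of meetings held", "number of partners attending"])],
    outputs=[LogicModelElement(
        "Jointly developed research agenda",
        metrics=["agenda document completed (yes/no)"])],
    impacts=[LogicModelElement(
        "Sustained community-research collaboration",
        metrics=["partnerships still active after three years"])],
)

# Walk the model from process measures (activities, outputs) to outcome
# measures (impacts) and list the metrics attached to each element.
for stage, elements in [("Activity", partnership_model.activities),
                        ("Output", partnership_model.outputs),
                        ("Impact", partnership_model.impacts)]:
    for element in elements:
        print(f"{stage}: {element.name} -> {', '.join(element.metrics)}")
```

The same structure could be reused for any of the PEPH activity areas by swapping in the relevant activities, outputs, impacts, and indicators.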

9 Priority PEPH Program Activity Areas (for metrics)
Chapter topics: Introduction, Partnerships, Leveraging, Products and dissemination, Education and training, Capacity Building, Evaluation
Broad areas: Partnerships, Communication, Capacity Building
As we began conceptualizing the program, the team felt very strongly that we did not want to develop the model on a program-by-program basis. As Liam discussed this morning, PEPH is an umbrella program that is meant to provide opportunities for outreach and participatory research specialists such as yourselves to reach beyond program boundaries and focus on common program activities. So the NIEHS team members prioritized these nine cross-cutting themes for metrics development in the initial draft Manual. We organized them into three broad areas (partnerships, communication, and capacity building, all major themes of the PEPH program), and we are envisioning corresponding chapters in the manual.
I’m very excited to announce to you today that two of these chapters are ready for circulation and public comment (green arrows next to Partnerships and Education and training). Drafts have been included on the flash drive cards that you received when you registered for the meeting and will also be posted to the PEPH website. Two other chapters should follow in the next few months (yellow arrows), and the remaining chapters will follow thereafter.
One other point regarding the communication category: we explicitly did not include bibliometrics as a key priority area for our metrics development. We heard from grantees that bibliometrics may be less applicable to the PEPH community because the grantees don’t always publish materials in traditional peer-reviewed academic journals. When available, these may provide appropriate metrics, but we wanted to develop metrics that went beyond those appropriate for academic journals.

10 Panelists/Discussants
Partnerships Chapter – Ashley Brenner, STPI
Education and Training Chapter – Helena Davis
Partnerships in the Real World – Johnnye Lewis, University of New Mexico
Evaluating Worker Training Programs – Tom McQuiston, United Steelworkers
Discussion – Gretchen Jordan, Sandia National Laboratories

