PPA 502 – Program Evaluation Lecture 2c – Process Evaluation.


Introduction
• You must figure out exactly what the program theory is: articulate it and make it explicit so that it can be examined and evaluated.
• Typically, a program theory will be implicit at best, which is why the evaluator must understand how the program is intended to operate.
• Program theory (like needs assessment) is foundational for the later types of evaluation in the RFL model (implementation assessment, impact assessment, and efficiency assessment).

Introduction: Overview of Program Theory
[Diagram: within the service arena, the target population and the program; the program's organizational plan (program facilities, personnel, activities); the service utilization plan (target interaction with the delivery system, program-target service interactions); and the impact theory linking proximal outcomes to distal outcomes.]

Program Theory
• A social program centers on the transactions that take place between a program's operations and the population it serves. Program theory has three parts:
– Program impact theory.
– Service utilization plan.
– Program organizational plan.

Program Theory
• Program impact theory.
– Assumptions about the change process started by the program and the improved conditions expected to result.

Program Theory
• Service utilization plan.
– The program's assumptions and expectations about how to reach the target population, provide and sequence service contacts, and conclude the relationship when services are no longer needed or appropriate.

Program Theory
• Program organizational plan.
– Program resources, personnel, administration, and general organization.
– The assumption is that these resources, organization, and activities will produce a viable organization that successfully implements the service utilization plan.
• Process theory.
– The service utilization plan and the organizational plan combined.

Process Evaluation
• Process evaluation assesses the success or failure of the program theory as it is put into practice.

Three Major Questions of Process Evaluation
• What is the program intended to be?
– Methods to develop and specify program components.
• What is delivered in reality?
– Methods for measuring program implementation.
• Why are there gaps between program plans and program delivery?
– Assessing influences on the variability of implementation.

Chain of Events Logic for Process and Impact Evaluation

Process evaluation to develop and specify program components
• A major function of process evaluation is to use data, while designing the intervention, to clarify and obtain agreement on what the intended program is.

Types of program components
• Intended recipients.
– Background characteristics appropriate for the program.
– Eligibility requirements.
– Recruitment mechanisms.
– Selection process.
• Intended context.
– Types of agencies that will deliver the program, and their characteristics.
– Community context.

Types of program components
• Intended delivery.
– Activities: Who does what? With whom?
– Staffing: What types, backgrounds, and skills will be used?
– Materials needed for delivery.
– Information and information systems.
• Intended scope of program.
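
A brief, illustrative aside (not from the lecture): one way to make the "intended recipients" component explicit is to write the eligibility requirements as a checkable rule. Everything in the sketch below is hypothetical, including the Applicant fields, the income threshold, and the list of served counties.

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    """Hypothetical applicant record used only for illustration."""
    age: int
    household_income: float
    county: str

# Hypothetical eligibility requirements for the intended recipients:
# ages 18-65, household income under $30,000, residence in a served county.
SERVED_COUNTIES = {"Adams", "Brown"}

def is_eligible(a: Applicant) -> bool:
    """Return True if the applicant meets all stated eligibility requirements."""
    return (
        18 <= a.age <= 65
        and a.household_income < 30_000
        and a.county in SERVED_COUNTIES
    )

print(is_eligible(Applicant(age=42, household_income=24_500, county="Adams")))  # True
print(is_eligible(Applicant(age=70, household_income=18_000, county="Adams")))  # False
```

Stating the requirements this precisely also makes the recruitment and selection process auditable later in the process evaluation.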

Criteria for measurable program components
• Specify activities as behaviors that can be observed rather than as goals or objectives.
• Ensure that each component is separate and distinguishable from other components so that each one can be separately measured.
• Explicitly link each component to its underlying theoretical rationale.
• Include all activities and materials intended for use in the intervention.
• Identify the aspects of the intervention that are intended to be adapted to the setting, and those that are intended to be delivered as designed.
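
As a hypothetical illustration of the first, third, and fifth criteria, a component can be recorded as an observable activity tied to its rationale and flagged as adaptable or fixed; the record structure and example components below are invented.

```python
from dataclasses import dataclass

@dataclass
class ProgramComponent:
    """Hypothetical record describing one intervention component."""
    name: str
    observable_activity: str  # stated as behavior that can be observed and counted
    rationale: str            # explicit link to the underlying program theory
    adaptable: bool           # True if sites may tailor it, False if delivered as designed

components = [
    ProgramComponent(
        name="Home visit",
        observable_activity="Nurse completes a 60-minute visit and files a visit form",
        rationale="Regular contact is assumed to build parenting skills",
        adaptable=False,
    ),
    ProgramComponent(
        name="Community referral",
        observable_activity="Staff member records a referral to a partner agency",
        rationale="Linkage to outside services addresses needs the program cannot meet",
        adaptable=True,
    ),
]

# Components written this way are separately measurable (the second criterion)
# and can be checked against delivery data later in the process evaluation.
for c in components:
    print(f"{c.name}: {c.observable_activity} (adaptable={c.adaptable})")
```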

Techniques for Program Specification
• Formative Evaluation.
– Data collected from pilot situations and recipients while developing an intervention, to obtain feedback about the feasibility of proposed activities and their fit with intended settings and recipients.
• Evaluability Assessment.
– A set of systematic processes for developing the underlying program theory and clarifying intended uses of data before initiating full-scale evaluation.
• Use of Theory to Aid Program Specification.
– Applying normative and causal theories relevant to the content area and using data to elucidate the underlying processes.

Data collection methods for formative evaluation
• Focus groups
• Observation
• Open-ended interviews
• Ethnographic analysis
• Message or form analysis
• Expert judgment
• Equipment trial

Process evaluation methods for measuring program implementation
• The program components are the basis for selecting or developing instruments to measure two key aspects of program delivery: the extent of implementation (number and quality of components delivered) and the scope of implementation (number of recipients reached and their characteristics).
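
A rough sketch of those two measures, assuming a hypothetical per-session delivery log (the field names and the 1-5 quality rating are invented for the example):

```python
from collections import Counter

# Hypothetical delivery log: one entry per component delivered in a session.
delivery_log = [
    {"site": "A", "component": "Home visit", "participant": "p01", "quality": 4},
    {"site": "A", "component": "Referral",   "participant": "p01", "quality": 5},
    {"site": "A", "component": "Home visit", "participant": "p02", "quality": 3},
    {"site": "B", "component": "Home visit", "participant": "p03", "quality": 2},
]
planned_components = {"Home visit", "Referral", "Group session"}

# Extent of implementation: how many planned components were delivered, and how well.
delivered = {row["component"] for row in delivery_log}
extent = len(delivered & planned_components) / len(planned_components)
mean_quality = sum(row["quality"] for row in delivery_log) / len(delivery_log)

# Scope of implementation: how many recipients were reached, and where.
participants = {row["participant"] for row in delivery_log}
sessions_by_site = Counter(row["site"] for row in delivery_log)

print(f"Extent: {extent:.0%} of planned components delivered, mean quality {mean_quality:.1f}")
print(f"Scope: {len(participants)} participants reached; sessions by site: {dict(sessions_by_site)}")
```

In practice the same log would also carry participant characteristics, so scope can be compared against the intended recipients.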

Uses of implementation data
• Monitor current activities to identify problems in implementation, then improve service delivery.
• Measure variability in program delivery for later statistical analyses of program impacts (see the sketch below).
• Use as dependent variables in assessing why delivery is or is not carried out as intended.
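
One way to act on the second bullet (not prescribed by the lecture) is to carry a site-level implementation measure into the impact analysis; the data below are invented, and the model is an ordinary least-squares sketch using numpy.

```python
import numpy as np

# Invented site-level data: treatment indicator, measured extent of
# implementation (0 = none, 1 = full), and an outcome score.
treatment      = np.array([1, 1, 1, 1, 0, 0, 0, 0], dtype=float)
implementation = np.array([0.9, 0.7, 0.5, 0.3, 0.0, 0.0, 0.0, 0.0])
outcome        = np.array([12.0, 10.5, 9.0, 8.0, 7.5, 7.0, 8.0, 7.2])

# Regress the outcome on treatment status and on how much of the program was
# actually delivered, so variability in delivery enters the impact analysis.
X = np.column_stack([np.ones_like(outcome), treatment, implementation])
coefs, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print(dict(zip(["intercept", "treatment", "implementation"], np.round(coefs, 2))))
```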

What to measure
• Number of participants reached and their characteristics.
• Number of program activity components and their characteristics.

How to measure implementation
• Use of technical equipment.
• Indirect unobtrusive measures.
• Direct observation.
• Activity or participation log.
• Organizational records.
• Written questionnaires.
• Telephone or in-person interviews.
• Case studies.

Problems in implementation measurement
• Single measures.
• Reliability and validity (see the reliability sketch below).
• Representativeness.
• Implementation should be explicitly linked to outcomes for impact assessment.
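
On the reliability point, a quick illustrative check for when two observers code whether each session element was delivered; the ratings are invented and Cohen's kappa is computed by hand:

```python
# 1 = element delivered, 0 = not delivered, for the same ten sessions.
obs_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
obs_b = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]

n = len(obs_a)
p_observed = sum(a == b for a, b in zip(obs_a, obs_b)) / n

# Chance agreement from each observer's marginal rate of coding "delivered".
p_a1, p_b1 = sum(obs_a) / n, sum(obs_b) / n
p_expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)

kappa = (p_observed - p_expected) / (1 - p_expected)
print(f"Percent agreement: {p_observed:.0%}, Cohen's kappa: {kappa:.2f}")
```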

Assessing influences on the variability of implementation
• The implemented program is likely to vary from original intentions.
– Why does this occur?
– How does it affect program effectiveness?
– What might be done to achieve fuller implementation?

Macro-implementation versus micro-implementation
• Macro-implementation: large-scale programs, with a focus on intergovernmental management.
– Decision points, competing priorities, interests and resources of the various actors, and diverse strategies for change.
• Micro-implementation: the extent of compatibility between the pre-existing organization and the new program requirements.

Influences on micro-implementation
• Organization as a whole.
– Types of decision-making processes used: centralized or participatory.
– Procedures and priorities for allocating resources.
– The supportiveness of an organization's overall culture toward programs of the type implemented.
– The types of pressures emanating from the organization's environment of beneficiaries, supporters, competitors, and regulators.

Influences on micro-implementation
• Structures and processes of an organization's work units.
– Expectations and performance feedback from the supervisors of each unit.
– Technologies in use and their fit to the new program.
– Standard operating procedures used to simplify and regulate work flow.
– Use of time and availability of time to learn activities for the new program.
– Social norms governing ways of working that have developed within a unit.
– Communication processes both within and between units.

Influences on micro-implementation
• The actions of individual deliverers ("street-level bureaucrats").
– Their own capabilities or skills in using the innovation.
– Extent and types of training they are given for delivering the new program.
– Their concerns about how it affects them personally, such as its possible effects on their future employment and advancement.
– External incentives and internal motivations to learn and to deliver the new program.

Methods for Data Collection and Analysis
• Case studies.
• Systematic surveys of organizational members.

Suggested roles for process evaluation
• The usefulness of evaluation for improving program management and delivery will be substantially increased by more emphasis on data about program processes.
• Evaluations of the impact of program interventions should always include measures of the extent of program delivery.

Suggested roles for process evaluation (contd.)
• Understanding of program impacts, whether or not the desired change occurred, will be greatly strengthened by process evaluation data.
• The more that variation in program delivery is expected among multiple sites, the greater the need for process evaluation.
• The larger the scale of an evaluation study, in number of sites and participants, the greater the need to measure the extent and processes of intervention delivery.