 Welcome, introductions
 Conceptualizing the evaluation problem
  - Stakeholder interests
 Divergent and convergent processes
 Developing evaluation questions
 Program Logic Models
 Constructing the evaluation framework

 Program (non-evaluator) stakeholders
  - Individuals, groups, or organizations with a stake in the program or its evaluation
 Program structural and context considerations
  - Age, program theory, program organization, openness to change, consensus among stakeholders, micropolitics

 Evaluator-stakeholder relationships
  - External evaluation, internal evaluation, and collaborative evaluation (collaborative, participatory, empowerment)
 Relationships among non-evaluator stakeholders
  - Differential access to power, knowledge, and expertise
  - Conflicting value perspectives
 Resource considerations: fiscal resources, human resources, expertise (evaluation logic, program logic), time

WHO ARE THEY? WHAT ARE THEIR INTERESTS? WHOSE INTERESTS COUNT?
 Policy makers/sponsors
 Program developers
 Program administrators/managers
 Program implementers
 Intended program beneficiaries
 Special interest groups
 Others

 Identify reasons for initiating the evaluation: issues, questions of interest
 Identify contextual conditions: micropolitical analysis, evaluability assessment
 List stakeholders and specify their interests and relative importance
 Identify dimensions of performance (criteria) and standards
 Describe the object of the evaluation
 Tap multiple sources: stakeholders, models/frameworks, literature, professional standards, expert consultants, professional judgement

 Useful information from stakeholders:
  - Perceptions of the program
  - Program purposes/goals
  - Program theory
  - Concerns
  - Evaluation questions
  - Intended uses of the evaluation
  - Other stakeholders and their stakes

 Identify primary users and other stakeholders and their main interests, issues, or questions
 Obtain an initial statement of the problem from primary stakeholders
 Gather other stakeholders' takes on the problem
 Generate a list of rank-ordered issues
 Attend to potential uses and decision rules
 Identify constraints and modify evaluation issues as appropriate

 From the convergent process, identify and prioritize a feasible list of questions
 Implementation evaluation
  - Is the program being implemented as intended? Why or why not?
 Process evaluation
  - Which aspects or components of the program are most potent in affecting desired outcomes?
 Impact evaluation
  - Did the program meet its objectives? What unintended effects of the program can be observed?

“A tool for describing program theory and for guiding program measurement, monitoring, and management” (Rossi et al.)

 Needs: the raison d’être for the program; the problem to be solved
 Inputs: human, fiscal, and other resources (e.g., partnerships, infrastructure) needed to run the program
 Activities: all action steps needed to produce program outputs
 Outputs: goods and services generated by the program (necessary but insufficient conditions for realizing program outcomes; can be counted)
 Outcomes: linked to program objectives; short-term or immediate, intermediate, and long-term (observed changes)
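To make these elements concrete, the list above can be captured as a simple data structure. The following is a minimal sketch in Python; the class, field names, and the tutoring example are illustrative assumptions, not part of the presentation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    """Minimal sketch of the logic-model elements described above.
    Field names are illustrative, not taken from the slides."""
    needs: List[str] = field(default_factory=list)       # problem(s) the program addresses
    inputs: List[str] = field(default_factory=list)      # human, fiscal, and other resources
    activities: List[str] = field(default_factory=list)  # action steps that produce outputs
    outputs: List[str] = field(default_factory=list)     # countable goods and services delivered
    outcomes: List[str] = field(default_factory=list)    # short-term, intermediate, long-term changes

# Hypothetical example: an after-school tutoring program
tutoring = LogicModel(
    needs=["Low reading proficiency among grade 4 students"],
    inputs=["Volunteer tutors", "Classroom space", "Grant funding"],
    activities=["Recruit tutors", "Deliver twice-weekly tutoring sessions"],
    outputs=["Number of sessions delivered", "Number of students served"],
    outcomes=["Improved reading scores (short-term)",
              "Grade-level proficiency (intermediate)"],
)
```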

Logic model flow: Program need → Inputs/resources → Program activities → Outputs → Outcomes

‘Results Chain’ logic model:
Inputs (resources) → Activities → Outputs → Immediate outcomes (direct) → Intermediate outcomes (indirect) → Final outcome
- Area of control (internal to the organization): inputs, activities, and outputs; efficiency is assessed here
- Area of influence (external to the organization): immediate, intermediate, and final outcomes; effectiveness is assessed here, subject to external factors

 Activities (service utilization plan):
  - Recruitment
  - Induction
  - Delivery: regular program activities, streaming, supplemental activities
  - Terminal

 Matrix that associates evaluation questions with methods
 Elements (rows):
  - Evaluation questions
  - Indicators (how would we know?)
  - Data sources (from whom to gather evidence)
  - Methods (how evidence will be gathered)
  - Basis for comparison (how good is good enough?)

Example framework (two questions):
- Question: Implemented as intended?
  Indicators: instructional activities; scope & sequence
  Data sources: teachers; administrators
  Methods: observation; focus group; interview; document review
  Basis for comparison: program logic model; instructional plans
- Question: Objectives met?
  Indicators: student learning
  Data sources: students; teachers
  Methods: pre-/post-test; interviews
  Basis for comparison: baseline; external standard; comparison group
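If it helps to keep the framework machine-readable (for example, to share it with stakeholders as a spreadsheet), the matrix above can be represented as a short list of records. This is a minimal sketch in Python; the field names and the write_framework_csv helper are illustrative assumptions, not part of the presentation.

```python
import csv

# Each row of the evaluation framework matrix as a plain dictionary.
# Entries mirror the example table above; names are illustrative only.
FRAMEWORK = [
    {
        "question": "Is the program implemented as intended?",
        "indicators": ["Instructional activities", "Scope & sequence"],
        "data_sources": ["Teachers", "Administrators"],
        "methods": ["Observation", "Focus group", "Interview", "Document review"],
        "basis_for_comparison": ["Program logic model", "Instructional plans"],
    },
    {
        "question": "Were program objectives met?",
        "indicators": ["Student learning"],
        "data_sources": ["Students", "Teachers"],
        "methods": ["Pre-/post-test", "Interviews"],
        "basis_for_comparison": ["Baseline", "External standard", "Comparison group"],
    },
]

def write_framework_csv(rows, path="evaluation_framework.csv"):
    """Write the framework matrix to a CSV file, one question per row."""
    fieldnames = ["question", "indicators", "data_sources",
                  "methods", "basis_for_comparison"]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        for row in rows:
            # Join list-valued cells with semicolons for readability in a spreadsheet.
            writer.writerow({k: "; ".join(v) if isinstance(v, list) else v
                             for k, v in row.items()})

if __name__ == "__main__":
    write_framework_csv(FRAMEWORK)
```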