Begin at the Beginning: introduction to evaluation



It all depends… Educational evaluation methods differ depending upon:
The mission of the program
The stakeholders
The money available to perform the evaluation
The purpose of the evaluation
The target audience for the report

Effective Evaluations Consider:
Who is served
– Target Population
What services are provided
– The treatment / the program / the intervention
Who has an interest in the success of the program
– Stakeholders

Effective Evaluations Consider:
The purpose of the program
– Mission / Goals / Measurable Objectives
How services are typically delivered
– Treatment Implementation Fidelity
– Service Delivery Cycle

Effective Evaluations Consider:
Why the evaluation is being conducted
– Purpose of Evaluation
The target audience for the evaluation
– Target Audience
– Final Report

The Essence of Evaluation
Determining the worth or value of a program.
Conducting a context-specific interpretation of what is happening with a program in a real-world setting.
Making causal attributions about effects.

Thinking and Doing…
There are many evaluation models.
Independent of the model, evaluation is an intervention in itself.
Evaluators have an impact on the program.

The Impact of Evaluation
Help define the purpose of the evaluation and the target audience of stakeholders for the results.
Conduct process evaluations that document delivery of the program as well as implementation fidelity.

The Impact of Evaluation
Help programs recognize the usefulness of evaluation as a source of feedback and guidance for program improvement and development, including making programs aware of national quality standards.
Help identify a program's stage of development / organizational maturity.

The Impact of Evaluation
Help programs recognize the role of evaluation in measuring program impact – and in selling program impact.
Propose reasonable methods that fit the purpose and target audience, such as surveys, observational measures, analysis of test scores, and focus groups.

The Impact of Evaluation
Propose reasonable use of comparative strategies where appropriate, such as control groups, multiple measures over time, comparison conditions, and randomization.
Help programs recognize the importance of setting specific objectives by which the program can be evaluated.
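As a minimal illustration of the randomization strategy mentioned above, the sketch below randomly splits a participant pool into equal-sized treatment and control groups. The participant names and group sizes are invented for the example; a fixed seed is used only so the assignment can be reproduced and audited.

```python
import random

def assign_groups(participants, seed=None):
    """Randomly split participants into equal-sized treatment and control groups."""
    rng = random.Random(seed)  # seeded RNG so the assignment is reproducible
    pool = list(participants)
    rng.shuffle(pool)
    midpoint = len(pool) // 2
    return pool[:midpoint], pool[midpoint:]  # (treatment, control)

treatment, control = assign_groups([f"participant_{i:02d}" for i in range(20)], seed=42)
print(len(treatment), len(control))  # 10 10
```

Recording the seed alongside the roster makes the assignment verifiable after the fact, which matters when stakeholders ask how groups were formed.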

The Impact of Evaluation
Mission – Goals – Measurable Objectives.
Help programs outline how specific indicators can be tied to each objective.
Help programs understand how to fully specify the desired outcomes in terms of how they would be measured.
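One way to make the mission–goals–objectives–indicators chain concrete is to write it down as structured data, so every objective carries its own measurable indicators. The program, objective, and indicators below are entirely hypothetical; the point is the structure, not the content.

```python
# Hypothetical program logic: each objective lists the indicators used to measure it.
program_logic = {
    "mission": "Improve early-grade literacy",
    "goals": ["Students read at grade level by the end of grade 3"],
    "objectives": [
        {
            "objective": "Increase reading fluency by 20% over the school year",
            "indicators": [
                "Words correct per minute on fall and spring benchmarks",
                "Share of students meeting the fluency target",
            ],
        },
    ],
}

# Every objective should have at least one indicator tied to it.
for obj in program_logic["objectives"]:
    print(obj["objective"], "->", len(obj["indicators"]), "indicator(s)")
```

A structure like this doubles as an evaluation checklist: an objective with an empty indicator list is, by construction, unmeasurable.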

The Impact of Evaluation
Help programs outline realistic indicators that are closely tied to the actual program or intervention, rather than overly lofty or unrealistic expectations of broad program impact – the "world peace" problem.

The Impact of Evaluation
Address the reliability, validity, and cultural sensitivity of the outcome measures in the context of the specific target population.
Help programs understand the use of multiple data sources and indicators, ideally more than just state test scores.
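One standard reliability statistic for a multi-item outcome measure (e.g., a survey scale) is Cronbach's alpha, the ratio-based internal-consistency estimate. The sketch below is a minimal from-scratch version; the sample responses are invented for illustration.

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a list of respondents, each a list of k item scores."""
    k = len(scores[0])  # number of items on the scale

    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [sample_var([row[i] for row in scores]) for i in range(k)]
    total_var = sample_var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Perfectly consistent items yield alpha = 1.0 (invented two-item, three-respondent data).
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # 1.0
```

In practice one would compute this with an established statistics package, but even the hand calculation shows why a scale whose items disagree with each other cannot support strong claims about the outcome it is meant to measure.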