Several Evaluation Theories and Methods Reference: Foundations of Program Evaluation by Shadish, Cook, and Leviton (1991)

Components of a Good Evaluation Theory (Five Components)
- Social Programming: the activities of the social program (example: Alterna loan disbursement)
- Knowledge Construction: how the evaluator obtains, understands, and constructs knowledge; the methodology of the study
- Valuing: what the values are for the evaluator; whether or not to make value judgments (good or bad)
- Use: how evaluators produce results that are useful for social problem solving; utilization of evaluation findings
- Practice: how and when the evaluation should be done; what the purpose of the evaluation is; what types of questions should be asked

Different Evaluation Theories (Eight)
1. Policy Research Evaluation (Weiss)
2. Evaluation for Program Improvement (Wholey)
3. Theory-Driven Evaluation (Rossi)
4. Responsive Evaluation & Qualitative Methods (Stake)
5. Methodologist: The Experimenting Society (Campbell)
6. The Science of Valuing (Scriven)
7. Functional Evaluation Design (Cronbach)
8. Participant-Oriented Model (Fitzpatrick)

Description of Different Evaluation Theories (discussing only three)

Policy Research Evaluation (formative evaluation)
- Follows a logic model: explains inputs, implementation, intervening factors, and immediate and long-term outcomes
- Contributes to improving future programming
- Management oriented
- Uses internal evaluators
Criticisms
- Internal evaluators may be biased, protecting themselves from organizational displeasure
- Evaluation suggestions often fail to improve the program
- Organizations resist change
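The logic-model chain described above can be sketched as a simple ordered data structure. This is an illustrative example only, not part of the original lecture; the sample entries for the Alterna micro-loan program are assumptions:

```python
# Illustrative sketch of a program logic model as an ordered chain of stages.
# The stage names follow the slide; the sample entries are hypothetical.
logic_model = {
    "inputs": ["loan capital", "staff time"],
    "implementation": ["disburse micro-loans", "borrower training"],
    "intervening_factors": ["local economy", "borrower circumstances"],
    "immediate_outcomes": ["loans repaid on schedule"],
    "long_term_outcomes": ["sustained household income growth"],
}

def describe(model):
    """Render the causal chain from inputs through to long-term outcomes."""
    return " -> ".join(model.keys())

print(describe(logic_model))
# inputs -> implementation -> intervening_factors -> immediate_outcomes -> long_term_outcomes
```

A plain dict suffices here because Python 3.7+ preserves insertion order, so the stages print in causal sequence.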

Evaluation for Program Improvement (summative evaluation)
- Compares actual program performance with standards of expected program performance
- Measures successes and failures in meeting the nation's goals
- Draws conclusions about program effectiveness to inform decisions
- Uses an external evaluator
- Involves multiple stakeholders
Criticisms
- Failure to meet the nation's goals
- Depends on external funding
- Questions about the evaluator's objectivity
- Controlled by management
- Result oriented; does not ensure evaluation use
- Needs activities to facilitate evaluation use
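The summative comparison of actual performance against expected standards can be illustrated with a minimal sketch; the indicator names and numbers below are hypothetical, not from the lecture:

```python
# Illustrative sketch of a summative check: compare actual program performance
# against standards of expected performance. Values are hypothetical.
standards = {"repayment_rate": 0.95, "borrowers_served": 500}
actual = {"repayment_rate": 0.97, "borrowers_served": 430}

def assess(actual, standards):
    """Mark each indicator as a success or failure against its standard."""
    return {name: ("success" if actual[name] >= target else "failure")
            for name, target in standards.items()}

print(assess(actual, standards))
# {'repayment_rate': 'success', 'borrowers_served': 'failure'}
```

The point of the sketch is that summative evaluation reduces each indicator to a pass/fail judgment against a pre-set standard, which is what makes it result oriented.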

Theory-Driven Evaluation
- Uses program theory as a tool for understanding the program being evaluated and for guiding the evaluation
- Prefers quantitative techniques
- Example: a Community Economic Development (CED) conceptual framework is used to evaluate the Alterna micro-loan program
- The model guides question formation and data collection
Criticisms
- Skeptical of qualitative methods
- Must choose the conceptual framework that fits best
- Works from hypotheses
- Aims at comprehensive evaluation, but limited budget, time, and other resources hamper the study

Participant-Oriented Model
- The evaluator is personally involved in the program
- The evaluator involves many stakeholders
- Uses both internal and external evaluators
- Involves other stakeholders in both program delivery and program evaluation
- A good model for involving people
Criticisms
- Lengthy process
- Difficult to coordinate all stakeholders
- Each institution has its own priorities

Activity
In the context of your own project, which model do you think works best for you, and why? Discuss with your partner and share your ideas with classmates.
- Policy Research Evaluation (formative) (Weiss): Yes / No
- Evaluation for Program Improvement (summative) (Wholey): Yes / No
- Theory-Driven Evaluation (Rossi): Yes / No
- Experimental & Quasi-Experimental Evaluation (Campbell): Yes / No
- Others: Yes / No

Thank You