EVAL 6000: Foundations of Evaluation Dr. Chris L. S. Coryn Nick Saxton Fall 2014.

Agenda
– American Evaluation Association (AEA) conference experience(s)
– Improvement- and accountability-oriented evaluation approaches
– Social agenda and advocacy evaluation approaches
– Eclectic approaches

NOTE: If we do not get through all of this material tonight, we will continue next week…

AEA experience

What was your most ‘profound’ experience at the conference?
What did you learn (or not learn that you expected to)?
What did you like most?
What did you like least?

Improvement- and accountability-oriented approaches

(Ideally) are expansive and cover all aspects related to the merit, worth, and significance of a program or other type of evaluand
Intended to inform program improvement and/or decision making (i.e., accountability)
Extremely difficult to implement in practice
– Stufflebeam
– Scriven

Approach 15: Decision- and accountability-oriented studies
Advance organizers – Relevant decision makers, decisions to be made, accountability requirements, etc.
Purposes – Decision making, whether formative or summative
Sources of questions – Stakeholders (typically limited)
Questions
– How can the evaluand be improved?
– Does the evaluand meet accountability requirements?

Methods – Any relevant method for determining how to improve a program or provide information regarding accountability
Pioneers – Lee Cronbach, Daniel Stufflebeam
Use considerations – Generally directed toward program management and staff
Strengths – Scope
Weaknesses – Emphasis on program management and staff
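Approaches 15 through 23 are all summarized along the same dimensions: advance organizers, purposes, sources of questions, questions, methods, pioneers, use considerations, strengths, and weaknesses. As a minimal sketch of that recurring comparison grid (in Python, with class and field names that are illustrative assumptions, not taken from the slides), the schema could be captured as a small record type:

```python
from dataclasses import dataclass
from typing import List


@dataclass
class EvaluationApproach:
    """One entry in the comparison grid used for Approaches 15-23."""
    name: str
    advance_organizers: List[str]
    purposes: List[str]
    question_sources: List[str]
    questions: List[str]
    methods: str
    pioneers: List[str]
    use_considerations: List[str]
    strengths: List[str]
    weaknesses: List[str]


# Approach 15, transcribed from the slides above
decision_accountability = EvaluationApproach(
    name="Decision- and accountability-oriented studies",
    advance_organizers=["Relevant decision makers", "Decisions to be made",
                        "Accountability requirements"],
    purposes=["Decision making, whether formative or summative"],
    question_sources=["Stakeholders (typically limited)"],
    questions=["How can the evaluand be improved?",
               "Does the evaluand meet accountability requirements?"],
    methods="Any relevant method",
    pioneers=["Lee Cronbach", "Daniel Stufflebeam"],
    use_considerations=["Generally directed toward program management and staff"],
    strengths=["Scope"],
    weaknesses=["Emphasis on program management and staff"],
)
```

Filling in one such record per approach makes the cross-approach comparisons in the remaining slides easy to tabulate side by side.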

Approach 16: Consumer-oriented studies
Advance organizers – Complex conceptions of ‘values’ (i.e., criteria)
Purposes – Typically, but not always, summative; extent to which consumers’ needs are met
Sources of questions – Relevant sources of values (see Scriven’s Key Evaluation Checklist [KEC])
Questions – How well does the evaluand meet consumers’ needs?

Methods – Any relevant method for determining how well consumers’ needs have been met
Pioneers – Michael Scriven
Use considerations – Emphasis is on instrumental use
Strengths – Independence
Weaknesses – Independence

Approach 17: Accreditation and certification
Advance organizers – Accreditation or certification requirements
Purposes – Determining whether accreditation or certification requirements are met
Sources of questions – Accreditation or certification requirements
Questions – Have accreditation or certification requirements been met?

Methods – Any relevant method for determining whether accreditation or certification requirements have been met
Pioneers – Various
Use considerations – Accordance with accepted standards
Strengths – For making informed judgments
Weaknesses – Emphasis on inputs and processes

Social agenda and advocacy approaches

Predominantly aimed at increasing social justice
Giving power to marginalized/disenfranchised groups
Tend to favor ‘qualitative’ methods and a constructivist/constructionist perspective
Stress cultural pluralism, relativism, and multiple realities
Place only a small emphasis on determining merit, worth, or significance

Approach 18: Responsive or stakeholder-centered evaluation
Advance organizers – Stakeholders’ concerns
Purposes – Vary widely, depending on stakeholders’ needs
Sources of questions – Stakeholders
Questions
– What has the program achieved?
– How well has the program been implemented?
– What do experts ‘think’ about the program?

Methods – Most often qualitative (document analysis, observations, interviews)
Pioneers – Robert Stake
Use considerations – Primary stakeholders
Strengths – Intended to address stakeholders’ concerns
Weaknesses – External credibility

Approach 19: Constructivist evaluation
Advance organizers – Rejects the notion that merit, worth, and significance can be ‘objectively’ determined
Purposes – To ‘construct’ stakeholders’ experiences
Sources of questions – Pluralistic values of stakeholders
Questions – Emergent

Methods – Predominantly qualitative
Pioneers – Egon Guba, Yvonna Lincoln
Use considerations – Often conflicting accounts and no definitive judgments
Strengths – Stakeholder involvement
Weaknesses – Time, costs, and, most importantly, relativist results

Approach 20: Deliberative democratic evaluation
Advance organizers – Democratic participation, dialogue, deliberation
Purposes – Democratic participation
Sources of questions – Stakeholders/evaluator(s)
Questions – What is the merit and/or worth of the program?

Methods – Any relevant methods for gathering evidence
Pioneers – Ernest (Ernie) House
Use considerations – Should represent the interests of all relevant stakeholders
Strengths – Democratic participation
Weaknesses – Ambitious demands required to execute the approach

Approach 21: Transformative evaluation
Advance organizers – Social justice and equity
Purposes – Recognizing the ‘situational’ nature of knowledge claims
Sources of questions – Least advantaged groups of stakeholders
Questions – None that can be identified in advance

Methods – Generally mixed-method
Pioneers – Donna Mertens
Use considerations – Inclusion of marginalized stakeholders
Strengths – Emphasis on social justice
Weaknesses – External credibility

Eclectic evaluation approaches

No particular philosophical or methodological orientation
Draw from various evaluation concepts and methods to secure ‘useful’ evaluation findings
Tend to invoke the Program Evaluation Standards

Approach 22: Utilization-focused evaluation
Advance organizers – Potential users and uses
Purposes – Providing the information necessary for intended uses by intended users
Sources of questions – Intended users, often determined through situational analysis
Questions – Specific questions articulated by intended users

Methods – Any relevant method for addressing intended users’ questions
Pioneers – Michael Patton
Use considerations – Situational, and tailored to intended users’ intended uses
Strengths – Maximizes evaluation impact
Weaknesses – Too much user input and involvement

Approach 23: Participatory evaluation
Advance organizers – Promoting buy-in and use
Purposes – Tends to be directed toward program improvement
Sources of questions – Intended users and other stakeholders
Questions – Specific questions articulated by intended users and other stakeholders

Methods – Any relevant method for addressing intended user and stakeholder questions
Pioneers – Brad Cousins
Use considerations – Situational, and tailored to intended users’ intended uses
Strengths – User-friendly
Weaknesses – Sometimes poor technical quality