EVAL 6000: Foundations of Evaluation Dr. Chris L. S. Coryn Nick Saxton Fall 2014.

Agenda Evaluation theory Standards for program evaluations Background for assessing evaluation approaches Pseudoevaluations Brief activity

Evaluation theory

General features of evaluation theory Unlike social science theories, which are – A set of interrelated constructs, definitions, and propositions that present a systematic view of phenomena by specifying relations among variables, with the purpose of explaining and predicting phenomena

Evaluation theories – Describe and prescribe what evaluators do or should do when conducting evaluations – They specify such things as evaluation purposes, users, and uses, who participates in the evaluation process and to what extent, general activities or strategies, method choices, and roles and responsibilities of the evaluator, among others – Largely, such theories are normative in origin and have been derived from practice, rather than being theories that are put into practice

Theory’s role Has contributed to continuous development of evaluation – Many positive developments – Many negative developments Allows for a large variety of diverse, often conflicting ideologies

Research on evaluation Any purposeful, systematic, empirical inquiry intended to test existing knowledge, contribute to existing knowledge, or generate new knowledge related to some aspect of evaluation processes or products, or evaluation theories, methods, or practices

Shadish, Cook, and Leviton’s criteria for theories of program evaluation Social programming Knowledge construction Valuing Use Practice

Miller’s standards for empirical examinations of evaluation theories Operational specificity Range of application Feasibility in practice Discernable impact Reproducibility

Standards for program evaluations

The need for standards So that members of a profession provide competent, ethical, and safe services To ensure high-quality services and protect the interests of the public So that evaluators deliver sound, useful evaluation services

Existing standards Joint Committee’s The Program Evaluation Standards The American Evaluation Association’s (AEA) Guiding Principles for Evaluators U.S. Government Accountability Office’s (GAO) Government Auditing Standards

Joint Committee standards Utility Feasibility Propriety Accuracy Evaluation accountability

AEA guiding principles Systematic inquiry Competence Integrity/honesty Respect for people Responsibilities for general and public welfare

GAO (general) auditing standards Independence Professional judgment Competence Quality control and assurance

Background for assessing evaluation approaches

Why study evaluation approaches? To help evaluators and clients identify, avoid, or expose misleading or corrupt studies There is no one ‘right way’ of conducting evaluation – To understand the strengths and weaknesses of evaluation approaches and the circumstances under which each is appropriately applied

Pseudoevaluations

Background Pseudoevaluations sometimes falsely characterize constructive efforts—such as providing evaluation training or capacity building—as sound evaluation Some are conducted for corrupt, hidden purposes Others are motivated by political or profit motives In general, these approaches threaten the credibility and integrity of evaluation

Approach 1: Public relations studies Use data to convince constituents that a program is sound and effective Avoid gathering or releasing negative findings Typically use surveys from biased samples Only positive findings are disseminated

Approach 2: Politically controlled studies Conducted with the intent to – Withhold the full set of findings from right-to-know audiences – Violate agreements to fully disclose findings – Selectively release findings Often, decisions are ‘predetermined’ and the evaluation is used as justification

Approach 3: Pandering evaluations Caters to predetermined evaluation conclusions Places evaluator in a favored position to conduct future evaluations Ultimately, both the client and evaluator act together in producing ‘favorable’ findings

Approach 4: Evaluation by pretext Client intentionally deceives the evaluator as to the true intent of the evaluation – Note: It is the evaluator’s responsibility to investigate the ‘true’ intentions of the client Often emphasizes negative aspects of a program (e.g., so that information can be used to ‘improve’ a program) to support a client’s predetermined decisions

Approach 5: Empowerment under the guise of evaluation Cedes authority over the evaluation to groups that are likely not competent to conduct sound evaluation Gives power and authority to such groups to prepare evaluation reports while claiming that an independent evaluator prepared them Essentially, the approach is directed toward helping groups gain ‘power’

Approach 6: Customer feedback evaluation Supposedly independent ratings and reviews of products and services provided by consumers Whether products and services were actually purchased by those providing ratings and reviews is often unknown Often the ‘vendor’ manufactures ratings and reviews to increase sales

Activity

Discuss the rationale that supports the following statement: “Evaluators should not lend their name and endorsement to evaluations presented by their clients that misrepresent the full set of relevant findings”

Encyclopedia Entries: Bias; Causation; Checklists; Chelimsky, Eleanor; Conflict of Interest; Countenance Model of Evaluation; Critical Theory Evaluation; Effectiveness; Efficiency; Empiricism; Independence; Evaluability Assessment; Evaluation Use; Fournier, Deborah; Positivism; Relativism; Responsive Evaluation; Stake, Robert; Thick Description; Utilization of Evaluation; Weiss, Carol; Wholey, Joseph