EVAL 6000: Foundations of Evaluation
Dr. Chris L. S. Coryn
Kristin A. Hobson
Fall 2011

Agenda
- Review
- Activity 2
- Lecture
  – History of evaluation
  – Basic principles and core concepts
  – Shadish, Cook, & Leviton's (1991) five principles of program evaluation theory
- Questions and discussion
- Encyclopedia of Evaluation entries

Review

General Logic (Scriven)
1. Establish criteria
2. Construct standards
3. Measure performance and compare to standards
4. Synthesize into a judgment of merit or worth

Working Logic (Fournier)
- Problem: Extent of performance
- Phenomenon: Functional product
- Question: Is X a good/less good one of its type?
- Claim: Performance/value
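To make the general logic concrete, here is a minimal Python sketch that walks through the four steps for a hypothetical product (a chocolate chip cookie, anticipating Activity 2 below). All criteria, standards, and scores are invented for illustration; they are not from the course materials.

```python
# Sketch of Scriven's general logic of evaluation, applied to a
# hypothetical chocolate chip cookie. All values are illustrative.

# Step 1: Establish criteria of merit.
criteria = ["taste", "texture", "appearance"]

# Step 2: Construct standards (here, a minimum acceptable score per
# criterion on a 1-10 scale).
standards = {"taste": 7, "texture": 6, "appearance": 5}

# Step 3: Measure performance and compare to standards.
observed = {"taste": 8, "texture": 6, "appearance": 4}  # hypothetical ratings
meets = {c: observed[c] >= standards[c] for c in criteria}

# Step 4: Synthesize the comparisons into a judgment of merit or worth.
# A deliberately simple rule: "good of its type" only if every standard
# is met; real syntheses often weight criteria instead.
if all(meets.values()):
    judgment = "good of its type"
else:
    missed = ", ".join(c for c, ok in meets.items() if not ok)
    judgment = f"falls short on: {missed}"

print(judgment)  # -> falls short on: appearance
```

The all-standards-must-be-met rule in step 4 is only one possible synthesis; weighted sums (compare the Quantitative Weight and Sum entry listed at the end of this lecture) are a common alternative.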

Activity 2
Evaluating chocolate chip cookies using evaluation logic

Activity 2 discussion questions
1. How would you describe your experience in establishing criteria? What were some of the things you discussed with your group?
2. How did you determine what standards to use? Was your group in agreement with the standards? How did you reconcile differences?
3. How comfortable are you with your final judgment about which cookie was the best and which you would recommend?

Historical evolution of evaluation

"In the beginning God created the heaven and the earth, then God stood back, viewed everything made, and proclaimed, 'Behold, it is very good.' And the evening and the morning were the sixth day. And on the seventh day God rested from all work. God's archangel came then, asking, 'God, how do you know that what you have created is "very good"? What are your criteria? On what data do you base your judgment? Just what results were you expecting to attain? And aren't you a little close to the situation to make a fair and unbiased evaluation?' God thought about these questions all that day, and God's rest was greatly disturbed. On the eighth day God said, 'Lucifer, go to hell.' Thus was evaluation born in a blaze of glory."
— Michael Q. Patton

Ancient Practice, New Discipline
- Arguably, evaluation is the single most important and sophisticated cognitive process in the repertoire of human reasoning and logic
- It is a natural, evolutionary process without which we would not survive
- Earliest known examples
  – Product evaluation
  – Personnel evaluation

Early History in the United States
- Tyler's national "Eight-Year Study"
  – Involved 30 secondary schools and 300 colleges and universities and addressed narrowness and rigidity in high school curricula
- Mainly educational assessments during the 1950s and early 1960s, conducted by social scientists and education researchers

Early History in the United States
- Johnson's "War on Poverty" and "Great Society" programs of the 1960s
  – Head Start, Follow Through
- Evaluation clause in the Elementary and Secondary Education Act (ESEA)
  – Evaluation became part of every federal grant

Toward Professionalization
- Two U.S.-based professional evaluation organizations emerged in the mid-1970s
  – Evaluation Network (E-Net)
  – Evaluation Research Society (ERS)
- In 1986, the two merged to form what is now the American Evaluation Association (AEA)

Growing Concerns for Use
Through the 1970s and 1980s, growing concerns were voiced about the utility of evaluation findings in general, and about the use of experimental and quasi-experimental designs in particular

Decreased Emphasis
- In the 1980s, huge cuts in social programs resulted from Reagan's emphasis on less government involvement
- The requirement for evaluation was removed from, or lessened in, many federal programs during this period
- During the 1980s, many school districts, universities, private companies, state departments of education, the Federal Bureau of Investigation (FBI), the Food and Drug Administration (FDA), and the General Accounting Office (GAO) developed internal evaluation units

Increased Emphasis
- In the 1990s, there was an increased emphasis on government program accountability and on organizations' efforts to be lean, efficient, global, and more competitive
- Evaluation was conducted not only to meet government accountability requirements but also to enhance effectiveness
- In addition, it was during this period that an increasing number of foundations created internal evaluation units, provided support for evaluation activities, or both

Recent Milestones
- The 2001 reauthorization of ESEA as the No Child Left Behind (NCLB) Act is considered the most sweeping federal education reform since 1965
  – It has redefined the federal role in K-12 education by focusing on closing the achievement gap between disadvantaged and minority students and their peers
- NCLB has had a profound influence on evaluation design and methods by emphasizing the use of randomized controlled trials (RCTs)
- To this day, the RCT debate is one of the most pervasive in evaluation

Professionalization
- By 2010, there were more than 65 national and regional evaluation organizations throughout the world, most in developing countries
- Although specialized training programs have existed for several decades, graduate degree programs in evaluation have emerged only recently, including in:
  – Australasia
  – Africa
  – Canada
  – Central America
  – Europe (not every country)
  – Japan
  – Malaysia
  – United Kingdom

Definition
Evaluation is the act or process of determining the merit, worth, or significance of something, or the product of that process
- Merit: Intrinsic quality; apart from context and costs
- Worth: Synonymous with value; quality under consideration of costs and context
- Significance: Synonymous with importance; merit and worth in a specific context

Competing Definitions
- Evaluation is "the use of social science research procedures to systematically investigate the effectiveness of social intervention programs" (Rossi, Freeman, & Lipsey)
- Proponents of theory-driven evaluation approaches characterize evaluation as explaining "how and why programs work, for whom, and under what conditions"

Competing Definitions
- Advocates of the empowerment evaluation movement portray evaluation as "the use of evaluation concepts and techniques that foster self-determination"
- The Organisation for Economic Co-operation and Development (OECD) designates evaluation as "the systematic and objective assessment of an on-going or completed project, programme or policy, its design, implementation and results…the aim is to determine the relevance and fulfillment of objectives, development efficiency, effectiveness, impact and sustainability"

Purposes
- Formative: To improve
- Summative: To inform decision making
- Developmental/proformative: To help develop an intervention or program; ongoing formative
- Accountability: To hold accountable; usually falls under summative
- Monitoring: To assess implementation and gauge progress toward a desired end
- Knowledge generation: To generate knowledge about general patterns of effectiveness
- Ascriptive: Merely for the sake of knowing

Functional Forms
- Process evaluation: Assessment of everything that occurs prior to true outcomes
- Outcome evaluation: Assessment of an evaluand's effects
- Cost evaluation: Assessment of monetary and non-monetary costs, direct and indirect costs, and actual and opportunity costs
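The cost distinctions above amount to a small classification scheme, which the following minimal Python sketch makes explicit. The cost items and dollar figures are invented for illustration; the non-monetary cost is given a hypothetical shadow price so it can be tallied at all.

```python
# Sketch of the cost categories named above: monetary vs. non-monetary,
# direct vs. indirect, and actual vs. opportunity. All figures invented.
from dataclasses import dataclass

@dataclass
class Cost:
    label: str
    monetary: bool   # monetary vs. non-monetary
    direct: bool     # direct vs. indirect
    actual: bool     # actual vs. opportunity
    amount: float    # dollars if monetary, else a hypothetical shadow price

program_costs = [
    Cost("staff salaries", monetary=True, direct=True, actual=True, amount=50_000),
    Cost("facility overhead", monetary=True, direct=False, actual=True, amount=12_000),
    Cost("participant time", monetary=False, direct=True, actual=True, amount=9_000),
    Cost("forgone alternative program", monetary=True, direct=True, actual=False, amount=8_000),
]

# A cost evaluation tallies each side of a distinction separately, e.g.:
actual_total = sum(c.amount for c in program_costs if c.actual)
opportunity_total = sum(c.amount for c in program_costs if not c.actual)
print(actual_total, opportunity_total)  # -> 71000 8000
```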

The 7 "P"s
- Program evaluation
- Policy analysis
- Personnel evaluation
- Portfolio evaluation
- Product evaluation
- Performance evaluation
- Proposal evaluation

Uses and Misuses
A two-by-two scheme crossing use/non-use with legitimate use/misuse:
1. Ideal Use (use, legitimate): Instrumental Use, Conceptual Use, Persuasive Use
2. Misuse (use, illegitimate): Mistaken Use (incompetence, uncritical acceptance, unawareness); Mischievous Use (manipulation, coercion); Abuse (inappropriate suppression of findings)
3. Unjustified Non-Use (non-use, illegitimate)
4. Justified Non-Use (non-use, legitimate): Rational Non-Use, Political Non-Use

Professional Standards
- Utility
- Feasibility
- Propriety
- Accuracy
- Evaluation Accountability

Shadish, Cook, & Leviton's Elements of "Good Theory for Social Program Evaluation"
1. Social programming: Ways that social programs and policies develop, improve, and change, especially in regard to social problems
2. Knowledge construction: Ways researchers/evaluators construct knowledge claims about social programs
3. Valuing: Ways values can be attached to programs
4. Knowledge use: Ways social science information is used to modify programs and policies
5. Evaluation practice: Tactics and strategies evaluators follow in their professional work, especially given the constraints they face

Encyclopedia Entries
Bias, Causation, Checklists, Conceptual Use, Consumer, Effectiveness, Efficiency, Epistemology, Evaluation Use, Experimental Design, Experimental Society, Impartiality, Independence, Instrumental Use, Intended Uses, Judgment, Merit, Modus Operandi, Ontology, Outcomes, Paradigm, Positivism, Postpositivism, Process Use, Quantitative Weight and Sum, Recommendations, Synthesis, Transdiscipline, Treatments, Worth