Presentation transcript:

• Systematic determination of the quality or value of something (Scriven, 1991)
• What can we evaluate?
  • Projects, programs, or organizations
  • Personnel or performance
  • Policies or strategies
  • Products or services
  • Proposals, contract bids, or job applications
  • Almost everything in our daily life, because before you make a decision, you evaluate first. Comparison is a kind of evaluation.

Scriven, M. (1991). Evaluation thesaurus (4th ed.). Newbury Park, CA: Sage.

• Evaluand: that which is being evaluated (e.g., a program, policy, project, product, service, or organization; almost anything)
• In personnel evaluation, the term is "evaluee"

• Evaluation is for:
  • Finding areas for improvement
  • Generating an assessment of overall quality
  • Answering the question of "merit" or "worth" (Scriven, 1991)
• Merit is the "intrinsic" value of something = "quality"
• Worth is the value of something to an individual, an organization, or an institution (contextualized merit) = "value"

• Accountability evaluation
  • It is important to conduct an independent evaluation
  • i.e., nobody on the evaluation team should have a significant vested interest in whether the results are good or bad

• Step 1: understanding the basics of evaluation (ch1)
• Step 2: defining the main purposes of the evaluation and the "big picture" questions that need answers (ch2)
• Step 3: identifying the evaluative criteria (ch3)
• Step 4: organizing the list of criteria and choosing sources of evidence (collecting data) (ch4)
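Steps 3 and 4 boil down to a structured list of criteria, each paired with the evidence that will be collected for it. A minimal sketch of such a criteria plan in Python (every criterion and evidence source here is hypothetical, invented for illustration):

    # Hypothetical Step 3/4 criteria plan: each evaluative criterion
    # is paired with the sources of evidence used to score it.
    criteria_plan = [
        {"criterion": "usability", "evidence": ["user survey", "task observation"]},
        {"criterion": "content quality", "evidence": ["expert review", "user feedback"]},
        {"criterion": "accessibility", "evidence": ["accessibility audit"]},
    ]

    for row in criteria_plan:
        print(row["criterion"], "->", ", ".join(row["evidence"]))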

• Step 5: analyzing data
  • Dealing with the causation/correlation issue (what causes what, and why) to avoid "subjectivity" (ch5+6)
  • Importance weighting: weight the results (ch7)
  • Merit determination: how well your evaluand has done on the criteria (good? unacceptable?) (ch8)
  • Synthesis methodology: systematic methods for condensing evaluation findings (ch9)
  • Statistical analysis: Salkind (2007)
(A sketch of weighting, merit determination, and synthesis follows below.)
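To make importance weighting, merit determination, and synthesis concrete, here is a minimal Python sketch. All criteria, weights, scores, and cutoffs are hypothetical numbers invented for illustration; the KEC does not prescribe them, and weighting can also be done qualitatively.

    # Each criterion: (importance weight 1-5, merit score 1-5).
    # All numbers are hypothetical.
    criteria = {
        "usability": (5, 4.0),
        "content quality": (4, 3.5),
        "accessibility": (3, 2.0),
    }

    def synthesize(criteria):
        """Condense per-criterion scores into one weighted average (ch9)."""
        total_weight = sum(w for w, _ in criteria.values())
        return sum(w * s for w, s in criteria.values()) / total_weight

    def merit_label(score):
        """Map a numeric score to a merit category (hypothetical cutoffs, ch8)."""
        if score >= 4.0:
            return "good"
        if score >= 3.0:
            return "acceptable"
        return "unacceptable"

    overall = synthesize(criteria)
    print(f"overall: {overall:.2f} -> {merit_label(overall)}")  # 3.33 -> acceptable

One caveat: a plain weighted average can mask a disqualifying failure on a single criterion, so in practice a minimum acceptable level per criterion is usually checked before synthesizing.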

• Step 6: results
  • Putting it all together: fitting the pieces into the KEC framework (ch10)
• Step 7: feedback
  • Meta-evaluation: how to figure out whether your evaluation is any good (ch11)

I. Executive Summary
II. Preface
III. Methodology
1. Background & Context
2. Descriptions & Definitions
3. Consumers
4. Resources
5. Values
6. Process Evaluation
7. Outcome Evaluation
8 & 9. Comparative Cost-Effectiveness
10. Exportability
11. Overall Significance
12. Recommendations & Explanations
13. Responsibilities
14. Reporting & Follow-up
15. Meta-evaluation

• Identify the evaluand
• Background and context of the evaluand
  • Why did this program or product come into existence in the first place?
• Descriptions and definitions
  • Describe the evaluand in enough detail that virtually anyone can understand what it is and what it does
  • How: collect background information, pay a firsthand visit, or review the literature

• Some tips before you start
  • Make sure that your evaluand is not difficult to access (e.g., a distant location or an inanimate object)
  • Make your evaluand a clearly defined group (avoid abstract and complex systems)
  • Avoid political ramifications (e.g., assessing your boss's pet project or the university administration)
  • Avoid personal involvement in the evaluand (e.g., assessing a class you are teaching)

• Metadata discussion group, Brown Bag discussion group, Twitter, SLIS website, Information Visualization Lab website, Media and Reserve Services in Wells Library, IU CAT, SoE website, Chemistry library website
• How about something else related to real-world problems?
  • For social good
  • Industry and government talks at KDD 2014
• Maybe this will inspire you with a great project to work on

• Utility
  • To ensure that an evaluation will serve the practical information needs of intended users
• Feasibility
  • To ensure that an evaluation will be realistic, prudent, diplomatic, and frugal
• Propriety
  • To ensure that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results (IRB)
• Accuracy
  • To ensure that an evaluation will reveal and convey technically adequate information about the features that determine the worth or merit of the program being evaluated

• Systematic inquiry
• Competence
• Integrity/honesty
• Respect for people
• Responsibilities for general and public welfare

• Output: a one- or two-page overview of the evaluand and findings
  • What is your evaluand?
  • Background and context of your evaluand
  • Description of your evaluand
• Try to be as detailed as possible

• Who asked for this evaluation, and why?
• What are the main evaluation questions?
• Who are the main audiences?

• A. What is (are) the main purpose(s) of the evaluation?
  • To determine the overall quality or value of something (summative evaluation, absolute merit)
    • e.g., decision making, funding allocation decisions, benchmarking products, choosing a tool, etc.
  • To find areas for improvement (formative evaluation, relative merit)
    • To help a new "thing" get started
    • To improve an existing "thing"

• Big picture questions:
• B. What is (are) the big picture question(s) for which we need answers?
  • Absolute merit
    • Do we want to invest in this project?
  • Relative merit
    • How does this project compare with the other options?
    • Ranking (see the sketch below)
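For the relative-merit case, the weighted overall scores from Step 5 can simply be ranked across the competing options. A short sketch with invented numbers:

    # Hypothetical overall scores for three competing options; ranking
    # answers "how does this project compare with the other options?"
    options = {"project A": 3.3, "project B": 4.1, "project C": 2.7}
    ranked = sorted(options.items(), key=lambda kv: kv[1], reverse=True)
    for rank, (name, score) in enumerate(ranked, start=1):
        print(rank, name, score)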

• Your Step 2 output report should answer the following questions:
• Define the evaluation purpose
  • Do you need to demonstrate to someone (or to yourself) the overall quality of something?
  • Or do you need to find areas for improvement?
  • Or do you need to do both?
• Once you answer the above questions, figure out your big picture questions:
  • Is your evaluation related to the absolute merit of your evaluand?
  • Or to the relative merit of your evaluand?