S519: Evaluation of Information Systems Understanding Evaluation Ch1+2.

Definition of Evaluation
Systematic determination of the quality or value of something (Scriven, 1991).
What can we evaluate?
- Projects, programs, or organizations
- Personnel or performance
- Policies or strategies
- Products or services
- Processes or systems
- Proposals, contract bids, or job applications
- Lessons learned
Evaluation methods are transdisciplinary.

Terminology (Davidson, Glossary)
Evaluand: that which is being evaluated (e.g. a program, policy, project, product, service, or organization).
In personnel evaluation the term is "evaluee".

Issues of Evaluation
Evaluation is used to:
- Find areas for improvement
- Generate an assessment of overall quality
- Answer the question of "merit" or "worth" (Scriven, 1991)
Merit is the "intrinsic" value of something, i.e. its "quality".
Worth is the value of something to an individual, an organization, or an institution (contextualized merit), i.e. its "value".

Choosing the right group
Accountability evaluation: it is important to conduct an independent evaluation, i.e. nobody on the evaluation team should have a significant vested interest in whether the results are good or bad.
Organizational learning capability evaluation: it can (or even should) be a dependent evaluation, i.e. organizational staff, consultants, managers, customers, trainers, trainees, etc. can join.

The steps involved (D-p4)
Step 1: understanding the basics of evaluation (ch1)
Step 2: defining the main purposes of the evaluation and the "big picture" questions that need answers (ch2)
Step 3: identifying the evaluative criteria (ch3)
Step 4: organizing the list of criteria and choosing sources of evidence (collecting data) (ch4)

The steps involved (D-p4)
Step 5: analyzing data
- Dealing with the causation issue (what causes what, and why), to avoid "subjectivity" (ch5+6)
- Importance weighting: weight the results (ch7)
- Merit determination: how well your evaluand has done on the criteria (good? unacceptable?) (ch8)
- Synthesis methodology: systematic methods for condensing evaluation findings (ch9)
- Statistical analysis: Salkind (2007)
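The weighting-and-synthesis part of Step 5 can be sketched in a few lines of code. This is only an illustration: the criterion names, weights, ratings, and the 0-4 scale below are made up, and the "bar" rule (any criterion at or below a minimum makes the evaluand unacceptable regardless of its average) is one common synthesis rule, not the only one.

```python
# Illustrative sketch of importance weighting and synthesis (Step 5).
# All criteria, weights, and ratings here are hypothetical examples.

def weighted_merit(ratings, weights):
    """Combine per-criterion ratings (0-4 scale) into one weighted score."""
    total_weight = sum(weights.values())
    return sum(ratings[c] * weights[c] for c in ratings) / total_weight

def overall_grade(ratings, weights, bar=1):
    """Apply a 'bar': any criterion rated at or below the bar makes the
    evaluand unacceptable, no matter how good the weighted average is."""
    if any(r <= bar for r in ratings.values()):
        return "unacceptable"
    return weighted_merit(ratings, weights)

ratings = {"usability": 3, "content quality": 4, "cost-effectiveness": 2}
weights = {"usability": 2, "content quality": 3, "cost-effectiveness": 1}

print(round(weighted_merit(ratings, weights), 2))  # -> 3.33
```

The point of the bar is that weighted averages can hide a fatal weakness: a high score on one criterion should not be allowed to compensate for an unacceptable score on another.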

The steps involved (D-p4)
Step 6: results
- Putting it all together: fitting the pieces into the KEC framework (ch10)
Step 7: feedback
- Meta-evaluation: how to figure out whether your evaluation is any good (ch11)

The Key Evaluation Checklist (Davidson, 2005, pp. 6-7)
I. Executive Summary
II. Preface
III. Methodology
1. Background & Context
2. Descriptions & Definitions
3. Consumers
4. Resources
5. Values
6. Process Evaluation
7. Outcome Evaluation
8 & 9. Comparative Cost-Effectiveness
10. Exportability
11. Overall Significance
12. Recommendations & Explanations
13. Responsibilities
14. Reporting & Follow-up
15. Meta-evaluation

Step 1: Understand the basics of evaluation
Identify the evaluand.
Background and context of the evaluand: why did this program or product come into existence in the first place?
Descriptions and definitions: describe the evaluand in enough detail that virtually anyone can understand what it is and what it does.
How: collect background information, pay a firsthand visit, or do a literature review.

Are you ready for your first evaluation project? Some tips before you start:
- Make sure that your evaluand is not difficult to access (geolocation, inanimate objects).
- Make your evaluand a clearly defined group (avoid abstract and complex systems).
- Avoid political ramifications (e.g. assessing your boss's pet project or the university administration).
- Avoid being involved in the evaluand yourself (e.g. assessing a class that you are teaching).

Previous Projects
- Metadata discussion group
- Brown Bag discussion group
- Twitter
- SLIS website
- Information Visualization Lab website
- Media and Reserve Services in Wells Library
- IUCAT
- SoE website
- Chemistry library website

Step 1: Output report
Output: a one- or two-page overview of the evaluand and findings.
- What is your evaluand?
- Background and context of your evaluand
- Description of your evaluand
Try to be as detailed as possible.

Step 2: Defining the Purpose of the Evaluation (D-Ch2)
- Who asked for this evaluation, and why?
- What are the main evaluation questions?
- Who is the main audience?

Evaluation purposes
A. What is (are) the main purpose(s) of the evaluation?
To determine the overall quality or value of something (summative evaluation, absolute merit):
- i.e. decision making, funding allocation decisions, benchmarking products, using a tool, etc.
To find areas for improvement (formative evaluation, relative merit):
- To help a new "thing" get started
- To improve an existing "thing"

Big picture questions
B. What is (are) the big picture question(s) for which we need answers?
Absolute merit:
- Do we want to invest in this project?
Relative merit:
- How does this project compare with the other options? (ranking)
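The distinction between absolute and relative merit can be made concrete with a small sketch: absolute merit grades each option against a fixed standard, while relative merit only ranks the options against each other. The option names, scores, and grading cutoffs below are hypothetical, chosen purely for illustration.

```python
# Illustrative sketch: absolute vs. relative merit.
# Option names, scores, and cutoffs are made-up examples.
scores = {"Option A": 3.4, "Option B": 2.9, "Option C": 3.7}

def grade(score):
    """Absolute merit: grade against a fixed standard (illustrative cutoffs)."""
    return "good" if score >= 3.5 else "acceptable" if score >= 2.5 else "poor"

# Absolute merit: each option judged on its own.
absolute = {name: grade(s) for name, s in scores.items()}

# Relative merit: options ranked against one another.
relative = sorted(scores, key=scores.get, reverse=True)

print(absolute["Option C"])  # -> good
print(relative)              # -> ['Option C', 'Option A', 'Option B']
```

Note that the two questions can give different answers: every option could grade "poor" in absolute terms even though one of them still ranks first.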

Step 2: Output report
Your step 2 output report should answer the following questions:
Define the evaluation purpose:
- Do you need to demonstrate to someone (or to yourself) the overall quality of something?
- Or do you need to find areas for improvement?
- Or both?
Once you have answered the above, figure out your big picture questions:
- Is your evaluation about the absolute merit of your evaluand, or the relative merit?