S519: Evaluation of Information Systems. Analyzing data: Rank (Ch9, p171)

Similar presentations
Developing an Outcomes Assessment Plan. Part One: Asking a Meaningful Question OA is not hard science as we are doing it. Data that you collect is only.

Test Development.
Copyright © Allyn & Bacon 2011 Assessing Students and Texts Chapter 4 This multimedia product and its content are protected under copyright law. The following.
Chapter 3 – Data Exploration and Dimension Reduction © Galit Shmueli and Peter Bruce 2008 Data Mining for Business Intelligence Shmueli, Patel & Bruce.
Performance Evaluation
Designing Scoring Rubrics. What is a Rubric? Guidelines by which a product is judged Guidelines by which a product is judged Explain the standards for.
S519: Evaluation of Information Systems Analyzing data: value and importance Ch6+7.
How to infer causation: 8 strategies? How to put them together? S519.
S519: Evaluation of Information Systems Analyzing data: Merit Ch8.
What do you already know about rubrics? What do you want to know?
Analyzing Assessment Data. A process to consider... Student Learning Outcomes identified for program. Courses identified as to where the outcomes will.
Systematic determination of the quality or value of something (Scriven, 1991). What can we evaluate? Projects, programs, or organizations; personnel.
Definitions Performance Appraisal
Copyright © 2006 Pearson Education Canada Inc. Course Arrangement: Nov. 22, Tuesday: Last Class; Nov. 23, Wednesday: Quiz 5; Nov. 25, Friday: Tutorial 5.
SELECTING A DATA COLLECTION METHOD AND DATA SOURCE
S519: Evaluation of Information Systems Understanding Evaluation Ch1+2.
Educational Research by John W. Creswell. Copyright © 2002 by Pearson Education. All rights reserved. Slide 1 Chapter 8 Analyzing and Interpreting Quantitative.
Appraisal Types.
Chapter 1: Introduction to Statistics
Analyzing data: Synthesis
Chapter 8 Measuring Cognitive Knowledge. Cognitive Domain Intellectual abilities ranging from rote memory tasks to the synthesis and evaluation of complex.
Classroom Assessments Checklists, Rating Scales, and Rubrics
Classroom Assessment A Practical Guide for Educators by Craig A
S519: Evaluation of Information Systems
Elizabeth Godfrey. Periodic assessment of results: appropriateness, effectiveness, efficiency, impact, sustainability. Identifies intended and unintended.
S519: Evaluation of Information Systems Analyzing data: Synthesis D-Ch9.
Taking it to Another Level: Increasing Higher Order Thinking Session 3 BHCA PROFESSIONAL DEVELOPMENT SERIES.
Performance Improvement Projects: Validating Process, Tips, and Hints Eric Jackson, MA Research Analyst October 19, 2009.
Access to HE internal moderation and standardisation planning Workshop Session.
Raises, Merit Pay, Bonuses; Personnel Decisions (e.g., promotion, transfer, dismissal); Identification of Training Needs; Research Purposes (e.g., assessing.
Assessment in Education Patricia O’Sullivan Office of Educational Development UAMS.
USING SCIENCE JOURNALS TO GUIDE STUDENT LEARNING Part 1: How to create a student science journal Part 2: How to assess student journals for learning.
Agenda for This Week Wednesday, April 27 AHP Friday, April 29 AHP Monday, May 2 Exam 2.
What is synthesis methodology, and why do we need it? What is synthesis for grading? Quantitative; Qualitative. How to merge all the conclusions.
Chapter(3) Qualitative Risk Analysis. Risk Model.
Performance-Based Assessment HPHE 3150 Dr. Ayers.
An overview of multi-criteria analysis techniques The main role of the techniques is to deal with the difficulties that human decision-makers have been.
Employees are often frustrated about the appraisal process: appraisals are too subjective; possibility of unfair treatment by a supervisor; way to.
META-ANALYSIS, RESEARCH SYNTHESES AND SYSTEMATIC REVIEWS © LOUIS COHEN, LAWRENCE MANION & KEITH MORRISON.
1 The “Perfect” Date Prepared for SSAC by Semra Kilic-Bahi - Colby-Sawyer College, New London, NH © The Washington Center for Improving the Quality of.
Presented By Dr / Said Said Elshama  Distinguish between validity and reliability.  Describe different evidences of validity.  Describe methods of.
Chapter 7 Rewards and Performance Management
Chapter 6: Analyzing and Interpreting Quantitative Data
Criterion-Referenced Testing and Curriculum-Based Assessment EDPI 344.
CONFERENCE EVALUATION DATA ANALYSIS. A credible amount of data has to be collected to allow for a substantial analysis; information collected.
Authentic Assessment Using Rubrics to Evaluate Project-Based Learning Curriculum content created and presented by Dr. Alexandra Leavell Associate Professor.
© 2007 Pearson Education, Inc. Publishing as Pearson Addison-Wesley 1 Product Design Alternative Generation, Evaluation, and Selection.
Adeyl Khan, Faculty, BBA, NSU. Sorting, categorizing, rating, and evaluating a large quantity of ideas: simple checklists to complex weighted scoring.
MATH 110 Sec 14-1 Lecture: Statistics-Organizing and Visualizing Data STATISTICS The study of the collection, analysis, interpretation, presentation and.
Research Problem Students with poor quantitative reasoning skills are less successful in STEM fields.
Part II – Chapters 6 and beyond…. Reliability, Validity, & Grading.
Student Presentations Developing rubrics for assessment Nancy Rees, Barbara White.
Designing Scoring Rubrics
NATA Foundation Student Grants Process
Supplement S7 Supplier Selection.
Analytic Hierarchy Process (AHP)
End of Year Calculus Assignments Name:______________________________
EXCEL BOOKS 14-1 JOB EVALUATION.
Effective Use of Rubrics to Assess Student Learning
Performance Management
DESIGNING RUBRICS Hein van der Watt
Presentation transcript:

S519: Evaluation of Information Systems. Analyzing data: Rank (Ch9, p171)

Synthesizing for "ranking"
What are "ranking" evaluations? Can you think of examples?
How do they differ from "grading" evaluations?

Qualitative and quantitative
Qualitative: Qualitative Weight and Sum (QWS)
Quantitative: Numerical Weight and Sum (NWS)

Numerical Weight and Sum (NWS)
NWS is a quantitative synthesis method for summing an evaluand's performance across multiple criteria. It involves:
Assigning a numerical importance weight and a numerical performance score to each criterion (dimension)
Multiplying each weight by the corresponding performance score
Summing these products
The resulting sum represents the overall merit of the evaluand.
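
A minimal Python sketch of this computation; the criterion names, weights, and scores below are invented for illustration, not taken from the chapter's tables:

```python
# Numerical Weight and Sum (NWS): overall merit = sum over criteria of
# (importance weight * performance score). All values here are illustrative.

def nws(weights, scores):
    """Return the weighted sum of performance scores across criteria."""
    return sum(weights[criterion] * scores[criterion] for criterion in weights)

weights = {"process": 2, "outcomes": 4, "cost": 3}          # numerical importance
evaluand_scores = {"process": 3, "outcomes": 4, "cost": 2}  # performance scores

print(nws(weights, evaluand_scores))  # 2*3 + 4*4 + 3*2 = 28
```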

Numerical Weight and Sum (NWS)
NWS is appropriate when:
there are only a small number of criteria
there is some other mechanism for taking bars into account (why?)
there is a defensible, needs-based strategy for ascribing weights

Training program evaluation
A comparative evaluation of three different interventions for training managers:
a mountain retreat featuring interactive sessions with multiple world-class management gurus
an in-house training and mentoring program run by human resources
a set of videos plus the latest management book from management guru Peter Drucker

Training program evaluation
Needs assessment for this evaluation:
Bear in mind that this is a comparative evaluation: how do you want to compare these programs, and what are their key features?
Identify the dimensions of merit (Process, Outcomes, and Cost)
Decide the importance of each dimension (assign weights to the dimensions, based on needs?)
See Table 9.8

Training program evaluation
Next steps:
Data collection (what were your experiences with data collection for your project?)
Data analysis:
Rate each program's performance on a pre-defined scale (excellent, very good, good, fair, or poor); see Table 9.9 for this example
Convert weights into numbers (see Table 9.10)
Convert ratings into numbers (see Table 9.10)
Synthesis step (how? see Table 9.11, and the sketch below)
How do we interpret Table 9.11?
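
Putting those conversion and synthesis steps together in Python; the rating-to-number scale, weights, and ratings are illustrative assumptions, not the values in Tables 9.8-9.10:

```python
# Convert qualitative ratings to numbers, weight them, and sum (NWS synthesis).
RATING_TO_NUMBER = {"excellent": 5, "very good": 4, "good": 3, "fair": 2, "poor": 1}
WEIGHTS = {"process": 2, "outcomes": 4, "cost": 3}  # numerical importance weights

ratings = {  # qualitative ratings in the style of Table 9.9 (invented)
    "retreat":  {"process": "excellent", "outcomes": "very good", "cost": "poor"},
    "in_house": {"process": "good", "outcomes": "good", "cost": "good"},
    "videos":   {"process": "fair", "outcomes": "fair", "cost": "excellent"},
}

def synthesize(ratings, weights, scale):
    """NWS synthesis step: weighted sum of numeric ratings per program."""
    return {
        program: sum(weights[dim] * scale[rating] for dim, rating in dims.items())
        for program, dims in ratings.items()
    }

totals = synthesize(ratings, WEIGHTS, RATING_TO_NUMBER)
for program, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(program, total)  # the highest total is the overall winner
```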

Exercise
Work it out by hand: convert Table 9.9 into Table 9.10, defining your own numeric values for the importance and rating scales, and find out which program comes out best compared with the others.
If the cost criterion suddenly becomes extremely important, will this change the final result? (A sensitivity-check sketch follows below.)
Work on your own
Then form pairs and discuss
What are the pros and cons of NWS?
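
One way to check that sensitivity question in code, reusing the synthesize(), ratings, and RATING_TO_NUMBER definitions from the previous sketch (the weight values are again illustrative):

```python
# Does making "cost" extremely important change the winner?
for cost_weight in (3, 10):  # original weight vs. "extremely important"
    weights = {"process": 2, "outcomes": 4, "cost": cost_weight}
    totals = synthesize(ratings, weights, RATING_TO_NUMBER)
    winner = max(totals, key=totals.get)
    print(f"cost weight {cost_weight}: {totals} -> winner: {winner}")
```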

Qualitative Weight and Sum (QWS)
QWS is a non-numerical synthesis methodology for summing the performances of an evaluand on multiple criteria to determine overall merit.
It is a ranking method for determining the relative merit of two or more evaluands
It is not suitable for grading
It fits tasks such as personnel selection and product/service/proposal selection

QWS
Step 1: Determine importance in terms of maximum possible value
How? (see Chapter 7, six strategies)
Table 9.12 (compare with Table 9.8)
Step 2: Set bars
A bar is the cut point between acceptable and unacceptable performance on a criterion, such as:
too expensive to afford
too long away from their work
(A bar-filtering sketch follows below.)
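
In code, setting bars amounts to ruling out any evaluand that falls on the wrong side of a cut point before comparison begins. A sketch, with invented cost and time-away data and invented thresholds:

```python
programs = {  # illustrative data for the three training interventions
    "retreat":  {"cost_usd": 50000, "days_away": 7},
    "in_house": {"cost_usd": 12000, "days_away": 0},
    "videos":   {"cost_usd": 1500,  "days_away": 0},
}

BARS = [  # (description, predicate that returns True when the bar is failed)
    ("too expensive to afford",       lambda p: p["cost_usd"] > 30000),
    ("too long away from their work", lambda p: p["days_away"] > 5),
]

def passes_all_bars(profile):
    return all(not fails(profile) for _, fails in BARS)

acceptable = {name: p for name, p in programs.items() if passes_all_bars(p)}
print(sorted(acceptable))  # the retreat fails both bars and drops out
```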

QWS
Step 3: Create value determination rubrics
Rubrics are level-based (see Chapter 8): describe each level, and decide how to deal with the bar
unacceptable → no noticeable value → marginally valuable → valuable → extremely valuable
Describe what performance would look like at each level
Each dimension can have its own rubric, or each group of dimensions can share one
Each group of questions can have its own rubric
The synthesis step can have its own rubric
Example: a rubric for rating the financial cost of training (see Table 9.14, and the sketch below)
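
A rubric like Table 9.14's can be expressed as an ordered scale plus a mapping rule. The cost thresholds here are invented for illustration; the real level descriptions live in the table:

```python
LEVELS = ["unacceptable", "no noticeable value", "marginally valuable",
          "valuable", "extremely valuable"]  # ordered from worst to best

def rate_cost(cost_usd):
    """Map a training cost to a value level (illustrative thresholds only)."""
    if cost_usd > 30000:
        return "unacceptable"        # beyond the bar set in step 2
    if cost_usd > 20000:
        return "no noticeable value"
    if cost_usd > 10000:
        return "marginally valuable"
    if cost_usd > 3000:
        return "valuable"
    return "extremely valuable"

print(rate_cost(12000))  # -> marginally valuable
```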

QWS
Step 4: Check the equivalence of value levels across dimensions
The validity of the QWS method depends heavily on ensuring rough equivalence among the value levels defined for each dimension
For example: do Table 9.14 and Table 9.15 define roughly equivalent value levels?
How? Put them into a matrix; see Table 9.16 (a structural check is sketched below)
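
Equivalence itself is a judgment call, but the structural part (every dimension's rubric uses the same ordered levels, so the Table 9.16 style matrix lines up) can be checked mechanically. The rubric texts here are placeholders:

```python
LEVELS = ["unacceptable", "no noticeable value", "marginally valuable",
          "valuable", "extremely valuable"]

rubrics = {  # dimension -> {level: description}; descriptions are placeholders
    "cost":     {lvl: f"cost at the {lvl} level looks like ..." for lvl in LEVELS},
    "process":  {lvl: f"process at the {lvl} level looks like ..." for lvl in LEVELS},
    "outcomes": {lvl: f"outcomes at the {lvl} level look like ..." for lvl in LEVELS},
}

for dim, rubric in rubrics.items():
    assert list(rubric) == LEVELS, f"rubric for {dim} uses mismatched levels"
print("all rubrics share the same ordered value levels")
```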

QWS
Step 5: Rate the value of actual performance on each dimension
Rate the data in Table 9.9 according to the rubric (Table 9.16); see Table 9.17
Step 6: Tally the number of ratings at each level and look for a clear winner
For each program, how many symbols did it get at each level?
Throw out programs with any unacceptable ratings, then see whether there is a clear winner (a tallying sketch follows below)
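
Steps 5 and 6 in code: tally each program's ratings per level and drop anything rated unacceptable. The ratings are invented, not those of Table 9.17:

```python
from collections import Counter

LEVELS = ["unacceptable", "no noticeable value", "marginally valuable",
          "valuable", "extremely valuable"]

ratings = {  # program -> one value level per dimension (illustrative)
    "retreat":  ["extremely valuable", "valuable", "unacceptable"],
    "in_house": ["valuable", "valuable", "valuable"],
    "videos":   ["marginally valuable", "valuable", "extremely valuable"],
}

# Step 6a: any unacceptable rating knocks a program out entirely.
survivors = {p: r for p, r in ratings.items() if "unacceptable" not in r}

# Step 6b: tally ratings at each level, reading from the best level down.
for program, levels in survivors.items():
    tally = Counter(levels)
    print(program, [(lvl, tally[lvl]) for lvl in reversed(LEVELS) if tally[lvl]])
```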

QWS
Step 7: Refocus
Delete the rows (dimensions) on which the surviving programs score the same (see Table 9.18)
Count how many symbols each of them got on the remaining dimensions
Can we find a clear winner now? Yes or no? Why?
How should we go further? (a refocusing sketch follows below)
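
A sketch of the refocusing step with invented per-dimension ratings: drop the dimensions where the survivors tie, then re-tally what remains:

```python
from collections import Counter

ratings_by_dimension = {  # dimension -> {program: value level} (illustrative)
    "process":  {"in_house": "valuable", "videos": "valuable"},  # tie: drop it
    "outcomes": {"in_house": "valuable", "videos": "marginally valuable"},
    "cost":     {"in_house": "valuable", "videos": "extremely valuable"},
}

# Keep only the dimensions on which the programs actually differ.
informative = {dim: levels for dim, levels in ratings_by_dimension.items()
               if len(set(levels.values())) > 1}

tallies = Counter()
for levels in informative.values():
    for program, level in levels.items():
        tallies[(program, level)] += 1

for (program, level), count in sorted(tallies.items()):
    print(program, level, count)
```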