S519: Evaluation of Information Systems. Analyzing data: Rank (Ch. 9, p. 171)

Last week
- What is synthesis methodology, and why do we need it?
- What is synthesis for grading?
- What is the rubric for dimensions?
- What is the rubric for components?
- How do we merge all the conclusions to get the final grade?

Synthesizing for "ranking"
- What are "ranking" evaluations? Examples?
- How do they differ from "grading" evaluations?

Qualitative and quantitative approaches
- Qualitative: Qualitative Weight and Sum (QWS)
- Quantitative: Numerical Weight and Sum (NWS)

Numerical Weight and Sum (NWS)
A quantitative synthesis method for summing evaluand performance across multiple criteria:
- assign a numerical importance weight and a numerical performance score to each criterion (dimension)
- multiply each weight by its performance score
- sum these products
- the resulting sum represents the overall merit of the evaluand
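The arithmetic is a simple weighted sum. A minimal sketch; the criterion names and numbers below are illustrative placeholders, not values from the chapter:

```python
def nws_score(weights, scores):
    """Numerical Weight and Sum: multiply each criterion's importance
    weight by its performance score, then sum the products."""
    return sum(weights[c] * scores[c] for c in weights)

# Hypothetical example: three dimensions of merit, weights and scores on 1-5.
weights = {"process": 3, "outcomes": 5, "cost": 4}
scores = {"process": 4, "outcomes": 3, "cost": 5}
print(nws_score(weights, scores))  # 3*4 + 5*3 + 4*5 = 47
```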

Numerical Weight and Sum (NWS)
NWS fits when:
- there are only a small number of criteria (why?)
- there is some other mechanism for taking bars into account (why?)
- there is a defensible, needs-based strategy for ascribing weights

Training program evaluation
A comparative evaluation of three different interventions for training managers:
- a mountain retreat featuring interactive sessions with multiple world-class management gurus
- an in-house training and mentoring program run by human resources
- a set of videos and the latest management book from management guru Peter Drucker

Training program evaluation
Needs assessment for this evaluation:
- Bear in mind that this is a comparative evaluation: how do you want to compare these programs, and what are their key features?
- Identify the dimensions of merit (Process, Outcomes, and Cost)
- Decide the importance of each dimension (assign weights to the dimensions, based on needs)
- See Table 9.8

Training program evaluation
Next steps:
- Data collection (what are your experiences with data collection for your project?)
- Data analysis:
  - rate each program's performance on predefined ratings: excellent, very good, good, fair, or poor (see Table 9.9 for this example)
  - convert weights into numbers (see Table 9.10)
  - convert ratings into numbers (see Table 9.10)
  - synthesis step (how? see Table 9.11)
  - how do we interpret Table 9.11? (the whole pipeline is sketched below)
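Putting these steps together in one place, a hedged sketch: the rating-to-number mapping, weights, and program ratings here are invented stand-ins for Tables 9.9-9.11, not the book's values.

```python
# Hypothetical rating-to-number mapping; Table 9.10 defines the book's
# own scale, so treat these numbers as placeholders.
RATING_TO_NUMBER = {"excellent": 5, "very good": 4, "good": 3, "fair": 2, "poor": 1}

weights = {"process": 3, "outcomes": 5, "cost": 4}  # from the needs assessment

# Hypothetical qualitative ratings standing in for Table 9.9.
ratings = {
    "retreat":  {"process": "excellent", "outcomes": "good",      "cost": "poor"},
    "in-house": {"process": "good",      "outcomes": "very good", "cost": "good"},
    "videos":   {"process": "fair",      "outcomes": "fair",      "cost": "excellent"},
}

def synthesize(weights, program_ratings):
    """NWS synthesis step: convert ratings to numbers, weight, and sum."""
    return sum(weights[d] * RATING_TO_NUMBER[r] for d, r in program_ratings.items())

totals = {name: synthesize(weights, r) for name, r in ratings.items()}
for name, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(name, total)  # the highest total is the provisional overall merit
```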

Exercise
Do it by hand: convert Table 9.9 to Table 9.10 (defining your own numeric values for the importance and grading scales) and find out which program is best compared with the others.
- If the cost criterion suddenly becomes extremely important, will this change the final result? (a quick way to check this is sketched below)
- Work on your own
- Form pairs and discuss
- What are the pros and cons of NWS?
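Continuing the sketch above (it reuses the hypothetical `synthesize`, `weights`, and `ratings` defined there), the sensitivity question can be checked by raising the cost weight and re-ranking:

```python
heavy_cost = dict(weights, cost=10)  # cost now dominates the other weights
new_totals = {name: synthesize(heavy_cost, r) for name, r in ratings.items()}
print(max(new_totals, key=new_totals.get))  # the cheapest option may now win
```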

Qualitative Weight and Sum (QWS)
- A non-numerical synthesis methodology for summing an evaluand's performance on multiple criteria to determine overall merit
- A ranking method for determining the relative merit of two or more evaluands
- Not suitable for grading
- Fits personnel selection and product/service/proposal selection

QWS
Step 1: Determine importance in terms of maximum possible value
- How? (see Chapter 7, six strategies)
- Table 9.12 (compare with Table 9.8)
Step 2: Set bars
- A bar is the cut point between acceptable and unacceptable performance on a criterion, for example:
  - too expensive to afford
  - too long away from work
- (a bar-checking sketch follows below)
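Bars act as hard filters applied before any summing: an evaluand that fails a bar is eliminated no matter how well it performs elsewhere. A minimal sketch, assuming simple numeric bars; the thresholds and fields are illustrative:

```python
# Hypothetical bars for the training-program example.
MAX_COST = 50_000   # "too expensive to afford"
MAX_DAYS_AWAY = 5   # "too long away from work"

programs = {
    "retreat":  {"cost": 80_000, "days_away": 7},
    "in-house": {"cost": 30_000, "days_away": 0},
    "videos":   {"cost": 5_000,  "days_away": 0},
}

def passes_bars(p):
    """True only if the program clears every bar."""
    return p["cost"] <= MAX_COST and p["days_away"] <= MAX_DAYS_AWAY

survivors = {name for name, p in programs.items() if passes_bars(p)}
print(survivors)  # the retreat fails both bars in this made-up data
```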

QWS
Step 3: Create value determination rubrics
- Rubrics are level-based (see Chapter 8), with a description of each level and of how to deal with the bar:
  - unacceptable
  - no noticeable value
  - marginally valuable
  - valuable
  - extremely valuable
- Describe what performance would look like at each level
- Each dimension, or each group of dimensions, can have its own rubric
- Each group of questions can have its own rubric
- The synthesis step can have its own rubric
- Example: rubric for rating the financial cost of training (see Table 9.14)
- How do we set up a similar rubric for Table 9.12 on extremely valuable criteria? (see Table 9.15)
- (one way to encode these ordered levels is sketched below)
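These levels form an ordered scale that is compared but never summed numerically. One way to encode that ordering in code (the level names are taken from the slide; the integer values only express order, which is an assumption of this sketch):

```python
from enum import IntEnum

class ValueLevel(IntEnum):
    """Ordinal QWS value levels; the integers encode order only,
    not magnitude, so they are never weighted, multiplied, or summed."""
    UNACCEPTABLE = 0
    NO_NOTICEABLE_VALUE = 1
    MARGINALLY_VALUABLE = 2
    VALUABLE = 3
    EXTREMELY_VALUABLE = 4

assert ValueLevel.VALUABLE > ValueLevel.MARGINALLY_VALUABLE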

QWS
Step 4: Check equivalence of value levels across dimensions
- The validity of QWS depends heavily on the value levels defined for each dimension being roughly equivalent
- For example: do Table 9.14 and Table 9.15 have roughly equivalent value levels?
- How do we check? Put them into a matrix (see Table 9.16)

QWS
Step 5: Rate the value of actual performance on each dimension
- Rate Table 9.9 according to the rubric (Table 9.16); see Table 9.17
Step 6: Tally the number of ratings at each level and look for a clear winner
- For each program, how many symbols did it get at each level?
- Throw out programs with any unacceptable rating, then see whether there is a clear winner (a tallying sketch follows below)
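The tally is just counting, then comparing counts from the most valuable level down. The book looks for a "clear winner" by inspection; comparing tallies lexicographically, as below, is one simple way to do the same check. The ratings are invented placeholders standing in for Table 9.17:

```python
from collections import Counter

LEVELS = ["extremely valuable", "valuable", "marginally valuable",
          "no noticeable value", "unacceptable"]

# Hypothetical per-dimension ratings (stand-in for Table 9.17).
qws_ratings = {
    "in-house": ["extremely valuable", "valuable", "valuable", "marginally valuable"],
    "videos":   ["valuable", "valuable", "marginally valuable", "no noticeable value"],
}

def tally(ratings):
    """Count ratings at each level, ordered from most to least valuable."""
    counts = Counter(ratings)
    return [counts[level] for level in LEVELS]

# Throw out programs with any unacceptable rating, then compare the
# survivors' tallies level by level to spot a clear winner, if there is one.
survivors = {p: r for p, r in qws_ratings.items() if "unacceptable" not in r}
winner = max(survivors, key=lambda p: tally(survivors[p]))
print(winner, {p: tally(r) for p, r in survivors.items()})
```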

QWS
Step 7: Refocus
- Delete the rows (dimensions) on which the remaining programs score the same (see Table 9.18)
- Count how many symbols each program has now
- Can we find a clear winner? Yes or no? Why?
- How should we go further?

Exercise
Form a group to work on this. A library wants to subscribe to one of three journals and asks you to conduct an evaluation and propose the best choice:
- Journal of the American Society for Information Science and Technology
- Journal of Information Science
- Scientometrics
(or choose some journals you are familiar with)
- List the criteria you think are important
- Follow the steps of NWS and QWS, report your findings, and justify them