S519: Evaluation of Information Systems Analyzing data: Synthesis D-Ch9.


Synthesis methodology Synthesis methodology is a tool that allows us to draw overall evaluative conclusions from multiple findings about a single evaluand. Synthesis is the process of combining a set of ratings or performances on several components or dimensions into an overall rating.

Synthesis methodology Merit determination: develop the rubrics, then use them to summarize the multiple findings. Rubrics are one of the simplest methods for blending data, but when the data are more complex, a rubric alone is not enough: data are not equally important or reliable; there are multiple dimensions or components; and there are different nuances and combinations (such as Table 8.3).

What synthesis is not It is not meta-analysis, a statistical technique that gives a weighted average of effect sizes across multiple quantitative studies. It is not a literature review or a summary, which is a judgment from a reviewer's point of view.

Keep in mind Doing poorly on some minimally important criteria and doing poorly on some crucial criteria are very different!

Evaluation Synthesis for "ranking": if it is a "ranking" (relative) evaluation, consider each alternative and make explicit comparisons. Synthesis for "grading": if it is a "grading" (absolute) evaluation, consider the different context settings and provide a better interpretation of merit.

Qualitative or quantitative Quantitative synthesis uses numerical weights; qualitative synthesis uses qualitative labels.

Synthesis for "grading" The primary evaluation question concerns absolute quality or value: How well did the evaluand perform on this dimension or component? How effective, valuable, or meritorious is the evaluand overall? Is it worth the resources put into it?

Quantitative weighting example with "bars" Case: personnel evaluation in a small accounting firm. There are 13 defined tasks (e.g., telephone, reception, data entry), and each employee has responsibility for 4-6 of them. Evaluation: importance weighting (through a vote by the selected stakeholders), in-depth discussion with the business owners, and derivation of the importance metric and bars.

Quantitative weighting example with "bars" Evaluation Define the levels of importance: 3 to 5 levels work well in most cases. Do not go to too many levels (why? Is this useful?). For example, task importance:  1. minor task (1)  2. normal-priority task (2)  3. high-priority task (3)  4. extremely high-priority task (4)

Quantitative weighting example with "bars" Evaluation Set up rubrics for each of the 13 tasks; normally 4-6 levels are sufficient. Example performance rubric:  1. Totally unacceptable performance (1)  2. Mediocre (substandard) performance (2)  3. Good performance (expected level) (3)  4. Performance that exceeded expectations (4)  5. All-around excellent performance (5) Synthesis: draw the overall conclusion. See Exhibit 9.2 (p158)

Exercise Personnel evaluation in a small accounting firm

Task                 Importance  Score for Alice
Telephone            1           2
Data entry           2           3
Tax data management  4           1
Client support       4           5
Reporting            3           3
Communicating        1           3

How does Alice rate according to Exhibit 9.2? Lab

Exercise Personnel evaluation in a small accounting firm

Task                 Importance  Score for John
Telephone            1           2
Data entry           2           3
Tax data management  4           2
Client support       4           5
Reporting            3           3
Communicating        1           3

How does John rate according to Exhibit 9.2? Lab

Exercise Personnel evaluation in a small accounting firm

Task                 Importance  Score for Chris
Telephone            1           2
Data entry           2           3
Tax data management  4           4
Client support       4           5
Reporting            3           3
Communicating        1           3

How does Chris rate according to Exhibit 9.2? Lab

Exercise How about Chris? Weighted mean = (1×2 + 2×3 + 4×4 + 4×5 + 3×3 + 1×3) / (1+2+4+4+3+1) = 56/15 ≈ 3.73 What is Chris's performance?
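The weighted-mean arithmetic can be checked in code. A minimal sketch: the importance weights and scores come from the exercise tables above, while the `weighted_mean` helper is an illustrative function, not something from the book.

```python
def weighted_mean(tasks):
    """tasks: list of (importance, score) pairs.
    Weighted mean = sum(importance * score) / sum(importance)."""
    return sum(imp * s for imp, s in tasks) / sum(imp for imp, _ in tasks)

# Importance weights are shared across the three exercises; only the
# tax-data-management score differs (Alice 1, John 2, Chris 4).
alice = [(1, 2), (2, 3), (4, 1), (4, 5), (3, 3), (1, 3)]
john  = [(1, 2), (2, 3), (4, 2), (4, 5), (3, 3), (1, 3)]
chris = [(1, 2), (2, 3), (4, 4), (4, 5), (3, 3), (1, 3)]

print(round(weighted_mean(chris), 2))  # 56/15 -> 3.73
```

Note that Alice's mean (44/15 ≈ 2.93) is dragged down by the score of 1 on the high-importance tax task, which is exactly the kind of case where a bar (Exhibit 9.2) can override the weighted mean.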

Qualitative weighting example 1 (with no "bars") Case: a school-based health program evaluation. The program contains 9 different components: nutrition education, mental health services, safer sex, legal services, and others. The question: how to evaluate, on a low budget and within a short period of time, whether these components are meeting important needs of the students and their families. Evaluation methods: interviews and student surveys.

School health system evaluation Survey question design: two quantitative questions: How useful was the program to you? (4-point response scale: not at all useful, somewhat useful, useful, very useful) How satisfied were you with the program? One qualitative (open-ended) question: What other changes or events, good or bad, have happened to you or someone you know because of receiving the service?

School health system evaluation Survey results about the nutrition component are shown in Table 9.1. Looking at Table 9.1, think about: how can you draw a conclusion about the nutrition component from these results? Is it good or bad?

School health system evaluation Set the importance of the three questions (1 = strongest data, 3 = weakest data): 1. Ratings of usefulness (directly related to needs) 2. Responses to the open-ended question 3. Satisfaction ratings Create rubrics for each question: Table 9.2 for question 1 and question 2, Table 9.3 for the open-ended question.

School health system evaluation How to grade the nutrition component based on the two quantitative questions: from Table 9.1, derive the rubric shown in Table 9.2. Why were the cut-points of 90% and 70%-90% selected? How do you derive Table 9.2 from Table 9.1 and the collected data?
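A rubric like Table 9.2 is essentially a threshold map from a response percentage to a merit label. A hedged sketch: the 90% and 70%-90% cut-points are the ones mentioned above, but the merit labels and the lower cut-points are illustrative assumptions, not the book's actual table.

```python
def merit_from_percentage(pct):
    """Map the percentage of 'useful'/'very useful' responses to a merit
    label. The 90 and 70 cut-points follow the slide; the labels and the
    50 cut-point are illustrative assumptions."""
    if pct >= 90:
        return "excellent"
    if pct >= 70:
        return "very good"
    if pct >= 50:
        return "good"
    return "marginal"

print(merit_from_percentage(92))  # excellent
```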

School health system evaluation Table 9.3: rubric for converting the qualitative data (open-ended responses) into merit ratings. Is that a good way to do this? Are you happy with this table? If not, how would you improve it?

School health system evaluation Synthesis to draw the overall conclusion, step by step: start with the strongest data (question 1), blend in the open-ended comments, and finally take the satisfaction ratings into account. See Table 9.4 for the whole process.
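The step-by-step blending can be sketched as an ordered adjustment, strongest data first. This is an illustrative reading of the process, not Table 9.4 itself; the rating scale and the nudge rules are assumptions.

```python
LEVELS = ["poor", "marginal", "good", "very good", "excellent"]

def blend(usefulness, open_ended_nudge, satisfaction_ok):
    """Start from the usefulness rating (strongest data), shift it by the
    open-ended evidence (e.g. +1 for strongly positive comments, -1 for
    negative ones), and let a clearly poor satisfaction rating (the
    weakest data) knock the result down one final step."""
    i = LEVELS.index(usefulness)
    i = max(0, min(len(LEVELS) - 1, i + open_ended_nudge))
    if not satisfaction_ok and i > 0:
        i -= 1
    return LEVELS[i]

print(blend("good", +1, True))  # very good
```

The design choice here mirrors the slide's ordering: weaker data are allowed to adjust, but never to outweigh, the conclusion suggested by the strongest data.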

School health system evaluation How to draw the final conclusion? The usefulness ratings, satisfaction ratings, and open-ended comments together yield the final conclusion: the merit of the nutrition program (see Table 9.4). Discuss how to apply this to your group project: use the quantitative ratings to draw the suggested results, and use the qualitative responses to find the positive or negative facts with which to re-adjust those results.

Qualitative (nonnumerical) weighting example 2 Bar: a minimum level of performance on a specific dimension; performance below this cannot be compensated for by much better performance on other dimensions (see Exhibit 9.2). Hard hurdle (also referred to as a global bar): an overall passing requirement for the evaluand as a whole (see Exhibit 9.2). Soft hurdle: an overall requirement for entry into a high rating category; it places a limit on the maximum rating (e.g., "I want all As in my classes").

Qualitative (nonnumerical) weighting example 2 Case: evaluation of the learning capacity of a small biotechnology start-up company, "Biosleep". Evaluation: 27 subdimensions of organizational learning capacity (see Table 9.5). Data collection: survey and interview. Rubric: similar to Table 8.2. Importance is established using strategy 6 in Chapter 7: using program theory and evidence of causal linkages (p )

Biosleep Evaluation Synthesis: pack the ratings on the subdimensions into 8 main dimensions, then combine the ratings on these 8 main dimensions to draw an overall conclusion.

Biosleep Dimension by dimension, layer by layer: Subdimension 1 and Subdimension 2 roll up into Dimension 1; Subdimension 3 and Subdimension 4 roll up into Dimension 2; the dimensions then roll up into the overall rating.

Biosleep Synthesis Subdimensions to dimensions: use Table 9.6 to draw conclusions about the dimensions from the subdimensions; use Table 9.6 to judge Table 9.5 and arrive at the result shown in Exhibit 9.4. Dimensions to overall evaluation: based on Table 9.7 (created from a literature review). What is your conclusion for the evaluation of Biosleep? And why?
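The layered roll-up (subdimensions to dimensions to overall) can be sketched generically. The combining rule used here, round the mean but never rate a parent more than one step above its worst child, is an illustrative assumption standing in for Tables 9.6 and 9.7, and the four-level scale is likewise assumed.

```python
SCALE = {"poor": 1, "adequate": 2, "good": 3, "excellent": 4}
NAMES = {v: k for k, v in SCALE.items()}

def roll_up(ratings):
    """Combine child ratings into one parent rating: round the mean, but
    never rate the parent more than one step above its worst child."""
    values = [SCALE[r] for r in ratings]
    mean = round(sum(values) / len(values))
    return NAMES[min(mean, min(values) + 1)]

def overall(dimensions):
    """dimensions: dict mapping dimension name -> list of subdimension
    ratings. Roll each dimension up, then roll the dimensions up."""
    return roll_up([roll_up(subs) for subs in dimensions.values()])

print(overall({"d1": ["good", "excellent"], "d2": ["good", "good"]}))
```

The "one step above the worst child" clause is a soft-hurdle-style safeguard: a dimension with one weak subdimension cannot be averaged away into an excellent rating.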

Exercise Form your project group. Discuss how you are going to grade your evaluation: Which example would you like to follow? How will you develop rubrics for each dimension and for the overall rating? Lab