S519: Evaluation of Information Systems
Analyzing data: Synthesis (D-Ch9)

 What is synthesis methodology, and why do we need it?
 What is synthesis for grading?
   Quantitative
   Qualitative
 How do we merge all the conclusions to get the final grade?

 What are "ranking" evaluations?
 Examples?
 How do they differ from "grading" evaluations?

 Qualitative
   Qualitative weight and sum (QWS)
 Quantitative
   Numerical weight and sum (NWS)

 NWS is a quantitative synthesis method for summing an evaluand's performance across multiple criteria.
 It involves:
   Assigning a numerical importance weight and a numerical performance score to each criterion (dimension)
   Multiplying each weight by its performance score
   Summing these products
 The sum represents the overall merit of the evaluand (see the sketch below).
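A minimal sketch of the NWS computation in Python; the criteria, weights, and scores are hypothetical, not taken from the book:

    # Hypothetical importance weights and performance scores for one evaluand.
    weights = {"process": 3, "outcomes": 5, "cost": 2}
    scores = {"process": 4, "outcomes": 3, "cost": 5}

    # Multiply each weight by its performance score and sum the products;
    # the total represents the overall merit of the evaluand.
    overall_merit = sum(weights[c] * scores[c] for c in weights)
    print(overall_merit)  # 3*4 + 5*3 + 2*5 = 37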

 NWS is appropriate when:
   There are only a small number of criteria
   There is some other mechanism for taking bars into account (why?)
   There is a defensible needs-based strategy for ascribing weights

 Example: a comparative evaluation of three different interventions for training managers:
   A mountain retreat featuring interactive sessions with multiple world-class management gurus
   An in-house training and mentoring program run by human resources
   A set of videos and the latest book on management from management guru Peter Drucker

 Needs assessment for this evaluation
   Bear in mind that this is a comparative evaluation
   How do you want to compare these programs? What are their key features?
 Identify the dimensions of merit (process, outcomes, and cost)
 Decide the importance of each dimension (assigning weights to the dimensions, based on needs?)
 See Table 9.8

 Next steps
   Data collection (what were your experiences with data collection for your project?)
   Data analysis
 Rate each program's performance on the pre-defined ratings: excellent, very good, good, fair, or poor (see Table 9.9 for this example)
 Convert the weights into numbers (see Table 9.10)
 Convert the ratings into numbers (see Table 9.10)
 Synthesis step (how? See Table 9.11, and the sketch below)
 How do we interpret Table 9.11?
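A sketch of these conversion and synthesis steps in Python; the programs, dimensions, weight labels, and ratings below are hypothetical stand-ins for Tables 9.9 through 9.11, not the book's actual values:

    # Map the qualitative labels onto numbers (your own choice of scale).
    weight_scale = {"low": 1, "medium": 2, "high": 3}
    rating_scale = {"poor": 1, "fair": 2, "good": 3, "very good": 4, "excellent": 5}

    weights = {"process": "medium", "outcomes": "high", "cost": "low"}
    ratings = {
        "retreat": {"process": "excellent", "outcomes": "good", "cost": "poor"},
        "in-house": {"process": "good", "outcomes": "very good", "cost": "good"},
        "videos": {"process": "fair", "outcomes": "fair", "cost": "excellent"},
    }

    # Synthesis: a weighted sum per program; the highest total has the
    # greatest overall merit.
    for program, row in ratings.items():
        total = sum(weight_scale[weights[d]] * rating_scale[row[d]] for d in weights)
        print(program, total)  # retreat 20, in-house 21, videos 15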

 Try it by hand:
   Convert Table 9.9 into Table 9.10 (defining your own numeric values for the importance and grading scales) and find out which program beats the others.
   If the cost criterion suddenly becomes extremely important, will that change the final result? (see the sketch below)
 Work on your own
 Form pairs and discuss
 Pros and cons of NWS?
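Continuing the hypothetical sketch above, one way to probe the cost question is to raise the weight on cost and re-run the synthesis to see whether the ranking flips:

    # Cost suddenly becomes extremely important: bump its weight and re-tally.
    weights["cost"] = "high"
    for program, row in ratings.items():
        total = sum(weight_scale[weights[d]] * rating_scale[row[d]] for d in weights)
        print(program, total)  # retreat 22, in-house 27, videos 25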

 QWS is a non-numerical synthesis methodology for summing an evaluand's performance on multiple criteria to determine overall merit.
 It is a ranking method for determining the relative merit of two or more evaluands.
 It is not suitable for grading.
 It is appropriate for:
   Personnel selection; product, service, or proposal selection

 Step 1: Determine importance in terms of maximum possible value
   How? (see Chapter 7, six strategies)
   Table 9.12 (compare with Table 9.8)
 Step 2: Set bars
   A bar is the cut point between acceptable and unacceptable performance on a criterion, such as:
     Too expensive to afford
     Too long away from their work
   (see the sketch below)
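A sketch of applying bars in Python before any synthesis; the bar levels and ratings are invented for illustration:

    # Minimum acceptable rating per dimension; failing any bar eliminates
    # the evaluand no matter how well it performs elsewhere.
    order = ["poor", "fair", "good", "very good", "excellent"]
    bars = {"cost": "fair", "outcomes": "fair"}

    ratings = {
        "retreat": {"process": "excellent", "outcomes": "good", "cost": "poor"},
        "in-house": {"process": "good", "outcomes": "very good", "cost": "good"},
        "videos": {"process": "fair", "outcomes": "fair", "cost": "excellent"},
    }

    def clears_bars(row):
        return all(order.index(row[d]) >= order.index(level)
                   for d, level in bars.items())

    survivors = [p for p, row in ratings.items() if clears_bars(row)]
    print(survivors)  # the retreat drops out: its cost rating is below the bar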

 Step 3: Create value-determination rubrics
   Rubrics are level-based (see Chapter 8)
   Write a description for each level; how do you deal with the bar?
     Unacceptable → no noticeable value → marginally valuable → valuable → extremely valuable
   e.g., what performance would look like at each level
   Each dimension, or each group of dimensions, can have its own rubric
   Each group of questions can have its own rubric
   The synthesis step can have its own rubric
 Example: a rubric for rating the financial cost of training (see Table 9.14 and the sketch below)
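One way to picture such a rubric is as a simple table of level descriptions. The wording below is invented, loosely modeled on the idea of Table 9.14 rather than its actual text:

    # Hypothetical value-determination rubric for the cost dimension.
    cost_rubric = {
        "unacceptable": "cost exceeds the available training budget (the bar)",
        "no noticeable value": "affordable, but no cheaper than typical alternatives",
        "marginally valuable": "modest savings over typical alternatives",
        "valuable": "substantial savings over typical alternatives",
        "extremely valuable": "negligible cost relative to the expected benefit",
    }
    for level, description in cost_rubric.items():
        print(f"{level}: {description}")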

 Step 4: Check the equivalence of value levels across dimensions
   The validity of the QWS method depends heavily on the value levels defined for each dimension being roughly equivalent
   For example, do Table 9.14 and Table 9.15 have roughly equivalent value levels?
   How? Put them into a matrix.
   See Table 9.16 (and the sketch below)
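A sketch of that matrix check: lay two hypothetical rubrics side by side, one row per value level, and eyeball whether the levels mean roughly the same thing across dimensions (the rubric wording is invented):

    levels = ["unacceptable", "marginally valuable", "valuable", "extremely valuable"]
    rubrics = {
        "cost": {
            "unacceptable": "over budget",
            "marginally valuable": "modest savings",
            "valuable": "substantial savings",
            "extremely valuable": "negligible cost",
        },
        "outcomes": {
            "unacceptable": "no skill gain",
            "marginally valuable": "small skill gain",
            "valuable": "clear skill gain",
            "extremely valuable": "transformative gain",
        },
    }
    # One row per value level, one column per dimension.
    for level in levels:
        cells = " | ".join(f"{rubrics[d][level]:20}" for d in rubrics)
        print(f"{level:20} {cells}")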

 Step 5: Rate the value of actual performance on each dimension
   Rate Table 9.9 according to the rubric (Table 9.16)
   See Table 9.17
 Step 6: Tally the number of ratings at each level and look for a clear winner
   How many symbols did each program get?
   Throw out programs with unacceptable ratings; is there a clear winner? (see the sketch below)
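A sketch of the tally in Python, with hypothetical ratings standing in for Table 9.17:

    from collections import Counter

    ratings = {
        "retreat": ["extremely valuable", "valuable", "unacceptable"],
        "in-house": ["valuable", "extremely valuable", "valuable"],
        "videos": ["marginally valuable", "marginally valuable", "valuable"],
    }

    for program, row in ratings.items():
        if "unacceptable" in row:
            # An unacceptable rating means a bar was failed: eliminate.
            print(program, "-> eliminated")
            continue
        print(program, dict(Counter(row)))

A clear winner would have at least as many ratings as every rival at the top level without trailing at the levels below; otherwise, move on to Step 7.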

 Step 7: Refocus
   Delete the rows on which the programs score about the same (see Table 9.18)
   Count how many symbols each program got
   Can we find a clear winner now?
   Yes or no? Why?
   How should we go further? (see the sketch below)
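A sketch of the refocusing step: drop the dimensions on which the surviving programs are rated the same, since they cannot discriminate, then re-tally on what remains. The data are hypothetical:

    ratings = {
        "in-house": {"process": "valuable", "outcomes": "extremely valuable", "cost": "valuable"},
        "videos": {"process": "valuable", "outcomes": "marginally valuable", "cost": "extremely valuable"},
    }
    dimensions = list(next(iter(ratings.values())))

    # Keep only the dimensions on which the programs differ.
    decisive = [d for d in dimensions
                if len({row[d] for row in ratings.values()}) > 1]
    print(decisive)  # ['outcomes', 'cost']: 'process' is a tie and drops out

    for program, row in ratings.items():
        print(program, {d: row[d] for d in decisive})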