Student's Assessment
Dr. A. M. Kadri, Associate Professor, PSM Dept., PDU Govt. Medical College, Rajkot


“Students read, not to learn but to pass the examination. They pass the examination but they do not learn.” - Huxley

What is assessment? Any systematic method of obtaining evidence (from tests, examinations, questionnaires, surveys and collateral sources) used to draw inferences about the competence of students for a specific purpose.

Evaluation: Evaluation is a judgment regarding the quality or worth of the assessment results. This judgment is based upon multiple sources of assessment information.

Qualitative and quantitative measurement of students' behaviour + value judgment = evaluation.

“For no matter how appealing the statement of goals, how logical the programme organization, how dazzling the teaching method, it was the examination that communicated most vividly to students what was expected of them.” - George E. Miller

Your experiences with assessment

Critical questions in assessment:
1. WHY are we doing the assessment?
2. WHAT are we assessing?
3. HOW are we assessing it?
4. HOW WELL is the assessment working?

1. WHY are we doing the assessment?

Purpose of assessment:
- To determine whether previously set learning objectives have been met
- To support student learning
- To certify and judge competence
- To develop and evaluate the teaching programme
- To understand the learning process
- To predict future performance

Purpose of assessment: to define the minimum accepted level of competence.
- Prove (that he/she is a competent doctor) - SUMMATIVE
- Improve (provide feedback regarding shortcomings) - FORMATIVE

Purpose of assessment:
- Selection of a few students from a large number of candidates
- Pre-assessment of a learner's needs
- Continuous monitoring of learning activities to give feedback
- Certifying competence on completion of a course

2. WHAT are we testing? Elements of competence:
- Knowledge: factual; applied (clinical reasoning)
- Skills: communication; clinical
- Attitudes: professional behaviour

3. HOW are we doing the assessment?
- Essays
- Short answer questions
- Objective items (supply and selection type)
- Simulated patient management problems
- Assignments
- Practicals
- Clinicals
- OSPE
- OSCE
- Rating scales
- Checklists
- Questionnaires
- Diary and logbook

What can we assess? Using a colonoscopy to diagnose a broken leg? Totally absurd. The PURPOSE determines the method chosen! Without clarifying the purpose, the procedure is ineffective.

What to assess?
Domain: Cognitive (knowledge)
Methods: written tests; oral
Instruments: open-ended or essay questions; structured essay or MEQ; short answer questions; objective MCQs; simulated patient management problems; assignments; oral questions
Domain: Psychomotor (skills)
Method: observation
Instruments: practicals on actual patients and models; clinical cases; objective structured clinical/practical examination

What to assess?
Domain: Affective (attitude)
Method: observation
Instruments: rating scales; checklists; questionnaires; logbook; daily evaluation sheets

What can we assess? Test formats across Miller's pyramid (Knows, Knows how, Shows how, Does):
- Knows: factual tests - SBAs, essays, SAQs
- Knows how: (clinical) context-based tests - SBAs, EMQs, SAQs
- Shows how: performance assessment in vitro - OSCEs, OSPEs
- Does: performance assessment in vivo - mini-CEX, DOPS

4. HOW WELL is the assessment working? Evaluating assessment systems: Is it valid? Is it reliable? Is it doing what it is supposed to do? To answer these questions, we must consider the characteristics of assessment instruments.

Characteristics of assessment:
- Relevance: Is it appropriate to the needs of the society or system?
- Validity: Does the assessment tool really test what it intends to test?
- Reliability: accuracy and consistency
- Objectivity: Would the candidate obtain the same score if evaluated by two or more independent experts?
- Feasibility: Can the process be implemented in practice?

RELEVANCE
- Relevance refers to the appropriateness of the evaluation process with reference to the jobs the student will perform after qualification; it should therefore reflect the health needs of the society.
- The relevance of the process should be obvious to both teachers and students for the test to be taken seriously and for the results to reflect levels of achievement.

VALIDITY: the degree to which a test measures what it intends to measure.
- In choosing an instrument, the first question the teacher should consider is the learning outcome to be measured.
- Validity refers both to the results of the test and to the instrument itself.

Factors influencing validity of an evaluation tool - test factors:
- Unclear directions
- Difficult or ambiguous wording of questions
- Poorly constructed items
- Inappropriate level of difficulty
- Questions inappropriate for the outcome being measured
- Inappropriate arrangement of items
- Identifiable patterns of answers and clues
- Too short or too long a test
- Errors in scoring
- Adverse classroom and environmental factors

Factors influencing validity of an evaluation tool - student factors:
- Adoption of unfair means
- Emotional disturbance
- Lack of motivation

RELIABILITY: the consistency with which an instrument measures a variable.
- Reliability is a measure of the reproducibility of the test.
- Reliability is a mathematical concept: a measure of the correlation between two sets of scores. To obtain two sets of scores, one of three alternatives is available:
  a. Test-retest: the same test is administered to the same students on two occasions and the two sets of scores are compared.
  b. Equivalent tests: two tests of equivalent form are administered to the students to obtain two sets of scores.
  c. Split-half method: a single test is split into two halves (for example, odd- and even-numbered MCQs) and the two sets of scores for each student are compared.
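To make the split-half method concrete, here is a minimal sketch (not from the slides; the data and function name are hypothetical): scores on odd- and even-numbered items are correlated, and the Spearman-Brown correction then estimates the reliability of the full-length test.

```python
# Split-half reliability with Spearman-Brown correction.
# Hypothetical data: 5 students x 6 MCQ items, each scored 0/1.
from statistics import correlation  # Python 3.10+

def split_half_reliability(item_scores):
    """Estimate full-test reliability from odd/even half-test scores."""
    odd_totals = [sum(student[0::2]) for student in item_scores]   # items 1, 3, 5, ...
    even_totals = [sum(student[1::2]) for student in item_scores]  # items 2, 4, 6, ...
    r_half = correlation(odd_totals, even_totals)  # Pearson r between the halves
    return 2 * r_half / (1 + r_half)               # Spearman-Brown correction

scores = [
    [1, 1, 0, 1, 1, 0],
    [1, 0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0, 1],
    [1, 1, 0, 1, 1, 1],
]
print(f"Estimated reliability: {split_half_reliability(scores):.2f}")
```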

Measures to improve reliability:
- Increase the length of the test to the optimum level.
- Use appropriate levels of difficulty and discrimination to ensure a wide spread of scores.
- Keep the conditions of the test constant.
- Ensure objectivity of scoring.
- Ensure validity of the instrument used.
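The first measure can be made precise with the Spearman-Brown prophecy formula (standard psychometrics, not stated on the slide): if a test with reliability $r$ is lengthened by a factor $k$ using comparable items, the predicted reliability is

$$ r_k = \frac{k\,r}{1 + (k - 1)\,r} $$

For example, doubling ($k = 2$) a test with $r = 0.6$ predicts $r_2 = 1.2 / 1.6 = 0.75$; the gain per added item diminishes, which is why the slide says "optimum" rather than "maximum" length.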

OBJECTIVITY: the degree of agreement between the judgments of independent and competent examiners. The objectivity of the evaluation process should be maintained. Steps to increase the objectivity of scoring in conventional examinations:
- Structure the questions.
- Prepare model answers.
- Agree on the marking scheme.
- Have papers independently marked by two or more examiners.
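The degree of agreement between independent examiners can itself be measured. Here is a minimal sketch (not from the slides; the examiners' marks are hypothetical) using Cohen's kappa, which corrects the raw agreement rate for agreement expected by chance:

```python
# Cohen's kappa: chance-corrected agreement between two independent examiners.
def cohens_kappa(rater_a, rater_b):
    """kappa = (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # observed agreement
    p_e = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
              for c in categories)                           # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical pass/fail judgments on the same six candidates
examiner_1 = ["pass", "pass", "fail", "pass", "fail", "pass"]
examiner_2 = ["pass", "fail", "fail", "pass", "fail", "pass"]
print(f"kappa = {cohens_kappa(examiner_1, examiner_2):.2f}")  # 1.0 = perfect agreement
```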

FEASIBILITY: considering the ground realities, an evaluation process should be feasible. Factors to consider in deciding feasibility:
- Time and resources required
- Availability of an equivalent form of the test for measuring reliability
- Ease of administration, scoring and interpretation

Aligning assessment with objectives: There are two major reasons for aligning assessments with learning objectives. First, alignment increases the probability that we will provide students with opportunities to learn and practise the knowledge and skills the assessments will require. Second, when assessments and objectives are aligned, "good grades" are more likely to translate into "good learning". When objectives and assessments are misaligned, many students will focus their efforts on activities that lead to good grades on assessments rather than on learning what we believe is important.

Keep the following questions in mind:
- What will the students' work on the activity (multiple-choice answers, essays, projects, presentations, etc.) tell me about their level of competence on the targeted learning objectives?
- How will my assessment of their work help guide students' practice and improve the quality of their work?
- How will the assessment outcomes for the class guide my teaching practice?

A systematically designed assessment should:
- Be relevant to the curriculum
- Focus on important skills
- Promote the learning of skills
- Specify the expected level of attainment
- Discriminate between good and poor students
- Provide feedback

Tips for framing questions:
- Clearly define the learning outcome to be assessed.
- Be precise and clear.
- Use explicit terms: identify, compare and contrast, define, give reasons, etc.
- Define the correct answer and scoring scheme in advance.
- Ensure all topics are covered.
- Avoid repetition.
- Have a colleague critically review the questions.

To sum up: know the principles of assessment, know the purpose of assessment, know the characteristics of assessment tools, use appropriate types of assessment, and align assessment with the objectives.

“Changing the examination system without changing the curriculum has a far more profound impact upon the nature of learning than changing the curriculum without altering the examination.” - G. E. Miller
“Whoever controls the examination controls the curriculum.”

Thank you.