“Sterling Examples of Computer Simulations & OSCEs (Objective Structured Clinical Examinations)” by Carol O’Byrne, Jeffrey Kelley, Richard Hawkins, Sydney Smee.

Presentation transcript:

“Sterling Examples of Computer Simulations & OSCEs (Objective Structured Clinical Examinations)”
Carol O’Byrne, Jeffrey Kelley, Richard Hawkins, Sydney Smee
Presented at the 2005 CLEAR Annual Conference, Phoenix, Arizona (September 2005)

Session Format
- Introduction: 25+ years of performance assessment
- Presentations:
  - Richard Hawkins (National Board of Medical Examiners): overview of a new national OSCE
  - Jeff Kelley (Applied Measurement Professionals): development of a new real estate computer simulation
  - Sydney Smee (Medical Council of Canada): setting performance standards for a national OSCE
  - Carol O’Byrne (Pharmacy Examining Board of Canada): scoring performance and reporting results to candidates for a national OSCE
- Q&A

Session Goals
- Consider the role and importance of simulations in a professional qualifying examination context
- Explore development and large-scale implementation challenges
- Observe how practice analysis results are integrated with the implementation of a simulation examination
- Consider options for scoring, standard setting and reporting to candidates
- Consider means to enhance fairness and consistency
- Identify issues for further research and development

Defining ‘Performance Assessment’
“...the assessment of the integration of two or more learned capabilities... i.e., observing how a candidate performs a physical examination (technical skill) is not performance-based assessment unless findings from the examination are used for purposes such as generating a problem list or deciding on a management strategy (cognitive skills)” (Mavis et al., 1996)

Why Test Performance?
- To determine whether individuals can ‘do the job’:
  - integrating knowledge, skills and abilities to solve complex client and practice problems
  - meeting job-related performance standards
- To complement multiple-choice (MC) tests:
  - measuring important skills, abilities and attitudes that are difficult or impossible to measure through MCQs alone
  - reducing the impact of factors such as cuing, logical elimination, and luck or chance that may confound MC test results

A 25+ Year Spectrum of Performance Assessment
- ‘Pot luck’ direct observation: apprenticeship, internship, residency programs
- Oral and paper-and-pencil short- or long-answer questions
- Hands-on job samples: military, veterinary medicine, mechanics, plumbers
- Portfolios: advanced practice, continuing competency

Simulations
- Electronic: architecture, aviation, respiratory care, real estate, nursing, medicine, etc.
- Objective Structured Clinical Examination (OSCE): medicine, pharmacy, physiotherapy, chiropractic, massage therapy, as well as the legal profession, psychology, and others

Simulation Promotes Evidence-based Testing
- 1900 Wright brothers flight test: flew a manned kite 200 feet in 20 seconds
- 1903 Wright brothers flight test: flew the manned, powered Flyer 852 feet in 59 seconds, 8 to 12 feet in the air
- In between, they built a wind tunnel to simulate flight under various wind direction and speed conditions, varying wing shapes, curvatures and aspect ratios to test critical calculations and glider lift
- Simulation let them assess performance in important and potentially risky situations without incurring actual risk

Attitudes, Skills and Abilities Tested through Simulations
- Attitudes: client centeredness; alignment with ethical and professional values and principles
- Skills: interpersonal and communication; clinical (e.g., patient/client care); technical
- Abilities to:
  - analyze and manage risk, exercise sound judgment
  - gather, synthesize and critically evaluate information
  - act systematically and adaptively, independently and within teams
  - defend, evaluate and/or modify decisions/actions taken
  - monitor outcomes and follow up appropriately

Performance / Simulation Assessment Design Elements
- Domain(s) of interest and sampling plan
- Realistic context: practice-related problems and scenarios
- Clear, measurable performance standards
- Stimuli and materials to elicit performance
- Administrative, observation and data collection procedures
- Assessment criteria that reflect standards
- Scoring rules that incorporate assessment criteria (see the sketch below)
- Cut scores/performance profiles reflecting standards
- Quality assurance processes
- Meaningful data summaries for reports to candidates and others
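To make the link between assessment criteria, scoring rules and reported scores concrete, here is a minimal sketch of one way an OSCE station score might combine an analytic checklist with a holistic global rating. The weights, item count and rating scale are hypothetical, chosen only for illustration, and are not drawn from the programs described in this session.

```python
# Minimal sketch of an OSCE station scoring rule.
# The items, weights and rating scale below are hypothetical,
# chosen only to show how checklist and global-rating evidence
# might be combined into a single station score.

def station_score(checklist: dict[str, bool],
                  global_rating: int,          # e.g. 1 (poor) .. 5 (excellent)
                  checklist_weight: float = 0.7,
                  rating_weight: float = 0.3) -> float:
    """Return a station score on a 0-100 scale."""
    checklist_pct = 100.0 * sum(checklist.values()) / len(checklist)
    rating_pct = 100.0 * (global_rating - 1) / 4   # rescale 1-5 onto 0-100
    return checklist_weight * checklist_pct + rating_weight * rating_pct

# Example: a candidate who completed 8 of 10 checklist items
# and received a global rating of 4.
items = {f"item_{i}": (i <= 8) for i in range(1, 11)}
print(round(station_score(items, global_rating=4), 1))   # 78.5
```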

Score Variability and Reliability
- Multiple factors interact and influence scores:
  - differential and compensatory aptitudes of candidates (knowledge, skills, abilities, attitudes)
  - format, difficulty and number of tasks or problems
  - consistency of presentation between candidates, locations, occasions
  - complex scoring schemes (checklists, ratings, weights)
  - rater consistency between candidates, locations, occasions
- Designs are often complex (not fully crossed): examinees ‘nested’ within raters, within tasks, within sites, etc.
- Problems and tasks are multidimensional

Analyzing Performance Assessment Data
- Generalizability (G) studies: to identify and quantify sources of variation
- Dependability (D) studies: to determine how to minimize the impact of error and optimize score reliability
- Hierarchical linear modeling (HLM) studies: to quantify and rank sources of variation in complex nested designs
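For the simplest fully crossed person-by-task (p × t) design, the coefficients these studies yield take the following standard forms; real OSCE designs are usually nested and add rater and site facets, so this is only an illustrative special case:

\[
E\hat{\rho}^2 \;=\; \frac{\hat{\sigma}^2_p}{\hat{\sigma}^2_p + \dfrac{\hat{\sigma}^2_{pt,e}}{n_t}},
\qquad
\Phi \;=\; \frac{\hat{\sigma}^2_p}{\hat{\sigma}^2_p + \dfrac{\hat{\sigma}^2_t + \hat{\sigma}^2_{pt,e}}{n_t}}
\]

Here \(\hat{\sigma}^2_p\) is the variance attributable to candidates (the ‘signal’), \(\hat{\sigma}^2_t\) to tasks, \(\hat{\sigma}^2_{pt,e}\) to the person-by-task interaction confounded with residual error, and \(n_t\) is the number of tasks assumed in the decision (D) study.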

Standard Setting
- What score or combination of scores (profile) indicates that the candidate is able to meet expected standards of performance, thereby fulfilling the purpose(s) of the test?
- What methods can be used to determine this standard?
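As one concrete illustration, the borderline-group method (a common OSCE standard-setting approach, though not necessarily the one used by the programs presented here) sets a station cut score at the mean score of candidates whom examiners judged ‘borderline’. A minimal sketch, with hypothetical field names and data:

```python
# Sketch of a borderline-group standard-setting calculation.
# Assumes each candidate record carries the examiner's global
# judgement ("fail", "borderline", "pass", ...) and a numeric
# station score; both field names are hypothetical.
from statistics import mean

def borderline_group_cut(records: list[dict]) -> float:
    """Cut score = mean station score of 'borderline' candidates."""
    borderline_scores = [r["score"] for r in records
                         if r["judgement"] == "borderline"]
    if not borderline_scores:
        raise ValueError("No borderline candidates at this station")
    return mean(borderline_scores)

records = [
    {"score": 42.0, "judgement": "fail"},
    {"score": 55.0, "judgement": "borderline"},
    {"score": 58.0, "judgement": "borderline"},
    {"score": 74.0, "judgement": "pass"},
]
print(borderline_group_cut(records))   # 56.5
```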

Reporting Results to Candidates
- Pass-fail (classification)
- May also include:
  - individual test score and passing score
  - sub-scores by objective(s) and/or other criteria
  - quantile standing among all candidates, or among those who failed
  - group data (score ranges, means, standard deviations)
  - reliability and validity evidence (narrative, indices and/or error estimates and their interpretation)
  - other
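A minimal sketch of how a few of these elements (classification, quantile standing and group statistics) might be assembled into a candidate report; the scores, passing score and field names are hypothetical:

```python
# Sketch of assembling a candidate result report.
# The scores, passing score and field names are hypothetical;
# a real report would also include sub-scores, reliability
# evidence and interpretive text.
from statistics import mean, stdev

def candidate_report(candidate_score: float,
                     all_scores: list[float],
                     passing_score: float) -> dict:
    below = sum(s < candidate_score for s in all_scores)
    return {
        "result": "pass" if candidate_score >= passing_score else "fail",
        "score": candidate_score,
        "passing_score": passing_score,
        "percentile": round(100.0 * below / len(all_scores)),
        "group_mean": round(mean(all_scores), 1),
        "group_sd": round(stdev(all_scores), 1),
    }

scores = [48.0, 52.5, 55.0, 61.0, 64.5, 70.0, 73.5]
print(candidate_report(61.0, scores, passing_score=58.0))
```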

Some Validity Questions
- Exactly what are we measuring with each simulation? Does it support the test purpose?
- To what extent is each candidate presented with the same or equivalent challenges?
- How consistently are candidates’ performances assessed, no matter who or where the assessor is?
- Are the outcomes similar to findings in other comparable evaluations?
- How ought we to inform and report to candidates about performance standards/expectations and their own performance strengths and gaps?

Evaluation Goals
- Validity evidence:
  - strong links from job analysis to interpretation of test results
  - simulation performance relates to performance in training and other tests of similar capabilities
  - reliable, generalizable scores and ratings
  - dependable pass-fail (classification) standards
- Feasibility and sustainability:
  - for program scale (number of candidates, sites, etc.)
  - economic, human, physical, technological resources
- Continuous evaluation and enhancement plan

Wisdom Bytes
- Simulations should be as true to life as possible (fidelity)
- Simulations should test capabilities that cannot be tested in more efficient formats
- Simulation tests should focus on integration of multiple capabilities rather than on a single basic capability
- The nature of each simulation/task should be clear, but candidates should be ‘cued’ only as far as is realistic in practice
- Increasing the number of tasks contributes more to the generalizability and dependability of results than increasing the number of raters (see the note below)
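The last point can be made explicit with the decision-study relative error term for a simplified crossed person × task × rater design (an illustrative case, not the nested designs described earlier):

\[
\hat{\sigma}^2_\delta \;=\; \frac{\hat{\sigma}^2_{pt}}{n_t} \;+\; \frac{\hat{\sigma}^2_{pr}}{n_r} \;+\; \frac{\hat{\sigma}^2_{ptr,e}}{n_t\, n_r}
\]

In performance assessments the person-by-task component \(\hat{\sigma}^2_{pt}\) (case specificity) is typically much larger than the person-by-rater component \(\hat{\sigma}^2_{pr}\), so increasing the number of tasks \(n_t\) shrinks the error variance, and hence raises generalizability, far more than increasing the number of raters \(n_r\).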

Expect the Unpredictable...
- Candidate diversity: language, training, test format familiarity, accommodation requests
- Logistical challenges: technological glitches, personnel fatigue and/or attention gaps, site variations
- Security cracks: test content exposure in prep programs and study materials, in various languages