Use of critical appraisal as a tool for peer instruction and assessment in post-graduate Epidemiology
Prof Philip Baker (1), Dr Daniel Demant (1,2), Daniel Francis (1) & Abby Cathcart (1)
(1) Queensland University of Technology, Brisbane; (2) University of Technology Sydney

Presentation
- Critical appraisal: what it is, why use it for assessment, pitfalls
- Peer instruction
- Integration of critical appraisal and peer instruction
- Results
- Future work

Aropä?

Trustworthiness = critical appraisal
Critical appraisal may be defined as: "The process of systematically examining research evidence to assess its validity, results and relevance before using it to inform a decision." (Alison Hill, Critical Appraisal Skills Programme, Institute of Health Sciences, Oxford; http://www.evidence-based-medicine.co.uk)
What this means:
- An assessment of the validity of claims about public health problems and interventions
- An evaluation of the validity of past research studies/evaluations (not an evaluation of the intervention itself)

Why use critical appraisal for evaluation?

Pitfalls
- Essay: no need for discussion; potential for contract cheating
- Group work: uneven contribution

Critical appraisal in the real world: we need to appraise what we read
- Journal clubs
- Research work
- Workplaces where results are critically discussed
- Systematic reviews
- Other?

Pedagogy of Peer Instruction
Engages students in constructing their own understanding of concepts:
1. Students individually respond to a question
2. Discuss with peers
3. Respond again to the same question
(Mazur, 1997; Crouch & Mazur, 2001)

Peer instruction (Mazur, 1997)
1. Questions posed
2. Students given time to think
3. Students record individual answers
4. Students convince their neighbour (peer instruction)
5. Students record revised answers
6. Feedback to teacher
7. Instructor's explanation of the correct answer

Peer instruction
- Concept questions: higher-order questions yield better student results than recall questions (Finkelstein)
- Individual thinking? (Nichol & Boyle 2003; Smith 2009)
- Are clickers necessary?
- Extension to a critical appraisal exercise?
The best type of questions for peer instruction are concept questions; evidence shows students learn more effectively with these questions when peer instruction is applied. In the second step of peer instruction, students are asked to think individually: Nichol and Boyle (2003) found that 82% of students preferred recording an individual response, because it forced them to think about and commit to an answer, leaving them more active and engaged during the peer discussion, and 90% said it led to deeper thinking about the topic. A group discussion after individual thinking deepens that thinking further. Peer instruction is most often run in the classroom with clickers, but flashcard-based and paper-based approaches to the method have also been reported.

Post-graduate Epidemiology at QUT
Can peer instruction (PI) principles be replicated in a paper-based exercise to enhance a critical appraisal assessment?

Introducing...
Quality Assessment Tool for Quantitative Studies (Effective Public Health Practice Project)
Tool: https://merst.ca/wp-content/uploads/2018/02/quality-assessment-tool_2010.pdf
Dictionary: https://merst.ca/wp-content/uploads/2018/02/qualilty-assessment-dictionary_2017.pdf

Selection bias
Recruiting the study population: differences in the way patients are accepted or rejected for a trial, and in the way interventions are assigned to individuals. This is difficult in public health studies.
The major difference between trials and observational studies has to do with selection bias and the need to identify and account for potential confounders in observational studies. Non-randomised studies may produce groups that are unbalanced at the beginning of the study, so that differences in outcomes cannot confidently be attributed to the effects of the intervention, and it is not possible to adjust for unknown or unmeasured confounding variables.

Component A: Selection Bias
A) Are the individuals selected to participate in the study likely to be representative of the target population?
1. Very likely
2. Somewhat likely
3. Not likely
4. Can't tell

Component A: Selection Bias
B) What percentage of the selected individuals/schools, etc. agreed to participate?
1. 80-100% agreement
2. 60-79% agreement
3. Less than 60% agreement
4. Not applicable
5. Can't tell

SELECTION BIAS (see EPHPP dictionary)
- Good/Strong: the selected individuals are very likely to be representative of the target population (Q1 is 1) and there is greater than 80% participation (Q2 is 1).
- Fair/Moderate: the selected individuals are at least somewhat likely to be representative of the target population (Q1 is 1 or 2) and there is 60-79% participation (Q2 is 2). 'Moderate' may also be assigned if Q1 is 1 or 2 and Q2 is 5 (can't tell).
- Poor/Weak: the selected individuals are not likely to be representative of the target population (Q1 is 3); or there is less than 60% participation (Q2 is 3); or selection is not described (Q1 is 4) and the level of participation is not described (Q2 is 5).
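In code, the component rating reduces to a small decision rule. Below is a minimal sketch in Python; the function name and the integer encoding of the answer options are our own shorthand (the EPHPP dictionary remains the authoritative statement of the rules), and combinations the slide leaves unspecified default to "Weak" here.

def rate_selection_bias(q1: int, q2: int) -> str:
    # Q1 (representativeness): 1 = very likely, 2 = somewhat likely,
    #                          3 = not likely, 4 = can't tell
    # Q2 (participation):      1 = 80-100%, 2 = 60-79%, 3 = less than 60%,
    #                          4 = not applicable, 5 = can't tell
    if q1 == 1 and q2 == 1:
        return "Strong"    # very likely representative, >80% participation
    if q1 in (1, 2) and q2 in (2, 5):
        return "Moderate"  # at least somewhat representative, with 60-79%
                           # participation or participation unknown
    # Not likely representative, <60% participation, or selection and
    # participation not described; unspecified combinations also land here.
    return "Weak"

print(rate_selection_bias(2, 5))  # -> "Moderate"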

Classroom with PI:
- Teach study design and risks of bias with peer instruction using 'clickers'; students learn how to discuss
- Introduce the EPHPP critical appraisal tool
- In-class RCT simulation 'Live the Trial'
- Practice use of the tool during the RCT, with peer instruction
Assessment with PI:
- Find a discussion partner and sign up as a pair
- Individually complete the appraisal of the assigned study with the EPHPP tool
- Upload a substantial pre-discussion draft to Turnitin by the due date
- Discuss each item of the EPHPP tool with the peer (peer instruction)
- Combine discussion and results and submit as a pair (same grade) via Turnitin, or modify the work and submit individually via Turnitin

Approaches to Learning & Teaching in public health
- Learning epidemiology can be "dull and boring": make it engaging and fun
- Mock randomised controlled trial (ref: Baker 2017, APJPH)
- Move past passive learning environments

Peer instruction: independent work, structured essay
- Prepare an essay using a critical appraisal tool (EPHPP)
- Work is first done independently, as systematic reviewers do
- All students must upload a copy of their draft to Turnitin
- Students then swap drafts and discuss agreements and disagreements
- Students can then submit combined or as individuals

Peer instruction
- Students must understand and apply core concepts to problems, not memorise
- Students must argue their differences with peers; pedagogy (Mazur) holds that students with a misunderstanding tend to unravel it when they try to explain their reasoning
- Students learn through discussion
- Reflection on the process is required

Peer instruction
- Mimics the systematic review process
- Drafts are checked
- Students who didn't do the work can't engage in a meaningful discussion
- Copying from another student is easy to identify through Turnitin (several students caught for misconduct)
- Purchased essays show tell-tale signs: the partner reports an "inability to engage in a meaningful discussion", there is no change from draft to final submission, and the writing style is not consistent throughout the essay

Analysis: what are the possible effects of discussion on students' grades?
Students who did well on the mid-term exam (>65%) paired with students who did poorly (<60%), n = 19 of 94 enrolled students:
- 21% received a higher grade (>5%)
- 47.4% received a lower grade (< -5%)
- 31.6% saw no change in their grade (-4% to +4%)
A grade may go down (why?)

Analysis: what are the possible effects of discussion on students' grades?
Students who did 'poorly' on the mid-term exam (<60%) paired with students who did 'well' (>65%), n = 18 of 94 enrolled students:
- 81.5% received a higher grade (>5%)
- 3.7% received a lower grade (< -5%)
- 14.8% saw no change in their grade (-4% to +4%)
This benefits previously poor-performing students.
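The banding used in both analyses is a simple threshold rule. A minimal sketch in Python, assuming grade changes are measured in whole percentage points (our assumption: the slides do not say where fractional changes, or changes of exactly 5 points, should fall):

def classify_grade_change(delta: int) -> str:
    # delta: change in the student's grade, in whole percentage points
    # (the slides do not specify the baseline the change is measured against)
    if delta > 5:
        return "higher"     # improved by more than 5 points
    if delta < -5:
        return "lower"      # dropped by more than 5 points
    return "no change"      # -4 to +4; boundary values of +/-5 also land here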

Further research
- In-depth analysis of grades over 4 semesters
- Student online survey on perceptions and experience

Conclusions
- Critical appraisals present an authentic form of assessment for students
- Potential to improve student understanding of key concepts
- Pitfalls can be reduced