Researching medical education
Imperial School of Medicine Faculty Teaching Forum, 2 May 2012
Dylan Wiliam

Pasteur’s quadrant

Educational research
- "An elusive science" (Lagemann)
  - A search for disciplinary foundations
- Making social science matter (Flyvbjerg, 2001)
  - Contrast between analytic rationality and value-rationality
  - Physical science succeeds when it focuses on analytic rationality
  - Social science
    - fails when it focuses on analytic rationality, but
    - succeeds when it focuses on value-rationality
- Reasonableness, rather than rationality (Toulmin), as the key criterion

Research methods 101: causality
- Does X cause Y?
- Given X, Y happened (factual)
  - Problem: post hoc ergo propter hoc
- If X had not happened, Y would not have happened (counterfactual)
  - Problem: X did happen
- So we need to create a parallel world where X did not happen
  - Same group, different time (baseline measurement)
    - Need to assume stability over time
  - Different group, same time (control group)
    - Need to assume groups are equivalent
  - Randomized controlled trial
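For readers who like to see the logic run, the sketch below simulates the "parallel world" idea: random assignment makes the control group stand in for the counterfactual, so the difference in group means estimates the causal effect. This is a minimal illustration; the function name and the numbers are assumptions for the example, not part of the talk.

```python
import random

random.seed(1)

def simulate_rct(n=1000, true_effect=0.3):
    """Illustrative only: random assignment makes the control group a
    stand-in for the counterfactual world in which X did not happen."""
    treated, control = [], []
    for _ in range(n):
        baseline = random.gauss(0, 1)   # unobserved differences between people
        if random.random() < 0.5:       # X assigned at random
            treated.append(baseline + true_effect)
        else:
            control.append(baseline)
    # Difference in group means estimates the causal effect of X on Y
    return sum(treated) / len(treated) - sum(control) / len(control)

print(round(simulate_rct(), 2))  # should come out close to the true effect of 0.3
```

With the baseline measurement or control-group designs mentioned above, the same estimate is only as good as the stability or equivalence assumptions; randomization is what makes the equivalence assumption credible.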

Plausible rival hypotheses
- Example: smoking cigarettes causes lung cancer
  - Randomized controlled trial not possible
  - Have to rely on other methods
- Logic of inference-making
  - Establish the warrant for chosen inferences
  - Establish that plausible rival interpretations are less warranted

Criteria for causal inferences
"The environment and disease: association or causation?" (Hill, 1965)
Criteria for determining a causal association:
1. strength
2. consistency
3. specificity
4. temporality
5. biological gradient
6. plausibility
7. coherence
8. experimental evidence
9. analogy

Knowledge
- Not justified true belief
- Discriminability (Goldman, 1976)
  - Elimination of plausible rival hypotheses
- Building knowledge involves:
  - marshalling evidence to support the desired inference
  - eliminating plausible rival interpretations
- 'Plausible' determined by reference to a theory, a community of practice, or a dominant discourse

Inquiry systems (Churchman, 1971)

System        Evidence
Leibnizian    Rationality
Lockean       Observation
Kantian       Representation
Hegelian      Dialectic
Singerian     Values, ethics and practical consequences

Inquiry systems
The Lockean inquirer displays the ‘fundamental’ data that all experts agree are accurate and relevant, and then builds a consistent story out of these. The Kantian inquirer displays the same story from different points of view, emphasising thereby that what is put into the story by the internal mode of representation is not given from the outside. But the Hegelian inquirer, using the same data, tells two stories, one supporting the most prominent policy on one side, the other supporting the most promising story on the other side. (Churchman, 1971, p. 177)

Singerian inquiry systems
The ‘is taken to be’ is a self-imposed imperative of the community. Taken in the context of the whole Singerian theory of inquiry and progress, the imperative has the status of an ethical judgment. That is, the community judges that to accept its instruction is to bring about a suitable tactic or strategy [...]. The acceptance may lead to social actions outside of inquiry, or to new kinds of inquiry, or whatever. Part of the community’s judgement is concerned with the appropriateness of these actions from an ethical point of view. Hence the linguistic puzzle which bothered some empiricists—how the inquiring system can pass linguistically from “is” statements to “ought” statements—is no puzzle at all in the Singerian inquirer: the inquiring system speaks exclusively in the “ought,” the “is” being only a convenient façon de parler when one wants to block out the uncertainty in the discourse. (Churchman, 1971, p. 202)

Educational research
…can be characterised as a never-ending process of assembling evidence that:
- particular inferences are warranted on the basis of the available evidence;
- such inferences are more warranted than plausible rival inferences;
- the consequences of such inferences are ethically defensible.
The basis for warrants, the other plausible interpretations, and the ethical bases for defending the consequences are themselves constantly open to scrutiny and question.

Effects of feedback
Kluger & DeNisi (1996):
- Review of 3,000 research reports
- Excluding those:
  - without adequate controls
  - with poor design
  - with fewer than 10 participants
  - where performance was not measured
  - without details of effect sizes
- This left 131 reports and 607 effect sizes
- On average, feedback does improve performance, but:
  - effect sizes were very different in different studies
  - 38% (50 out of 131) of effect sizes were negative
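A hypothetical sketch of the same point: an overall positive average effect can coexist with a large share of negative effects. The numbers below are invented for illustration and are not Kluger and DeNisi's data.

```python
import random

random.seed(2)

# Invented effect sizes for 131 hypothetical studies: mean benefit around 0.4 sd,
# but with wide between-study variation (not Kluger & DeNisi's actual figures).
effects = [random.gauss(0.4, 1.0) for _ in range(131)]

mean_effect = sum(effects) / len(effects)
negative_share = sum(e < 0 for e in effects) / len(effects)

print(f"mean effect size: {mean_effect:.2f}")          # positive on average
print(f"negative effect sizes: {negative_share:.0%}")  # yet a sizeable minority are negative
```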

Getting feedback right is hard

Response type      Feedback indicates performance…
                   exceeds goal               falls short of goal
Change behavior    Exert less effort          Increase effort
Change goal        Increase aspiration        Reduce aspiration
Abandon goal       Decide goal is too easy    Decide goal is too hard
Reject feedback    Feedback is ignored        Feedback is ignored

Kinds of feedback (Nyquist, 2003)
- Weaker feedback only: knowledge of results (KoR)
- Feedback only: KoR + clear goals or knowledge of correct results (KCR)
- Weak formative assessment: KCR + explanation (KCR+e)
- Moderate formative assessment: (KCR+e) + specific actions for gap reduction
- Strong formative assessment: (KCR+e) + activity

Effects of formative assessment (HE)

Kind of feedback                 Count    Effect/sd
Weaker feedback only
Feedback only
Weak formative assessment
Moderate formative assessment
Strong formative assessment      16       0.56
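For readers unfamiliar with the Effect/sd column: it is a standardized mean difference. One common form, shown here as a general reminder rather than the exact estimator used in the Nyquist review, is:

```latex
d = \frac{\bar{X}_{\text{feedback}} - \bar{X}_{\text{comparison}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}
```

On this scale, the 0.56 for strong formative assessment means those groups scored roughly half a standard deviation higher than their comparison groups, on average.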