The search for internal validity in improvement
Frank Davidoff
Learning Lab – 2013 IHI Forum

Improvement has a two-part mantra
– Part 1: All improvement involves change
– Part 2: Not all change is improvement

Local project, part 1: make change
“Here’s how we made (system-level) change happen…”
– Identified a dysfunction in the system
– Came up with an innovation (a better process, plus a change strategy for getting there)
– Implemented our strategy in the local context
– Used small tests of change to refine the innovation
– Spread and maintained the changes
(Way different from giving a pill…) So what’s next?

Local project, part 2: find out whether change is improvement
“Here’s how we learned whether our change was improvement…”
– Chose outcomes: processes, patients’ clinical condition
– Developed outcome measures
– Created informal systems for collecting, displaying, and using outcomes data (quantitative and qualitative; a run-chart sketch follows this slide)
– Used these data locally to study the impact of changes and to modify the change strategy
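
A minimal illustration, not from the slides, of the kind of informal data display mentioned above: a run chart of a single outcome measure over time, sketched in Python with matplotlib. The weekly_errors values and labels are hypothetical.

    # Run chart sketch (hypothetical data): plot an outcome measure week by week
    # with its median as a reference line - the informal display a local team
    # might use to judge whether a change looks like an improvement.
    from statistics import median
    import matplotlib.pyplot as plt

    weekly_errors = [12, 11, 13, 10, 9, 7, 8, 6, 6, 5]  # e.g., medication errors per week

    weeks = range(1, len(weekly_errors) + 1)
    plt.plot(weeks, weekly_errors, marker="o", label="errors per week")
    plt.axhline(median(weekly_errors), linestyle="--", label="median")
    plt.xlabel("Week")
    plt.ylabel("Count")
    plt.title("Run chart of an outcome measure (hypothetical)")
    plt.legend()
    plt.show()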

Yes, Virginia, there is “study” in local improvement projects
Informal study is an inherent part of all meaningful improvement:
– Used to check on the impact of change (“Did it work?”)
– Especially visible in “Plan-Do-Study-Act” cycles (originally called “Plan-Do-Check-Act” cycles)
– Not related to whether the project is meant “for publication”

Where have we gotten in our local project?
In part 1: we made change happen
In part 2: we produced informal outcomes data to demonstrate improvement
– Good enough: allows project staff to modify, spread, and maintain the change
– BUT data quality (completeness, accuracy) is uncertain, and there is no control for confounders or biases – “lite” study data
Result: weak internal validity!
– i.e., unlikely to convince skeptics elsewhere that the change was an improvement

How can we strengthen the evidence?
Shift “up” to formal planning and study:
– Identify a plausible theory of performance change
– Adopt a specific study design
– Select/define relevant outcomes
– Develop a reliable data collection process and robust data quality control
– Analyze results (e.g., grounded theory; statistics; time series, especially statistical process control – a control-chart sketch follows this slide)
Creates “research level” data
Result: stronger internal validity!
– i.e., more likely to convince editors, peer reviewers, and the rest of the world that our change was really an improvement
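
To make the statistical process control step concrete, here is a minimal sketch, not from the slides, that computes individuals (XmR) control-chart limits for a time series of outcome rates; the monthly_rates values are hypothetical.

    # XmR (individuals) control chart limits: a common statistical process control
    # analysis for improvement time series. Points outside the limits, or sustained
    # shifts and runs, signal special-cause variation rather than routine noise.
    monthly_rates = [4.2, 3.9, 4.5, 4.1, 3.0, 2.8, 2.6, 2.7, 2.5, 2.4]  # hypothetical rates

    centre = sum(monthly_rates) / len(monthly_rates)
    moving_ranges = [abs(b - a) for a, b in zip(monthly_rates, monthly_rates[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)

    # Standard XmR constant: limits sit 2.66 average moving ranges from the centre line
    ucl = centre + 2.66 * mr_bar
    lcl = max(0.0, centre - 2.66 * mr_bar)  # a rate cannot be negative

    print(f"centre line = {centre:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
    outside = [x for x in monthly_rates if x > ucl or x < lcl]
    print("points outside control limits:", outside)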

Summing up
Making change locally includes informal study of outcomes
– Useful: makes the project possible, but the data are somewhat “fuzzy”
– Result: internal validity is weak
Formal study of the change process and outcomes requires “research level” methods
– Scholarly: contributes to general knowledge
– Not every formal study feature is required, but the more features, the better
– Result: internal validity is stronger

Internal validity in improvement studies: key reference
Solberg L, et al. The three faces of performance measurement: improvement, accountability, and research. Joint Comm J Qual Improvement 1997;23: