Slide 1: Towards Systematic Reviews That Inform Healthcare Management and Policymaking
AcademyHealth Annual Research Meeting, Boston, MA, USA, 27 June 2005
John N. Lavis, MD, PhD
Associate Professor and Canada Research Chair in Knowledge Transfer and Uptake
Program in Policy Decision-Making, McMaster University

Slide 2: Acknowledgements
Co-investigators:
- Huw Davies, University of St. Andrews
- Andy Oxman, Norwegian HSR Centre
- Jean-Louis Denis, Université de Montréal
- Karen Golden-Biddle, University of Alberta
- Ewan Ferlie, University of London
Funders:
- Canadian Health Services Research Foundation
- NHS Service & Delivery Organization R&D Program

Slide 3: Overview
- Background
- Research objective
- Study design and population studied
- Principal findings
- Conclusions and implications

Slide 4: Background
Healthcare managers and policymakers face many questions that can be answered in part by research evidence:
- Finding effective (and cost-effective) solutions to the most burdensome health problems
- Fitting these solutions into health systems (i.e., governance, financial, and delivery arrangements)
- Bringing about change in health systems

Slide 5: Background (2)
Systematic reviews of research evidence:
- Reduce the likelihood that managers and policymakers will be misled by research (by being more systematic and transparent in the identification, selection, appraisal, and synthesis of studies)
- Increase confidence among managers and policymakers about what can be expected from an intervention (by increasing the number of units for study)

Slide 6: Background (3)
Systematic reviews of research evidence (continued):
- Allow managers, civil servants, and political staff to focus on appraising the local applicability of systematic reviews and on collecting and synthesizing other types of evidence, such as evidence about political acceptability and feasibility (i.e., allow them to focus on the apex of the research knowledge pyramid while doing the rest of their jobs)
- Allow for more constructive contestation of research evidence by stakeholders

Slide 7: Background (4)
The research knowledge pyramid, from apex to base:
- Actionable messages
- Systematic reviews of research
- Individual studies, articles, and reports
- Basic, theoretical, and methodological innovations

Slide 8: Research Objective
To identify ways to improve the usefulness of systematic reviews for healthcare managers and policymakers (ways that could then be evaluated prospectively), by exploring:
- The nature of decision-making and the approach to research evidence
- The types of questions asked
- How research evidence is assessed
- How much value is placed on recommendations
- The optimal presentation of research evidence

Slide 9: Study Design and Population Studied
Study design:
- Systematic review of studies of decision-making by healthcare managers and policymakers
- Interviews with a purposive sample of healthcare managers and policymakers in Canada and the United Kingdom (N=29)
- Review of the websites of research funders, producers/purveyors of research, and journals that include healthcare managers and policymakers among their target audiences (N=45)

Slide 10: Study Design and Population Studied (2)
Population studied:
- A purposive sample of healthcare managers (or the senior staff of associations that seek to inform managers) in Ontario and England, and healthcare policymakers in the Canadian federal and Ontario provincial governments and the United Kingdom government
- Study participants were almost always drawn from the top ranks of their respective organizations (healthcare managers), departments (civil servants), or offices (political advisors)

Slide 11: Principal Findings: Systematic Review
- Individual-level interactions between researchers and healthcare policymakers increased the prospects for research use
- Timing and timeliness increased (and poor timing or lack of timeliness decreased) the prospects for research use
- Individuals' negative attitudes towards research evidence decreased the prospects for research use
- Individuals' lack of skills and expertise decreased the prospects for research use

Slide 12: Principal Findings (2): Interviews
- Most do not highly value systematic reviews as an information source
- Many have used systematic reviews to address many different types of questions
- Some identified that they would benefit from having contextual factors highlighted in order to inform assessments of a review's local applicability
- All would value information about the benefits, harms (or risks), and costs of interventions, the uncertainty associated with estimates, and variation in estimates by subgroup

Slide 13: Principal Findings (3): Interviews (2)
- Interviewees disagree about whether researchers should provide recommendations
- Almost all would value reports presented using something like a 1:3:25 format (one page of take-home messages, a three-page executive summary, and a 25-page full report)
- Some identified that they would value systematic reviews being made more readily available for retrieval when they are needed

Slide 14: Principal Findings (4): Website Review
- Attributes of the context in which the research was conducted were rarely provided
- Recommendations were often provided
- Reports using a graded-entry format (e.g., 1:3:25) were rare

Slide 15: Conclusions and Implications
Provisional answers to question 1 lead us to argue for:
- Thinking broadly about healthcare managers and policymakers as target audiences
- Demonstrating to them the value of systematic reviews
- Engaging them in the production and adaptation of systematic reviews
- Building their capacity to identify quality-appraised sources of systematic reviews and to appraise their local applicability

Slide 16: Conclusions and Implications (2)
Provisional answers to question 2 lead us to argue for:
- Producing reviews that address a broad array of questions
Provisional answers to question 3 lead us to argue for:
- Making available an online source of all types of quality-appraised reviews
- Identifying the benefits, harms (or risks), and costs of interventions, highlighting uncertainty, and describing any differential effects by subgroup
- Identifying contextual factors that may affect assessments of local applicability

Slide 17: Conclusions and Implications (3)
Provisional answers to question 4 lead us to argue for:
- Not providing recommendations
- Avoiding the use of jargon
Provisional answers to question 5 lead us to argue for:
- Producing user-friendly "front ends" for reviews (e.g., one page of take-home messages and a three-page executive summary) to facilitate rapid assessments of a review's relevance and, when the review is deemed highly relevant, more graded entry into its full details

Slide 18: Conclusions and Implications (4)
Researchers could make three changes to how they produce and update systematic reviews:
- Involve healthcare managers and policymakers in posing questions, reviewing the approach, and interpreting results
- For systematic reviews about "what works," identify the benefits and harms (or risks) of interventions, highlight uncertainty, and describe any differential effects by subgroup
- Identify contextual factors that may affect assessments of local applicability

Slide 19: Conclusions and Implications (5)
Research funders could support three types of local adaptation processes:
- Developing more user-friendly "front ends" for reviews
- Adding local value to systematic reviews about "what works" (by describing the benefits, harms or risks, and costs that can reasonably be expected locally) and to any type of systematic review (by using language that is locally applicable)
- Making user-friendly "front ends" of systematic reviews available through an online database that can be linked to the full reviews through other sources, such as The Cochrane Library

Slide 20: References
- Lavis JN, Davies HTO, Oxman A, Denis J-L, Golden-Biddle K, Ferlie E. Towards systematic reviews that inform healthcare management and policymaking. Journal of Health Services Research and Policy; in press.
- Lavis JN, Becerra Posada F, Haines A, Osei E. Use of research to inform public policymaking. The Lancet 2004; 364:

Slide 21: Contact Information
John N. Lavis
Program in Policy Decision-Making, McMaster University