From Implementing Science to Implementation Science: Optimizing Benefits of the Treat All Era in the HIV Response
Elvin Geng, MD, MPH, Associate Professor, UCSF
Overview
- Public health practitioners (governments, funders, communities, NGOs) are entering an ambitious era of treating all
- Stakeholders in the implementation process are uniquely positioned to carry out implementation science
- Such science can generate knowledge that optimizes public health practice both locally and in other settings
- The right partnerships can catalyze relevant, timely, and useful knowledge to address the challenges of treating all
From Implementing Science…
Traditional view: "Implementing Science"
- Do "science" (e.g., the INSIGHT START randomized trial)
- Implement the results (WHO guidelines to treat all)
N Engl J Med 2015; 373 (August 27, 2015)
….to a Science of Implementation?
But trials don’t tell us important things about implementation. Implementation science asks about:
- Demand: patient perspectives, desires, priorities
- Supply: implement-ability; fit with health systems
- Real-world effects ("effectiveness"): intended and unintended effects
Paradigm shift: no "lab," no "system" — the system is the lab
From Implementing Science: Treat All
Randomized trials show that treating all persons has clinical benefits:
- INSIGHT START trial
- HPTN 052
- Other studies
WHO guidelines: treat all; countries are adopting the WHO guidelines
….Implementation Science for Treat All
- Assess and create demand for ART among patients starting with high CD4 levels
- Assess and strengthen health systems for delivery: supply chain, differentiated service delivery, human resources for health
- Real-world effects of treat all?
  - Good effects (e.g., IPT)
  - Unintended consequences (does same-day, rushed counseling lead to poor retention?)
- Is the change package correct?
  - Variability in uptake
  - Can we identify factors associated with success with fast start?
  - What kinds of support should be put in place?
Case Study in Implementation Research: Demand
What does treatment mean to patients?
- The prospect of side effects, opportunity costs, and unfriendly interactions with staff… or freedom from fear of disease progression, and guidance from and engagement with professionals?
How do they want to get treatment?
- Which differentiated service delivery model is most desired, and by how much? Does it differ by patient group?
Barriers among Patients Lost to Follow-up in Zambia (N = 603)
Survey of a random sample of patients lost to follow-up; reported barriers grouped as structural, psychosocial, and clinic-based. Sikazwe TUPED1291
“Do you prefer going to Clinic A, Clinic B, or would you rather not go to either one, given the circumstances?” To further quantify preferences, in a subsample of the patients we sought, we also conducted a choice experiment, which can provide information about the relative strength of patient preferences. In this choice experiment, patients were presented with a series of paired clinic profiles based on five attributes: time spent at the facility, distance, supply of ART, hours, and staff attitude. Zanolini IHEA 2017
Preferences among HIV Patients in Zambia: Choice Experiment (N = 272)
Here the actual numbers aren’t crucial; rather, we want to look at relative magnitudes. For example, nice staff is the most important quality of a clinic according to patients’ stated preferences. Next in line is longer refill intervals: patients preferred receiving a refill every 3 months versus every month, and every 5 months was preferred to every 3 months. Additional opening hours, shorter distance, and shorter wait times were also preferable, but not nearly as much as nice staff and longer refill periods. This was not an altogether expected finding and really brought our attention to this issue. The model was a mixed logit. Zanolini IHEA 2017
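The intuition behind such a choice experiment can be sketched with a toy conditional logit model. All part-worth coefficients and attribute names below are invented for illustration; they are not the mixed-logit estimates from the Zambia study.

```python
import math

# Hypothetical part-worth utilities for clinic attributes (illustrative only).
BETA = {
    "nice_staff": 1.5,      # staff attitude: largest stated-preference weight
    "refill_months": 0.3,   # per additional month between ART refills
    "extra_hours": 0.2,     # clinic open outside normal hours
    "distance_km": -0.05,   # per km to the facility
    "wait_hours": -0.10,    # per hour spent at the facility
}

def utility(clinic):
    """Deterministic utility of one clinic profile, linear in attributes."""
    return sum(BETA[k] * v for k, v in clinic.items())

def choice_prob(clinic_a, clinic_b):
    """Logit probability that a respondent chooses clinic A over clinic B."""
    ua, ub = utility(clinic_a), utility(clinic_b)
    return math.exp(ua) / (math.exp(ua) + math.exp(ub))

# Clinic A: nicer staff and 3-month refills, but farther away.
a = {"nice_staff": 1, "refill_months": 3, "extra_hours": 0, "distance_km": 10, "wait_hours": 2}
# Clinic B: closer, extra hours, but monthly refills and ordinary staff.
b = {"nice_staff": 0, "refill_months": 1, "extra_hours": 1, "distance_km": 2, "wait_hours": 1}

p = choice_prob(a, b)  # most respondents pick the friendlier, longer-refill clinic
```

With these made-up weights, staff attitude and refill length dominate distance and wait time, mirroring the qualitative pattern the slide describes: respondents trade off extra travel for nicer staff and fewer clinic visits.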
Dommaraju 2017
Implementation research: causal inference about implementation
Randomized trials? Expensive; lengthy; sometimes stripped of context.
Natural experiments:
- Arise naturally when policy changes
- Can be (almost) as rigorous as a randomized trial if well designed
- Emerge from practice and context, and are therefore potentially more relevant
Implementation of treat all represents an opportunity: a supplement to routine M&E (monitoring and evaluation) activities
Effect of Guideline Change in the Real World
The effects of changing eligibility:
- What is the change in ART uptake among those newly eligible? Among all patients?
- What percent start the same day?
- What is the effect of eligibility on retention?
The effects of starting ART due to the change in guidelines:
- Does starting influence retention? Does starting the same day influence retention?
- Are there spillover effects?
Investment in causal attribution:
- Perhaps not everything needs causal attribution, but some things do
- Same-day start: is that the best way to get people on treatment? More people on treatment but faster loss to follow-up? Fewer people on treatment?
- Does it vary? (Does same-day start increase retention in some circumstances while decreasing it in others?) Very plausible, depending on how you start
Regression discontinuity – strategy for understanding effects of changing eligibility
New policy implemented (e.g., guideline change). Timeline: January 2016, March 2016, July 2016, September 2016. Within a narrow window, people are similar on either side of the event, and the comparison is plausibly causal. Mody TUAD0101
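The regression discontinuity logic above can be sketched on simulated data. Everything here is invented for illustration: the running variable is enrollment week relative to the guideline change, and a "true" jump of 0.25 in ART uptake is built into the simulation so the estimator has something to recover.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated cohort: x = enrollment week relative to the guideline change (cutoff at 0).
n = 5000
x = rng.uniform(-26, 26, n)
treat_all = (x >= 0).astype(float)  # enrolled under Treat All guidelines

# ART uptake probability: smooth secular trend plus an assumed 0.25 jump at the cutoff.
p_uptake = 0.45 + 0.002 * x + 0.25 * treat_all
y = rng.binomial(1, p_uptake).astype(float)

def rd_estimate(x, y, bandwidth=10.0):
    """Local linear RD: fit separate lines within the bandwidth on each side
    of the cutoff and take the difference of the intercepts at x = 0."""
    def intercept_at_cutoff(mask):
        X = np.column_stack([np.ones(mask.sum()), x[mask]])
        coef, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
        return coef[0]  # predicted uptake at x = 0 from this side

    left = (x < 0) & (x > -bandwidth)
    right = (x >= 0) & (x < bandwidth)
    return intercept_at_cutoff(right) - intercept_at_cutoff(left)

effect = rd_estimate(x, y)  # should land near the simulated jump of 0.25
```

The key design choice is the bandwidth: a narrow window makes the "similar people on either side" assumption more credible but leaves fewer observations, so the estimate gets noisier.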
Instrumental variable - the effect of same-day ART initiation
Change in ART eligibility. Randomized trials suggested same-day ART start is taken up and improves outcomes, but in a controlled setting. The effect of same-day ART start due to the change in guidelines is hard to know, because common causes of same-day start and retention (e.g., social support) are unmeasured.
An IV analysis enables us to estimate the effect of ART initiation on retention under the assumption that the change in eligibility is related to retention only through ART initiation.
Diagram: rapid ART initiation → retention (effect unknown), with unmeasured confounders (e.g., social support) affecting both.
We leveraged this design and used implementation of the new guidelines as an instrumental variable to estimate the real-world effect of ART initiation on retention in care. An IV analysis can provide unbiased estimates when there is expected to be unmeasured confounding between a treatment (here, ART initiation) and the outcome (retention in care), provided certain assumptions are met:
- The instrument (implementation of the guidelines) is associated with the treatment
- There are no common causes of the instrument and the outcome
- The instrument affects the outcome only through the treatment of interest
In this situation, we can estimate the effect of ART initiation on retention in care for those patients who initiated ART in response to the change in guidelines. This is also called the local average treatment effect. Mody TUAD0101
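The IV mechanics described above reduce, for a binary instrument, to the Wald estimator: the reduced-form effect of the guideline change on retention divided by its first-stage effect on same-day start. The simulation below is entirely invented (effect sizes, the social-support confounder, sample size); it only illustrates why the naive comparison is biased while the IV estimate recovers the assumed effect.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# Unmeasured confounder: social support raises both same-day start and retention.
support = rng.binomial(1, 0.5, n)

# Instrument: enrolled after the Treat All guideline change (assumed independent of support).
z = rng.binomial(1, 0.5, n)

# Treatment: same-day ART start, driven by the guideline change AND by support.
d = rng.binomial(1, 0.2 + 0.4 * z + 0.2 * support)

# Outcome: retention. Assumed true effect of same-day start is +0.10; support adds +0.25.
y = rng.binomial(1, 0.4 + 0.10 * d + 0.25 * support)

# Naive comparison: confounded upward, because supported patients both start
# same-day more often and are retained more often.
naive = y[d == 1].mean() - y[d == 0].mean()

# Wald/IV estimate: reduced form divided by first stage.
reduced_form = y[z == 1].mean() - y[z == 0].mean()  # guideline change -> retention
first_stage = d[z == 1].mean() - d[z == 0].mean()   # guideline change -> same-day start
iv = reduced_form / first_stage                     # close to the assumed +0.10
```

As the slide notes, this recovers a local average treatment effect: the effect among patients who started same-day because of the guideline change, not among everyone.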
Conclusion
- Treating all is clinically beneficial
- Implementation science can optimize those benefits in the real world
- Stakeholders and actors in the implementation process are uniquely positioned to produce relevant and rigorous science (as well as consume scientific findings)
- Investments in collaborations among implementers, governments, and researchers can catalyze this change
From “Lab” and “System” Gap to…
Will it be the same?
…Learning Systems for Health
Thank you
Karen Webb and others from OPHID
Paul Volberding, Warner Greene, Lauren Sterling, and the UCSF CFAR
Charles Holmes, Izukanji Sikazwe, Kombatende Sikombe, Njekwa Mukamba, Carolyn Bolton, and others at CIDRZ in Zambia
Aaloke Mody, Monika Roy, David Glidden, Diane Havlir, and others at UCSF
Funders: NIH, Bill and Melinda Gates Foundation, UCSF CFAR
Dr. Nancy Czaicki