
AHRQ Annual Conference September 2011

Thank-you to:
 Survey respondents
 Focus group participants
 Key informants
 AMIA and ACCP
 AHRQ
 Mary Nix
 Judi Consalvo
 Gov Delivery Staff
 Marjorie Shofer
 HIT Portfolio
 AMA and AHIP
 ECRI Institute
 NGC Evaluation PET
 Florence Chang
 Belinda Ireland
 Richard Shiffman
 Katrin Uhlig
 Cally Vinz

Evaluation Project Team Members
 AFYA, Inc.: Michelle Tregear, Jenice James, Debra Dekker, Craig Dearfield, Ajay Bhardwaj, Robin Pugh-Yi
 The Lewin Group: Carol Simon, Jaclyn Marshall, Jacob Epstein

 Gain a better understanding of how NGC:
  Is used by its stakeholders (including AWARENESS among key stakeholders)
  Supports dissemination of evidence-based clinical practice guidelines and related documents
  Has influenced efforts in guideline development, implementation, and use
  Can be improved

Mixed-Methods Approach

 Quantitative Data
  Web-based survey
   ▪ Skip logic and branching
   ▪ Respondents solicited through e-mail lists for AHRQ, AMA, and AHIP
 Qualitative Data
  Focus groups (4)
   ▪ Stakeholder-specific
  Key informant interviews (26)
   ▪ Mix of stakeholders

Key Project Milestones
 CIPP evaluation framework: logic model used to develop key questions
 Developed instruments informed by the PET
 Received OMB clearance (February 2011)
 Conducted survey, focus groups, and interviews (March - July 2011)

Survey Sample

Referral Source    Count
Total              9,389
AHRQ               9,298 (99%)
AHIP               42
AMA                49

Respondent Demographics
 Survey reach: majority from the U.S. (87%)
 Majority familiar with AHRQ (99%)
 Occupation mix:
  56% - providers, clinicians, nurses
  13% - researchers, librarians, or similar
  12% - consultants, managers, administrators
  19% - other

Section / Module           Count
Total                      9,389 (100%)
NGC Unaware                2,075 (22.1%)
NGC Aware                  7,314 (77.9%)
  Non NGC User             1,395 (19.3%)
  NGC User                 5,828 (80.7%)
    Guideline Developer    1,076 (18.5%)
    Health Provider        3,271 (56.1%)
    Medical Librarian      204 (3.5%)
    Informaticians         292 (5.0%)
    Researcher             1,219 (20.9%)
    Policymaker            1,219 (20.9%)
    Measure Developer      351 (6.0%)

Question modules: all respondents answered the NGC awareness and use, demographics, and other guideline source questions; NGC users also answered the NGC questions and the stakeholder-specific questions. (Stakeholder-group percentages are shares of NGC users; the groups are not mutually exclusive, so they sum to more than 100%.)
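The survey's skip-logic branching (everyone answers a core module; only respondents aware of NGC and using it reach the NGC and stakeholder-specific modules) can be sketched as a simple router. This is an illustrative sketch only; the field names (`aware_of_ngc`, `uses_ngc`, `stakeholder_groups`) are hypothetical and not taken from the actual instrument:

```python
def modules_for(respondent):
    """Route a survey respondent to question modules using skip logic.

    `respondent` is a dict of hypothetical answer fields; the module
    names mirror the survey-flow labels on this slide.
    """
    # Core module: answered by every respondent.
    modules = ["NGC awareness and use, demographics, other guideline sources"]
    if not respondent.get("aware_of_ngc"):
        return modules                      # NGC Unaware: core module only
    if not respondent.get("uses_ngc"):
        return modules                      # NGC Aware but Non NGC User
    modules.append("NGC questions")         # NGC User
    # Groups are not mutually exclusive; a respondent may get several.
    for group in respondent.get("stakeholder_groups", []):
        modules.append(f"Stakeholder-specific questions: {group}")
    return modules
```

A respondent who is aware of NGC, uses it, and self-identifies as both a health provider and a researcher would thus see the core module, the NGC questions, and two stakeholder-specific modules.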

Key Notes
 Majority of respondents came from the AHRQ invitation
 Awareness of NGC was greatest among AHRQ and AHIP respondents
 Awareness of NGC in 2011 was substantially higher than in the 2001 evaluation

AHRQ Opportunity to increase physician awareness of NGC

 Survey Findings
  Most respondents use multiple sources to find CPGs (3-5 most common)
   ▪ PubMed, government sources, general search engines, medical/professional societies
 Qualitative Findings
  NGC often cited as the "first go-to source"

Key Finding: Most NGC users are equally satisfied or more satisfied with NGC compared to other guideline sources.

Survey item: "Indicate the degree to which use of the NGC Web site fulfills your needs for:" (the specific needs were rated in a chart)

Key Finding: NGC is doing well on these needs.

Finding: NGC does better than, or about equal to, other sources. This is supported by the qualitative findings.

 Survey Findings
  75% said very good or good
  80% would definitely, very likely, or probably recommend NGC to a colleague
  No differences by key stakeholder group
 Qualitative Findings: differences by key stakeholder group
  ▪ Guideline developers and informaticians were less trusting
  ▪ Others noted that they "trust" the content because "it comes from AHRQ"

 Survey Findings
  63% said the criteria are appropriate
  By length of use: longer-term users were more likely to find the criteria too loose
  No differences by key stakeholder group
 Qualitative Findings: differences by key stakeholder group
  ▪ Guideline developers and informatics specialists said the criteria are "too loose"
  ▪ Policymakers, medical librarians, and researchers were generally satisfied

 Qualitative Findings (Guideline Developers and Informaticians):
  ▪ "…the quality of the criteria was good when NGC started out but it has gotten more complicated..."
  ▪ "…If the goal is to be 'all inclusive,' then the criteria are fine. [but]…there needs to be some other ways to separate the wheat from the chaff."
  ▪ "They could raise the bar."
  ▪ "I'd say too loose…. I think there's a belief that: A) NGC creates these guidelines…and then, B) there's also a belief that NGC somehow has a very rigorous process for only allowing certain guidelines in, or certain types of very high quality guidelines, or that it's an endorsement of these guidelines. And it isn't."

AHRQ Opportunity to revisit NGC's Inclusion Criteria

AHRQ Opportunity to revisit NGC's Age Criterion

 Survey Findings
  Roughly equal numbers said the 5-year criterion is "appropriate" or "too long"
   ▪ Reducing it to 3 years was the most common selection
  No differences by key stakeholder group
 Qualitative Findings
  Common theme: the age criterion is too long

NGC Influences

 Respondents, by group, indicated NGC greatly influenced:
  Guideline developers' ability to identify guidelines, and to develop and use quality measures
  Health providers' ongoing learning activities, clinical decision-making processes, and ability to identify guidelines
  Medical librarians' ability to meet their clients' needs
  Medical librarians' and researchers' ability to identify current and high-quality guidelines
  Measure developers' measure development activities and approach to identifying guidelines
  Policymakers' and purchasers' ability to identify guidelines and convert clinical information


 Survey Findings (n=199)
  NGC guideline submitters reported greater NGC influence on how often guidelines are updated and on how organizations document or report their guidelines
 Qualitative Findings (n=24)
  NGC serves primarily as a source for locating guidelines
  NGC's age criterion applies some "pressure" to stay current
  NGC has had little influence on how guideline developers do their work (e.g., methodology, reporting)

AHRQ Opportunity to increase knowledge among guideline developers about how to create and report trustworthy guidelines

Question: How would you rate NGC's dissemination of your organization's guidelines?

 Noteworthy finding
  65% said excellent or good
  21% were neutral
  14% said fair or poor

AHRQ Opportunity to identify additional efforts to enhance the dissemination of guidelines

 Enhancements
  Two-thirds said they would "definitely," "very likely," or "probably" use the following NGC enhancements if available:
   ▪ Ratings of guideline quality
   ▪ Subject-specific alerts
 Enhancements for providers
  72% said having NGC content at the point of care would be useful
  66% would take CME if available
 Enhancements for informaticians
  52% would use NGC Summaries as an XML file structured according to the Guideline Elements Model (GEM) or other similar model

Examples of Potential Enhancements
 Ratings of guideline quality
 Subject-specific alerts
 Access to archived guidelines
 Additional data download options and XML formats

AHRQ Opportunity to invest in major enhancements that will increase the value of NGC

Additional Common Themes

 Guideline Developers
  Commentary/responses to guidelines from users
  Guideline developer conferences / methodology workshops
 Informatics Specialists
  Assessment of the executability of guidelines
 Medical Librarians
  Integration with other Web sites (PubMed)
  Quicker access to "new" guidelines
 Researchers
  Assessment of attributes in the IOM standards for guidelines
  More information about treatment in multi-morbidity or comorbid conditions

 AHRQ Opportunities include:
  Build on NGC's user base, including with health providers
  Revisit the NGC inclusion criteria
  Revisit the guideline age criterion
  Increase knowledge among guideline developers about how to create and report trustworthy guidelines
  Enhance guideline dissemination efforts
  Invest in major enhancements to the NGC Web site that will provide significant added value