Clinical Quality Measure Workgroup
Jim Walker & Karen Kmetik, Co-Chairs
May 7, 2012, 4:30 pm – 5:30 pm



Agenda
4:30 p.m. Call to Order/Roll Call – MacKenzie Robertson, Office of the National Coordinator
4:35 p.m. Review of Agenda – Jim Walker, Chair
4:40 p.m. Review: CQM Essential Components Tiger Team Value Set Recommendations
5:10 p.m. Update: Characteristics of Optimal Clinical Quality Measures for Health IT Tiger Team
5:25 p.m. Public Comment
5:30 p.m. Adjourn

05/07/2012, Office of the National Coordinator for Health Information Technology

Essential Components Update
Developed recommendations regarding usable and useful value sets for MU Stage 2:
– Review draft recommendations with full WG; final poll, week of May 16
– Report to Standards Committee May 24

Scope of Recommendations
In scope: imperative value set infrastructure to support MU2
– Validation of vocabulary codes
– Internet hosting & delivery of value sets
– Content standard for serving value sets
– Transfer standard for serving value sets
Out of scope: longer-term infrastructure
– Discoverability
– Curation, including harmonization & maintenance of codes and verification of semantic validity
– Governance, content management, versioning

Recommendations
Recommendation 1.0: Establish NLM as the single authority for validating value sets used in Stage 2 quality measures. NLM should serve as the single source of truth for MU2 value sets and should publish periodic updates to reflect changes in the underlying vocabularies and/or changes made by value set stewards. ONC should coordinate with other agencies, value set stewards, and consensus organizations as needed for value set hosting and serving/delivery. NLM will cross-check the accuracy of Stage 2 Clinical Quality Measure value sets by comparing value set codes and descriptors against the appropriate source vocabularies, and will suggest edits to value set stewards to ensure the validity of vocabulary codes, names, and code system versions.

Recommendation 2.0: ONC should expedite the recommendations of the Implementation Workgroup (January 2012) and the Vocabulary Task Force (April 2010) related to establishing a publicly available value set repository.
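The cross-check described in Recommendation 1.0 can be illustrated with a short sketch. This is a hypothetical illustration, not NLM's actual process; the data shapes (`code`/`descriptor` dictionaries, a vocabulary lookup table) and the example codes are assumptions made for demonstration.

```python
# Hypothetical sketch of the Recommendation 1.0 cross-check: compare each
# value set entry's code and descriptor against a source vocabulary and
# report suggested edits for the value set steward.

def validate_value_set(value_set, source_vocabulary):
    """Return suggested edits.

    value_set: list of dicts with 'code' and 'descriptor' keys.
    source_vocabulary: dict mapping code -> preferred descriptor.
    """
    suggested_edits = []
    for entry in value_set:
        code = entry["code"]
        if code not in source_vocabulary:
            # Code does not exist (or is retired) in the source vocabulary.
            suggested_edits.append(
                {"code": code, "issue": "code not found in source vocabulary"})
        elif entry["descriptor"] != source_vocabulary[code]:
            # Code exists, but the descriptor drifted from the source.
            suggested_edits.append(
                {"code": code,
                 "issue": "descriptor mismatch",
                 "expected": source_vocabulary[code]})
    return suggested_edits


# Illustrative data: one outdated descriptor and one invalid code.
snomed_subset = {"44054006": "Diabetes mellitus type 2 (disorder)"}
value_set = [
    {"code": "44054006", "descriptor": "Type II diabetes"},
    {"code": "99999999", "descriptor": "Not a real concept"},
]
print(validate_value_set(value_set, snomed_subset))
```

In practice the validation would also cover code system names and versions, as the recommendation notes; this sketch only shows the code/descriptor comparison.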

Recommendation 3.0: The value set repository established by NLM should build upon the IHE Sharing Value Sets (SVS) profile for storing and serving value sets, and should incorporate Common Terminology Services 2 (CTS2) methods for managing the vocabularies referenced by value sets.

Recommendation 4.0: Establish a web service for human and machine consumption of MU2 value sets. Consider NLM, AHRQ, or CDC as the Internet host for validated value sets. Provide output in commonly used formats (e.g., tab-delimited, spreadsheet, or XML) suitable for import into SQL tables and for web service delivery. Support the creation of web-based views based on quality measure and value set names and numerical identifiers, QDM category, and the code systems and code system versions used.
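Recommendation 4.0's "commonly used formats" can be sketched concretely. The snippet below serializes one value set as tab-delimited text and as simple XML; the element names, field names, and the example identifier are illustrative assumptions, not the IHE SVS schema or a real published value set.

```python
# Sketch of serving one value set in two of the formats named in
# Recommendation 4.0: tab-delimited text and XML. Shapes are illustrative.
import csv
import io
import xml.etree.ElementTree as ET


def to_tab_delimited(value_set):
    """Serialize a value set as tab-delimited text with a header row."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="\t", lineterminator="\n")
    writer.writerow(["code", "codeSystem", "descriptor"])
    for concept in value_set["concepts"]:
        writer.writerow(
            [concept["code"], concept["codeSystem"], concept["descriptor"]])
    return buf.getvalue()


def to_xml(value_set):
    """Serialize a value set as XML (illustrative element names)."""
    root = ET.Element("ValueSet", id=value_set["oid"],
                      displayName=value_set["name"])
    concept_list = ET.SubElement(root, "ConceptList")
    for concept in value_set["concepts"]:
        ET.SubElement(concept_list, "Concept", code=concept["code"],
                      codeSystem=concept["codeSystem"],
                      displayName=concept["descriptor"])
    return ET.tostring(root, encoding="unicode")


example = {
    "oid": "2.16.840.1.113883.0.0.0",  # made-up identifier for illustration
    "name": "Diabetes (example)",
    "concepts": [{"code": "44054006", "codeSystem": "SNOMED CT",
                  "descriptor": "Diabetes mellitus type 2 (disorder)"}],
}
print(to_tab_delimited(example))
print(to_xml(example))
```

The tab-delimited form imports directly into SQL tables or spreadsheets, which is the use the recommendation calls out; the XML form is the kind of payload a machine-readable web service would return.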

MU2 Value Set Validation & Delivery
[Diagram: controlled and publicly available value sets pass through validation, then delivery as a human-readable web page and machine-readable web services.]

Swim Lanes
Quality Measure / Value Set Developer:
– Create value set
– Deliver value sets to NLM (non-endorsed measures)
– Receive feedback from NLM re: code validity
– Provide clarification as needed
– Incorporate edits into base value set
Consensus Org:
– Deliver value sets to NLM for endorsed measures
– Value set harmonization
NLM:
– Receive value sets
– Store value sets in a publicly available value set repository
– Provide feedback to developers re: code validity
– Request clarification from developers as needed
– Make validity edits to value sets
– Serve value sets in human- & machine-readable form

What a repository might look like
[Screenshots of a prototype value set repository interface.]

Characteristics of Optimal Clinical Quality Measures for Health IT: Update
The Characteristics of Optimal Clinical Quality Measures for Health IT Tiger Team will focus on identifying the attributes of optimal clinical quality measures that are created or "re-tooled" for use in health IT.

Tiger Team Scope
The characteristics of optimal clinical quality measures evaluated by this Tiger Team are viewed through a technical lens, not from the perspective of the importance of the quality measure per se. We are interested in applying this technical lens both to the measures we have and to those we seek (e.g., longitudinal, patient-reported, clinical outcomes).

Goals & Timeline
Identify the attributes of optimal clinical quality measures that are created or "re-tooled" for use in health IT, with an emphasis on "re-thinking" vs. "re-tooling."
– Draft report to Tiger Team May 9; distribution to full WG May 16
– Report to Standards Committee May 24

Tiger Team Scope
The characteristics of optimal clinical quality measures evaluated by this Tiger Team are viewed through a technical lens and a workflow lens, not from the perspective of the importance of the quality measure per se. We are interested in applying these lenses both to the measures we have and to those we seek (e.g., longitudinal, patient-reported, clinical outcomes).

What Makes an Optimal Quality Measure?
Usability
– Availability of data
– Reduces re-entry of data by reusing data where possible
Feasibility
– EHR feasible
– EHR enabled
Accuracy
– Data reported are captured and queried correctly
– Process has few errors
– Data are known to be accurate
– Assumptions are not made about the method of capture
Standard Terminology
– Reduces variations in interpretation
– Reduces workarounds and hard-coding of choices

Usability Definitions
Availability – The data may be available now or could be available with reasonable workflow changes.
Redundancy – Data capture should reduce re-entry unless entering the data again provides value, such as in clinical decision support, care coordination, or a verification process.

Feasibility Definitions
EHR Feasibility – Functionality to support the quality measure exists in most EHRs, or could exist within reason for stretch quality measures (data accessible).
EHR Enabled – The quality measure is enabled by data being in electronic format; these items are difficult to measure on paper or in other non-electronic formats.

Accuracy Definitions
Accuracy – For clinical quality measures to be optimal, they need to be accurate, and accuracy has four parts:
– Data are captured correctly and queried correctly (clear, detailed specifications)
– The collection process has few errors and does not require re-entry of data unless re-entry provides value (e.g., verification, care coordination, clinical decision support)
– The data themselves are known to be accurate, irrespective of the capture mechanism
– Assumptions are not made about how collection happens; instead, guidance is provided

Standard Terminology Definitions
Standard Terminology Usage (shared meaning) – Data needed for quality measures should be captured using standard terminology to reduce variations in interpretation and to reduce hard-coding of choices and workarounds.
– We want confidence that practice A/EHR A and practice B/EHR B are using the same terminology for data elements.
– The data should be easy to aggregate because they use common standards as dictionaries. For example, everyone uses the same value set to identify the population of patients with diabetes for a particular measure.
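The shared-value-set idea above can be made concrete: when every site defines the diabetes population from the same value set, cohort identification reduces to a set-membership test over coded problem-list entries. A minimal sketch, with an abbreviated made-up code subset and fictional patients (none of this is a real measure specification):

```python
# Minimal sketch: identifying a measure population by value set membership.
# The codes and patients below are illustrative, not a published value set.

DIABETES_VALUE_SET = {"44054006", "46635009"}  # abbreviated code subset

patients = [
    {"id": "A1", "problem_codes": {"44054006", "38341003"}},  # diabetes + HTN
    {"id": "B2", "problem_codes": {"38341003"}},              # HTN only
]


def in_population(patient, value_set):
    """True if any problem-list code falls inside the measure's value set."""
    return bool(patient["problem_codes"] & value_set)


cohort = [p["id"] for p in patients if in_population(p, DIABETES_VALUE_SET)]
print(cohort)  # → ['A1']
```

Because both "practices" consult the same value set, their cohorts are directly comparable and their data aggregate cleanly, which is exactly the shared-meaning property the slide describes.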

Applying These Criteria to Different Types of Measures
– Process
– Clinical outcome
– Patient-reported outcome
– Change over time (delta)
– Interpreting results from a study (e.g., feasibility testing at practice sites, data sets, surveys)

Discussion