Usability and Human Factors

Presentation on theme: "Usability and Human Factors"— Presentation transcript:

1 Usability and Human Factors
Electronic Health Records and Usability. Welcome to Usability and Human Factors: Electronic Health Records and Usability. This is Lecture a. In this unit we will apply principles of usability and design to critiquing EHR systems and to making recommendations for iterative improvement. This material (Comp 15 Unit 6) was developed by Columbia University, funded by the Department of Health and Human Services, Office of the National Coordinator for Health Information Technology under Award Number 1U24OC. This material was updated by The University of Texas Health Science Center at Houston under Award Number 90WT0006. This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. Health IT Workforce Curriculum Version 4.0

2 Electronic Health Records and Usability Lecture a – Learning Objectives
Define usability as it pertains to the EHR (Lecture a). Explain the challenges of EHR design and usability in typical workflow (Lecture a). By the end of this unit, students will be able to: 1. Describe and define usability as it pertains to the EHR. 2. Explain the challenges of EHR design and usability in typical workflow.

3 Why? 3 reports (AHRQ, HIMSS, NRC) in 2010
Strong, often direct relationship with clinical productivity, error rate, user fatigue, and user satisfaction, effectiveness, and efficiency. In 2010, three major organizations published reports on usability: the Agency for Healthcare Research and Quality (AHRQ), the National Research Council (NRC), and the Healthcare Information and Management Systems Society (HIMSS). The NRC report came first; after a two-year study in which experts travelled around the country looking at some of the institutions with the best healthcare IT, it came to this conclusion: while computing science has adequately met the needs of back-end systems, what is needed is better front-end development that provides cognitive support to clinicians. Usability is a critical part of the user experience. The other reports also focused on usability, pointing out its influence on errors (which in medicine can be fatal), user satisfaction, and productivity.

4 Usability and EHR Certification
2014: Certification criteria revised to include safety-enhanced design. 2015: US DHHS recommends following NIST and ISO standards for user-centered design and usability evaluation. NISTIR 7804, “Technical Evaluation, Testing, and Validation of the Usability of Electronic Health Records,” provided examples of methods that could be employed for UCD, including ISO , ISO 13407, ISO 16982, ISO/IEC 62366, ISO and NISTIR 7741. In 2014, certification criteria were revised to include safety-enhanced design. In 2015, the United States Department of Health and Human Services, in the final rule published in the Federal Register on October 16, 2015, recommended following specific standards and guidelines for testing and evaluation methods published by the National Institute of Standards and Technology (NIST) and by the International Organization for Standardization (ISO).

5 Why? (Cont’d – 1) Lack of usability and accessibility will result in:
Lack of trust. Potential abuse. Lessons from electronic voting: no election has been proven to have been hacked; however, usability has altered the outcome of elections (Tognazzini, 2001). The user’s view of a system is conditioned by the interface experience. Users form their impression of software from their experience above all; poor experiences can lead to profound dissatisfaction (including refusal to use the system), abuse, dangerous workarounds, and other serious consequences. For example, we have seen the results of poor usability affect the outcome of elections.

6 HIMSS Usability Criteria (2009)
1. Simplicity 2. Naturalness 3. Consistency 4. Minimizing cognitive load 5. Efficient interactions 6. Forgiveness 7. Feedback 8. Effective use of language 9. Effective information presentation 10. Preservation of context. In 2009, HIMSS published ten aspects of usability: simplicity, naturalness, consistency, minimizing cognitive load, efficient interactions, forgiveness, feedback, effective use of language, effective information presentation, and preservation of context. HIMSS EHR Usability Task Force (2009).

7 National Center for Cognitive Informatics General Design Principles for EHRs
Consistency and standards Visibility of system state Match between system and world Minimalist design Minimize memory load Informative feedback Flexibility and efficiency Good error messages Prevent errors Clear closure Reversible actions Use the user’s language Users in control Help and documentation Current research by the National Center for Cognitive Informatics and Decision Making in Healthcare (the NCCD) has found that a great user interface follows established human interface design principles that are based on the way users (doctors, nurses, patients, etc.) think and work. There are 14 general design principles that can be applied to the development of EHRs; they are an expansion and elaboration of Nielsen’s 10 principles, which are discussed in other units in this component. We will use these guidelines in the remainder of this unit.

8 The State of the Art Egregiously bad design exists in current EHRs
Complicated by: Vendor contracts forbidding customers to talk about their experience. Inability to publish (e.g., screenshots) can hinder scholarly research. Let’s look at some reasons why EHR usability is not yet optimal. Vendor contracts may forbid customers (even customers of the same EHR) from discussing their experiences. Publication of screenshots and other information may be forbidden by copyright; this hinders research.

9 State of the Art (Cont’d – 1)
AHRQ Report on Vendor Practices (2010). We’re not there yet. Many legacy systems >10 years old (AHRQ, 2010). Standards are borrowed. Best practices not defined. Expectations unclear. Communication is limited. Formal usability testing rare. Usability is perceived to be overly subjective. The AHRQ report found that many legacy systems currently in use are more than 10 years old, and implementation plans can take decades. Best practices have not yet been defined, though AHRQ and other associations are working on this. Expectations are unclear, communication is limited, and many vendors do not do formal usability testing, or do it only to a limited extent. Because of the lack of formal standards and training, usability may be perceived to be overly subjective and therefore difficult to measure. As we will show later, this is not the case.

10 AHRQ: Report on Vendor Practices
But…. Users involved in EHR design/review Vendors compete on usability [is this something that should be competed on? Not a basic requirement?] Users demand better products Plans for formal usability testing increasing Vendors willing to collaborate However, the increased interest and focus on this problem means that there is increasing involvement of users in design. The AHRQ report on vendor practices found that vendors attempt to compete on usability, users demand better products, and plans for formal usability testing are increasing. Vendors also say they are amenable to changing design if given guidelines.

11 State of the Arguments
Some feel clinicians have ‘given up’ due to the difficulty of getting things changed (learned helplessness). Political and power struggles: administration vs. staff, vendor vs. users… Lack of clinician input at design, or too limited clinician input at all phases, from design to rollout. Some users and researchers are discouraged at the extremely poor usability of some systems, which has led to errors (including fatal errors). Political and power struggles can ensue in implementations, as the introduction of technology can change power relationships as well as radically alter workflow and work practices. Lack of appropriate clinician input at design has sometimes resulted in systems that are at best difficult to use and at worst dangerous.

12 AHRQ: Report on Vendor Practices (Cont’d – 1)
“The field is competitive so there is little sharing of best practices to the community. The industry should not look towards vendors to create these best practices. Other entities must step up and define [them] and let the industry adapt.” “Products are picked on the amount of things they do, not how well they do them.” “There are no standards most of the time, and when there are standards, there is no enforcement of them. The software industry has plenty of guidelines and good best practices, but in HIT, there are none.” These are representative quotes from the AHRQ report on vendor practices.

13 Vendor Testing A review of 41 vendor reports in 2015 found:
“A lack of adherence to ONC certification requirements and usability testing standards among several widely used EHR products...” Only 22% had used at least the minimum number of participants (15) with clinical backgrounds (Ratwani, 2015). A study published in 2015 by Ratwani and colleagues found a lack of adherence to ONC certification requirements. For example, only 22% of the vendor reports had used at least 15 participants with clinical backgrounds for usability tests. Ratwani and colleagues stated, “The lack of adherence to usability testing may be a major factor contributing to the poor usability experienced by clinicians. Enforcement of existing standards, specific usability guidelines, and greater scrutiny of vendor UCD processes may be necessary to achieve the functional and safety goals for the next generation of EHRs.”

14 The Bad and the Ugly Examples of egregious usability problems (Silverstein, 2009). Related data far apart, requiring the user to click multiple times (‘clickorrhea’), e.g., diastolic blood pressure four screens from systolic. Diagnosis (Dx) lists with rare Dx at top, common at bottom; hidden terms lead to incorrect selection. Let’s look at some examples of egregious usability problems, prepared by Scot Silverstein, an informatician who had to prepare mock screenshots (based on real systems) because of copyright restrictions. His website contains more examples. One basic example is the placement of related data far apart, such as one real system that required the user to find the different components of blood pressure (systolic and diastolic) four screens apart. Another example is diagnosis lists that make rare diagnoses more easily clickable than common ones.

15 What is Wrong With This Picture?
Take a look at this mock screenshot. What do you see that is suboptimal or could lead to error? Silverstein, S. (2009).

16 What is Wrong With This Picture? (Cont’d – 1)
Instead of simply highlighting information that should trigger an alert, the system states that there are no indicator flags. Note the warning that there are no warnings about abnormal results: “There are no indicator flags.” Silverstein, S. (2009).

17 What is Wrong With This Picture? (Cont’d – 2)
The results section says that the result is negative and that the result is final. Most busy clinicians would likely stop reading here. The results section says “negative” and “results final”; most busy clinicians’ eyes would stop there, especially in the wee hours. Silverstein, S. (2009).

18 What is Wrong With This Picture? (Cont’d – 3)
Then, there is an addendum saying that the culture is actually positive for MRSA, a dangerous infection that often spreads in hospitals. Addendum to the report that the result is actually positive for MRSA, a drug-resistant infection No flag on that addition, yet during data entry at the lab, a flag was requested and seen by the reporting technician Silverstein, S. (2009).

19 What’s Wrong? Clinician forced to hunt around every result for indications of normalcy or abnormality. Disparity between what is seen at the lab (abnormal addendum is flagged) and the clinician’s view. Not fictitious: treatment was delayed more than 24 hours until someone later noticed the addendum. The system is CCHIT certified. On paper, the wrong item could have been crossed out. This sort of bad design has several consequences. It forces clinicians to search every result for indications of normalcy or danger. It creates a disparity with the lab system, which normally flags abnormal results; this can lead to miscommunication between personnel. This is a real case in which the patient was not treated for a dangerous infection for more than 24 hours. The system is CCHIT certified despite the bad design. The example also shows one of the changes from paper to computer: in a paper system, the erroneous first result could have been crossed out, preventing the mistake.
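The underlying design flaw can be sketched in a few lines of hypothetical code (the data structure and function names here are invented for illustration, not taken from any real EHR): the abnormality indicator should be derived from the result data itself, including any addenda, rather than stored as a separate field that can silently be absent.

```python
# Hypothetical sketch, not a real EHR's design: compute the
# abnormality flag from the result and ALL its addenda, so a
# positive addendum can never be displayed unflagged.

def is_abnormal(result):
    """Abnormal if the original value or any addendum is not 'negative'."""
    values = [result["value"]] + result.get("addenda", [])
    return any(v.lower() != "negative" for v in values)

def render(result):
    flag = " ** ABNORMAL **" if is_abnormal(result) else ""
    lines = [f"{result['test']}: {result['value']}{flag}"]
    for addendum in result.get("addenda", []):
        lines.append(f"  Addendum: {addendum}{flag}")
    return "\n".join(lines)

# The MRSA case from the slides: "negative" result, positive addendum.
culture = {"test": "Wound culture", "value": "negative",
           "addenda": ["positive for MRSA"]}
print(render(culture))
```

Because the flag is recomputed on every render, the report in the slides would have shown an abnormality marker the moment the MRSA addendum was entered.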

20 More of the Bad Convenient for programmer, not for the doctor
Alphabetical problem list: convenient for the programmer, not for the doctor. It should prioritize by importance. This slide shows an alphabetized problem list from a real system. It does not meet the needs of clinicians, who would want to see the problems in order of severity or importance. Silverstein, S. (2009).
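The principle can be sketched as follows; the severity ranks and problem names below are invented purely for illustration (a real system would derive priority from clinical knowledge, not a hard-coded table):

```python
# Hypothetical sketch: order a problem list by clinical priority
# rather than alphabetically. Severity scores are invented here.
SEVERITY = {
    "sepsis": 1,              # most urgent first
    "hypertension": 2,
    "seasonal allergies": 3,
}

def prioritized(problems):
    # Unknown problems sort after known ones (rank 99);
    # alphabetical order is kept only as a tiebreaker.
    return sorted(problems, key=lambda p: (SEVERITY.get(p, 99), p))

problem_list = ["seasonal allergies", "sepsis", "hypertension"]
print(prioritized(problem_list))
# ['sepsis', 'hypertension', 'seasonal allergies']
```

The sort key is the point: alphabetical order (what the programmer gets for free) is demoted to a tiebreaker, while clinical importance drives the display order.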

21 More of the Bad (Cont’d – 1)
List auto-populated by system Not editable by clinician Patient does not have atrial fibrillation (entered by nurse to speed order) Requires vendor to remove The list is created by the system automatically, and the clinician does not have the ability to edit or delete entries. The entries can be incorrect because many people put information into the system, and may make selections for convenience, such as the nurse who entered the atrial fibrillation diagnosis to speed up the order fulfillment. Unbelievably, the wrong entry can only be removed by the vendor. Silverstein, S. (2009).

22 More of the Bad (Cont’d – 2)
Multiple diabetes entries (incorrect) Lack of controlled terminology mapping Useless information: ‘Medication use, long term’ Clutters screen Thus the multiple diabetes diagnoses, only one of which is accurate. Lack of controlled terminology makes term management difficult. The list also includes useless information such as the 'medication use, long term' item. Silverstein, S. (2009).
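A controlled terminology layer prevents such duplicates by mapping free-text variants to a single canonical concept. The synonym table and concept label below are hypothetical, purely to illustrate the idea:

```python
# Hypothetical sketch of controlled terminology mapping: several
# free-text spellings of the same diagnosis collapse to one
# canonical concept, so the problem list shows a single entry.
SYNONYMS = {
    "diabetes": "E11: Type 2 diabetes mellitus",
    "diabetes mellitus": "E11: Type 2 diabetes mellitus",
    "type 2 diabetes": "E11: Type 2 diabetes mellitus",
    "dm2": "E11: Type 2 diabetes mellitus",
}

def normalize(entries):
    """Map entries to canonical concepts, dropping duplicates in order."""
    seen, out = set(), []
    for entry in entries:
        concept = SYNONYMS.get(entry.strip().lower(), entry)
        if concept not in seen:
            seen.add(concept)
            out.append(concept)
    return out

raw = ["Diabetes", "diabetes mellitus", "Type 2 Diabetes"]
print(normalize(raw))  # one canonical entry instead of three
```

Without such a mapping, each spelling is a distinct string to the system, which is exactly how the multiple diabetes entries in the slide arose.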

23 More of the Bad (Cont’d – 3)
This screen shows excessive density, complexity, lack of organization or marking that could make it easier to read, extraneous information, and general clutter. Repetition, extraneous information, lack of focus and clarity. Lack of any symbolic or diagrammatic representations, and general clutter. Silverstein, S. (2009).

24 More of the Bad (Cont’d – 4)
This is a grid that the user must scroll to see some of the information. However, when the user scrolls... Silverstein, S. (2009).

25 More of the Bad (Cont’d – 5)
...the row and column headers that indicate what information belongs to each column disappear. Thus the user must keep track either mentally or (more likely) by placing a finger on the screen; otherwise it would be easy to lose track of columns or misread information, potentially causing errors. Forces the user to keep track of columns with a finger on the screen. Easy to confuse columns. Silverstein, S. (2009).

26 More of the Bad (Cont’d – 6)
This screen has excessive repetitious information that is not needed and is distracting, such as including the units in every cell instead of in the header row. There is a lack of focus and clarity; lab panel components are scattered. Units (e.g., mg/dL) in every line: repetitious, distracting. Lab panel components scattered. Silverstein, S. (2009).

27 Electronic Health Records and Usability Summary – Lecture a
State of the Art. AHRQ reports on vendor practices. Examples of how wrong data are input into EHR systems. This concludes Lecture a of Usability and Human Factors, Electronic Health Records and Usability. In this unit we examined vendor practice reports by the Agency for Healthcare Research and Quality, which provided key rules and roles for vendors. In addition, this lecture provided examples of how wrong data can be input into EHR systems (error). In the next lecture we will continue by discussing usability concepts.

28 Electronic Health Records and Usability References – Lecture a
2014 Edition Release 2 Electronic Health Record (EHR) Certification Criteria and the ONC HIT Certification Program; Regulatory Flexibilities, Improvements, and Enhanced Health Information Exchange. 79 Federal Register (September 11, 2014). 45 CFR Part 170. Final Rule. Retrieved on June 27, 2016.
2015 Edition Health Information Technology (Health IT) Certification Criteria, 2015 Edition Base Electronic Health Record (EHR) Definition, and ONC Health IT Certification Program Modifications. 80 Federal Register (October 16, 2015). 45 CFR 170. Final Rule. Retrieved on June 27, 2016. “Safety-enhanced design. User-centered design processes must be applied to each capability an EHR technology includes that is specified in the following certification criteria: § (a)(1), (2), (6) through (8), (16) and (18) through (20) and (b)(3), (4), and (9).”
HIMSS EHR Usability Task Force (2009). Defining and testing EMR usability: principles and proposed methods of EMR usability evaluation and ratings. Retrieved on September 4, 2011. (Link updated June 27, 2016.)
McDonnell C, Werner K, Wendel L. Electronic Health Record Usability: Vendor Practices and Perspectives. AHRQ Publication No. 09(10) EF. Rockville, MD: Agency for Healthcare Research and Quality. May 2010.
National Center for Cognitive Informatics & Decision Making in Healthcare. (n.d.). General Design Principles for EHRs. Retrieved June 27, 2016.
Ratwani, R. M., Benda, N. C., Hettinger, A. Z., & Fairbanks, R. J. (2015). Electronic health record vendor adherence to usability certification requirements and testing standards. JAMA, 314(10).
No audio.

29 Electronic Health Records and Usability References – Lecture a
Silverstein, S. (2009). Are Health IT Designers, Testers and Purchasers Trying to Harm Patients? Part 2 of a Series. Healthcare Renewal Blog, Sunday, February 22. Retrieved on August 11, 2010.
Tognazzini, B. (2001). The Butterfly Ballot: Anatomy of a Disaster.
No audio.

30 Electronic Health Records and Usability References – Lecture a (Cont’d – 1)
Images: Slides 15, 16, 17, 18, 20, 21, 22, 23, 24, 25, and 26: Silverstein, S. (2009). Are Health IT Designers, Testers and Purchasers Trying to Harm Patients? Part 2 of a Series. Healthcare Renewal Blog, Sunday, February 22. Retrieved on August 11, 2010.
No audio.

31 Usability and Human Factors Electronic Health Records and Usability Lecture a
This material was developed by Columbia University, funded by the Department of Health and Human Services, Office of the National Coordinator for Health Information Technology under Award Number 1U24OC This material was updated by The University of Texas Health Science Center at Houston under Award Number 90WT0006. No Audio. Health IT Workforce Curriculum Version 4.0
