or service, the end user is at the center of the design process, which results in a product that accommodates users, rather than forcing them to adapt to the system.

The research team visited 11 EHR vendors of varying size. The smallest participating vendor had about 10 employees, while three employed more than 6,000 people and generated revenues of more than $1 billion each. The research team conducted interviews with each vendor to ask about its UCD processes.

Interviewers found the vendors fell into one of three categories of UCD refinement:


• Well-developed UCD: Vendors have a refined UCD process and an extensive staff devoted to usability;


• Basic UCD: Vendors understand the importance of UCD and are working toward UCD processes but face resource constraints and employ few usability experts;


• Misconceptions of UCD: Vendors have no UCD process in place, generally misunderstand the concept of UCD, and have no usability experts on staff.


“What we were finding … is that many of these vendors didn’t really know about how to design usable software,” Dr. Walji said. “Oftentimes, they were designed by computer people. Sometimes they would get clinician feedback on it, but … that’s not really enough. You really need to have usability designers in the mix, as well. That’s why we said it’s really a shared responsibility. We need to get all these stakeholders in place.”

Midway through the project, new ONC requirements for vendor certification gave researchers hope for better development and testing on the vendor end. Federal rule revisions the Department of Health and Human Services passed in 2012 now require EHR vendors to perform a process known as safety-enhanced design to obtain certification.


Under the current requirements, vendors must use a formal UCD process during development of their EHR systems and perform usability testing with “real” users, or physicians who would use the system in a real-world setting. On top of that, vendors must make a report of their testing publicly available to earn certification.

Dr. Walji says the requirement for vendors to publicize their results was a positive sign for the SHARPC researchers. But once vendors started posting their reports, he said, it turned out many of the reports “weren’t particularly useful, in the sense that they didn’t really disclose the information they were supposed to. And the certification bodies seemed to let them get away with that.”

“Some of the vendors did [the reporting] very well, and some of them had a very cursory statement that they had actually conducted this testing, and that was it,” Dr. Walji said. “So I’d suggest that they probably didn’t want to disclose that information and tried to do the absolute minimum without being forced to.”


USABILITY SEVERITY RANKINGS


SHARPC researchers who performed the rapid usability assessment ranked the usability problems they found in each electronic health record system on a severity scale from one to four. Here are examples of each type of violation:


• One (cosmetic issue): minor typographical or spelling error.
• Two (minor violation): cluttered user interface screen that presents too much information at once.


• Three (major violation): inappropriate use of autofill functionality or defaults, such as when the system automatically enters corresponding parameters once the user enters drug information.


• Four (catastrophic violation): a display box that isn’t wide enough to show the full name of a prescription medication, causing potential confusion over drugs with similar names.




Houston internist Hardeep Singh, MD, briefly participated in one of the three other SHARPC subprojects outside of the EHR usability and workflow research, but became familiar with the work researchers were doing on the EHR effort. He says the ONC requirements for safety-enhanced design may need to become more stringent and hold everyone to a better standard for testing.

Dr. Singh notes that vendors mentioned they had a hard time recruiting physicians to help them perform the required testing. He says he could vouch for that problem and wonders how vendors could make the testing worth the doctors’ time.

“Physicians are stretched from every direction,” Dr. Singh said. “Now, if you want physicians to contribute to usability testing or user-centered design principles, we would need to somehow incentivize or pay them.”

