
Healthcare IT Job Satisfaction – Fun Friday

Posted on February 23, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

It’s Friday and so as we head into the weekend it’s time for some fun. This is especially needed with HIMSS only 10 days away. This first cartoon hits on the impact of technology on our health, but also on the impact of EHR and technology on doctors. Especially healthcare IT software with really bad UIs. You know what I’m talking about.

And this one for my coffee-loving friends:

Are Improved EMR UI Designs On The Way? I Doubt It

Posted on December 4, 2017 | Written By

Anne Zieger is a veteran healthcare consultant and analyst with 20 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. Contact her at @ziegerhealth on Twitter or visit her site at Zieger Healthcare.

More or less since EMRs were first deployed, providers have been complaining about the poor quality of the interface they’ve had to use.  Quite reasonably, clinicians complained that these interfaces weren’t intuitive, required countless extra keystrokes and forced their work processes into new and uncomfortable patterns.

Despite many years of back and forth, EMR vendors don’t seem to be doing much better. But if a new story appearing in Modern Healthcare is to be believed, vendors are at least trying harder. (Better late than never, I suppose.)

For example, the story notes, designers at Allscripts create a storyboard to test new user interface designs on providers before they actually develop the coded UI. They use the storyboard to figure out where features should sit on a given screen.

According to the magazine, designers at several other EMR vendors have begun going through similar processes. “They are consulting with and observing users inside and outside of their natural work environments to build EHRs for efficient – and pleasant – workflows, layouts and functionality,” the magazine reports.

Reporter Rachel Arndt says that major EHR vendors now rely on a mix of approaches, such as formal user testing and the collection of informal feedback from end-users, to make their products more usable for clinicians. In some cases, this has evolved into official UI design partnerships between EHR vendors and customers, the story says.

Okay. I get it. We’re supposed to believe that vendors have finally gotten their heads together and are working to make end-users of their products happier and more productive. But given the negative feedback I still get from clinicians, I find myself feeling rather skeptical that the EHR vendors have suddenly gotten religion where UI design is concerned.

For what it’s worth, I have no doubt that Ms. Arndt accurately reported what the vendors told her. If any of us were to ask vendors whether they are partnering with customers – especially end-users – to make their products more intuitive to work with, they would swear on a stack of user manuals that they’re improving usability every day.

Until I hear otherwise, though, I’m not going to assume that conditions have changed much out there where EHR usability is concerned. Today, all the feedback I get suggests that EHRs are still being designed to meet the needs of senior management within provider organizations, not the doctors and nurses who have to use them every day.

Of course, I hope I’m wrong, and that the story is accurate in ways that offer some hope to clinicians. But for now, color me very doubtful that EMR vendors are making any earth-shattering UI improvements at present.

Improving the EHR Interface and Topol Saves Patient’s Life on Flight Home

Posted on March 5, 2013 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

As I thought through my day at HIMSS, a theme started to emerge from the dozens of meetings I’ve already had at the show (with many more still to come). The theme was ways to improve the EHR interface. This is a much-needed change in EHRs, so it was interesting to see a whole series of companies working on ways to make the EHR interface better. Here are some of the highlights from companies I talked to at HIMSS.

SwiftKey – While the SwiftKey product can be used in the consumer space as well, it was interesting to see the technology applied to healthcare. SwiftKey is basically a replacement for your mobile device keyboard. In fact, I’d call SwiftKey a smart keyboard for your mobile device. What does it do to make your mobile device keyboard smart?

First, it offers word suggestions you can easily choose as you start to type. Most people are familiar with this base functionality because it exists in some form in most mobile keyboards (or at least it does on my Android). However, SwiftKey has taken it a couple of steps further. It actually uses the context of what you’ve typed to predict what word you may want to type next. For example, if you type “nausea and”, it predicts that you’ll want to type vomiting. If you type “urinary”, it will predict tract and then infection. Plus, they told me their algorithm will also learn your own colloquial habits. It reminds me a bit of Dragon voice recognition, which learns your voice over time. SwiftKey learns your language habits over time.
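
For the technically curious, here’s a minimal sketch of how contextual next-word prediction like this can work. To be clear, this is not SwiftKey’s actual algorithm; it’s just an illustrative bigram model trained on a few made-up clinical phrases.

```python
from collections import Counter, defaultdict

# Illustrative training phrases (hypothetical data, not SwiftKey's).
CLINICAL_PHRASES = [
    "nausea and vomiting",
    "urinary tract infection",
    "shortness of breath",
]

def build_bigram_model(phrases):
    """Count which word tends to follow each word across the phrases."""
    model = defaultdict(Counter)
    for phrase in phrases:
        words = phrase.lower().split()
        for current_word, following_word in zip(words, words[1:]):
            model[current_word][following_word] += 1
    return model

def suggest_next(model, typed_text, top_n=3):
    """Suggest likely next words based on the last word typed so far."""
    last_word = typed_text.lower().split()[-1]
    return [word for word, _ in model[last_word].most_common(top_n)]

model = build_bigram_model(CLINICAL_PHRASES)
print(suggest_next(model, "nausea and"))  # ['vomiting']
print(suggest_next(model, "urinary"))     # ['tract']
```

A production keyboard obviously goes far beyond bigrams, but updating counts like these as you type is the same basic idea behind a keyboard learning your personal language habits over time.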

I’m sure some of these predictions could occasionally be hilarious, but it’s an interesting next step for the virtual keyboards we have on mobile devices. I’ll be interested to hear from doctors about what they think of the SwiftKey keyboard when it’s integrated with the various EHR iPad apps.

M*Modal and Intermountain – Thinking back on the demos and products I’ve seen at HIMSS 2013, I think the app M*Modal has created for Intermountain might be the coolest I’ve seen so far. In this app, a doctor speaks a prescription order, and the M*Modal technology applies voice recognition and then parses the words into the appropriate CPOE order fields. It was pretty impressive to see it in action. Plus, the difference in time between speaking the order and manually selecting the various order fields on the mobile device was incredible.

I was a little disappointed it was only a demo system, but it sounds like Intermountain is still doing some work on their end to make the CPOE happen. I’m also quite interested to see whether a simple mobile app like this will see broad adoption or whether more features will need to be added to get there. However, it was almost like magic to see it take a recorded voice and convert it into 5-7 fields on the screen. I’d be interested to see the accuracy of the implementation across a large set of doctors, but the possibilities are quite interesting for transforming the CPOE interface.
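
To make that parsing step concrete, here’s a hedged sketch of how a transcribed order might be split into structured CPOE fields. This is not M*Modal’s implementation; the order format, field names, and regular expression are all assumptions made up for illustration.

```python
import re

# Illustrative pattern for a spoken medication order (hypothetical format):
# "<drug> <dose> <unit> <route> <frequency>", e.g. "lisinopril 10 mg oral daily".
ORDER_PATTERN = re.compile(
    r"(?P<drug>[a-z]+)\s+"
    r"(?P<dose>\d+)\s*(?P<unit>mg|mcg|g)\s+"
    r"(?P<route>oral|iv|im)\s+"
    r"(?P<frequency>daily|twice daily|every \d+ hours)",
    re.IGNORECASE,
)

def parse_order(transcript):
    """Split a transcribed order into structured CPOE fields, or return None."""
    match = ORDER_PATTERN.search(transcript)
    return match.groupdict() if match else None

print(parse_order("lisinopril 10 mg oral daily"))
# {'drug': 'lisinopril', 'dose': '10', 'unit': 'mg', 'route': 'oral', 'frequency': 'daily'}
```

A real system would of course need drug dictionaries, ambiguity handling, and clinician confirmation before anything gets signed, which is presumably part of what Intermountain is still working through.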

Cerner Mobile – One of the new Cerner ambulatory EHR features is an iPad interface for the doctor. I’m sure many will think this is old news since so many other iPad EHR interfaces are out there. In some ways it is, but there was a slickness to their app that I hadn’t seen in many other places. In fact, the demo of their ambulatory EHR iPad app reminded me a lot of the features I saw in this video Jonathan Dreyer from Nuance created (bottom video) that demonstrated some of the mobile voice capabilities. Plus, the app had a nice workflow and some crazy simple features, like doing a refill. One swipe and the med was refilled. Almost makes it too easy.

Canon – This is a little different from some of the other EHR interface items I talk about above. In the case of Canon, it was interesting to see the tight integration that’s possible between Canon scanners and EHR software. Instead of the often laborious process of scanning to your EHR and then assigning the document to a patient, Canon has a scan-direct-to-EMR option, including analysis of the cover sheet so the scanned document is attached to the right patient and EHR chart location. While we’d all love to have paper gone, it will be a part of healthcare for the foreseeable future. The scan-direct-to-EMR capability is a pretty awesome feature.
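
As a rough illustration of the cover sheet idea, here’s a minimal sketch: assume the scan (or OCR on the cover sheet) yields a patient identifier and a chart section, and the document is filed accordingly. None of this reflects Canon’s actual integration; the data structures and function names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class CoverSheet:
    """Fields we assume can be read off a scanned cover sheet (illustrative only)."""
    patient_id: str
    chart_section: str  # e.g. "labs", "referrals", "consents"

def file_scanned_document(cover: CoverSheet, pdf_bytes: bytes, chart_store: dict) -> None:
    """Attach the scanned document to the right patient and chart section."""
    patient_chart = chart_store.setdefault(cover.patient_id, {})
    patient_chart.setdefault(cover.chart_section, []).append(pdf_bytes)

# Usage: a simple in-memory stand-in for the EHR's document store.
charts = {}
file_scanned_document(CoverSheet("MRN-0001", "labs"), b"%PDF-...", charts)
print(list(charts["MRN-0001"].keys()))  # ['labs']
```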

Those are a number of the EHR interface things that I’ve seen so far at HIMSS. I’m sure there are dozens of others out there as well. I think this is a great trend. Sure, each of these things is only a small incremental change, but with hundreds of EHR vendors all doing small incremental changes we’re going to see great things. That’s good, because many of the current EHR interfaces are terribly unusable.

On a related topic, Eric Topol gave a keynote address at HIMSS today. He got glowing reviews from what I could tell. What’s an even more powerful story, though, is seeing the message he shared at HIMSS in action. On Topol’s flight home to San Diego, a passenger was having a medical issue. He did an ECG right on the plane using his smartphone, and the passenger was able to make it safely to the destination. You can read the full story here. What’s even more amazing is that this is the second time something like this has happened to Topol. This probably means he flies too much, but it’s also an incredible illustration of mHealth technology at work. Truly amazing!

Full Disclosure: Cerner and Canon are advertisers on this site.

Are “User”- And “Process”-Centered EMR Design On A Collision Course?

Posted on April 3, 2012 | Written By

Anne Zieger is a veteran healthcare consultant and analyst with 20 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. Contact her at @ziegerhealth on Twitter or visit her site at Zieger Healthcare.

Most of the critiques I read of EMR design ding the EMR for being difficult to use or for its inability to accommodate the workflow of the institution that bought it — and of course, sometimes both. What I’ve never heard suggested, however, is the following idea proposed by Chuck Webster, a guy who clearly doesn’t stop short when he decides to study something. (He’s an MD, an MSIE and an MSIS in intelligent systems design, which is only one of the reasons I think he’s onto something here.)

In a thoughtful and nuanced blog entry, Dr. Webster outlines the work of a pioneer in usability design, Donald Norman, and comes away with the conclusion that the current trend toward “human-centered design” might actually be a mistake.  What a pain — health IT limps along catching  up with a trend from the 1980s, and now may be too late to catch the bus.

In any event, Dr. Webster argues that instead of focusing on human/user-centered design, EMR vendors should focus on activity- or process-centered design. I love what he says about one of the potential problems with human-centered UIs:

Optimization around a user, or user screen, risks the ultimate systems engineering sin: suboptimization. Individual EHR user screens are routinely optimized at the expense of total EHR system workflow usability…I’ve seen EHR screens, which, considered individually, are jewel-like in appearance and cognitive science-savvy in design philosophy, but which do not work together well.

It’s better, he suggests, to have EMRs model “interleaved and interacting sequences of task accomplishment” first and foremost. For example, he writes, key task collections that should be considered as a whole include workflow management systems, business process management, case management and process-aware information systems.
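
To picture what that might mean in practice, here’s a minimal sketch of a process-centered view: the unit of design is the sequence of tasks handed from role to role, not any single screen. The task list and roles below are entirely hypothetical.

```python
# A toy process-aware workflow: what gets modeled and optimized is the
# task sequence across roles, not any individual screen. Names are made up.
VISIT_WORKFLOW = [
    ("rooming", "nurse"),
    ("history_and_exam", "physician"),
    ("orders", "physician"),
    ("patient_instructions", "nurse"),
    ("billing_codes", "coder"),
]

def next_task(completed_tasks):
    """Return the next pending (task, role) pair, or None when the visit is done."""
    for task, role in VISIT_WORKFLOW:
        if task not in completed_tasks:
            return task, role
    return None

print(next_task({"rooming"}))  # ('history_and_exam', 'physician')
```

In a design like this, screens are generated around the handoffs, so what gets optimized is the whole sequence rather than any single view.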

While there’s much more to say here, of course, I’ll close with Dr. Webster’s own words, which make his point with wonderful clarity:

User-centered EHR design does help get to good EHRs. Good isn’t good enough. If EHRs and HIT are going to help transform healthcare they need to be better than world-class (compared to what?). They need to be stellar. Traditional user-centered design isn’t going to get us there.

The question I’m left with, readers, is whether you can have your cake and eat it too. Does one side of UI/UX design literally have to be jettisoned to support the other?