
Possible Future EHR UIs at CES

Posted on January 16, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

If you haven’t been following all of my CES Digital Health coverage, you might want to check out some of the following articles:
Initial CES 2015 Observations
Wearables Explosion at CES 2015
A Video Look at the Digital Health, Fitness and Wellness Section of CES 2015
A Look at Digital Health at CES 2015

While I was mostly focused on the Digital Health section of CES, I also took note of a number of new user interface approaches that various companies were demoing at CES. Since it’s CES, some of these are still conceptual, but they got my EHR UI thoughts going.

Finger Mouse
The Motix Touch Mouse was one of the most intriguing new user interfaces I’ve ever seen in the 10 years I’ve been attending CES. Your hands basically stay on the keyboard while a motion capture device follows your finger, which then acts like a mouse on screen. It was a really interesting evolution of the mouse. Unfortunately, they didn’t have a form-heavy demo that would replicate the EHR world in which I live, so I’m not sure how well this finger mouse would work for filling out the long forms found in many EHRs. Still, the concept was really intriguing to consider.

Here’s a video demo of the Motix Touch Mouse:

3D Rudder
The 3D Rudder really blew my mind when I tried it out. I’m not exactly sure of its application in the EHR and healthcare IT world, but the experience of controlling your computer with your feet was really amazing. Plus, the foot control works in three dimensions, which makes it really unique. It took me a second to learn, but I loved this new way of looking at how an input control could work.

You can see the 3D Rudder’s Indiegogo campaign, and here’s a video demo of the 3D Rudder:

While the mouse and keyboard have been tremendously powerful input devices for computers, I’m fascinated to consider how the evolution of computer input will go. We’ve seen the amazing growth of voice and touch over the past couple of years. However, I think and hope we’re just getting started with how simple it will be to control the computers of the future. I believe small innovations like the two mentioned above are part of the process of improving computer UIs as we know them.

Amazing Time Lapse EHR UX Design Video

Posted on May 16, 2014 | Written By

John Lynn

A big thanks to L-J Cunningham (@UXforHealth) for tweeting out this really cool time lapse video that shows SoftServe‘s work doing the UX design for the mEMR application. While the process they use is really cool to watch, it’s also interesting to see what a mobile EHR UI could look like.

Getting Your EMR’s UI/UX Right

Posted on June 4, 2013 | Written By

Anne Zieger is a veteran healthcare consultant and analyst with 20 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. Contact her at @ziegerhealth on Twitter or visit her site at Zieger Healthcare.

A couple of weeks ago, someone posted an interesting question on the buzzing question and answer site Quora.com: Is there room for any more new EMRs in the insanely crowded marketplace we have today? According to one very sharp medical student who’s keeping an eye on the field, the best response isn’t “yes,” or “no,” but “you’ve got the wrong question.”

His answer, which I’d like to share with you, argues that there’s no point whatsoever in trying to introduce a new EMR with a shiny new feature set when none of the existing players in the field have decent UI/UX right now. Jae Won Joh then lays out the steps he believes vendors should take if they want to get the basic UI/UX right (steps excerpted for brevity):

Step 0: Architect the patient data structure carefully
I mention this because you’re going to need to be able to pass this patient data around for clinical use, billing, research, auditing, etc, so design for flexibility and expandability from the get-go. Too many EMRs make it painfully obvious that things were thrown in as afterthoughts.
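
To make that concrete, here is a minimal sketch of my own (in Python; none of the class or field names come from Joh’s answer or from any particular EMR) of a patient record designed so that new kinds of data can be added without restructuring everything, and so that each fact carries the provenance you need for billing, research, and auditing:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Any

@dataclass
class Observation:
    """One clinical fact (lab result, vital sign, note) with provenance."""
    code: str                # e.g. a LOINC or local code
    value: Any
    recorded_at: datetime
    recorded_by: str         # user id, kept for auditing
    source: str = "manual"   # manual entry, lab interface, device, etc.

@dataclass
class Patient:
    """Core record: demographics stay small; everything else hangs off it."""
    patient_id: str
    name: str
    birth_date: datetime
    observations: list[Observation] = field(default_factory=list)
    # Room to grow without schema surgery when something new comes along.
    extensions: dict[str, Any] = field(default_factory=dict)

    def observations_for(self, code: str) -> list[Observation]:
        """Pull one kind of data back out for clinical use, billing, or research."""
        return [o for o in self.observations if o.code == code]
```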

Step 1: Decide on your market…
…because you need to do everything possible to totally kill it. It’s the only way to go. If you’re going to take on group practices, great, take on group practices. If you’re going to work the hospital scene, fine, work the hospital scene. Stop trying to make something that does everything everywhere. This is not a feature, it’s a horrible bug.

Step 2: Analyze what your market does
If it’s a hospital, you need multiple classes of user, ranging all the way from student to nurse to physician to administrator. You’ll also want a competent notification system, because inpatient things tend to be more urgent and if the ICU patient’s potassium is critically high, you probably want to warn the physician immediately instead of waiting for the physician to check on it manually, because gee, the patient might code and die before that happens….The concerns are different for an outpatient scenario: you don’t need a lot of the stuff that hospitals require in an office. Less orders, more scripts, greater throughput in terms of number of patients, scheduling functionality, etc.
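
The notification point in particular lends itself to a tiny example. Here is a sketch of mine (the potassium threshold is purely illustrative; real critical-value limits come from the lab and clinical policy) of a rule that pushes a critically high result to the physician immediately instead of waiting for someone to open the chart:

```python
from dataclasses import dataclass

@dataclass
class LabResult:
    patient_id: str
    test: str
    value: float

# Illustrative threshold only (mmol/L); not a clinical reference value.
CRITICAL_HIGH = {"potassium": 6.0}

def route_result(result: LabResult, notify_physician, file_to_chart) -> None:
    """Alert on critically abnormal results right away; file everything to the chart."""
    limit = CRITICAL_HIGH.get(result.test)
    if limit is not None and result.value >= limit:
        notify_physician(result)  # page/alert now rather than waiting for a manual check
    file_to_chart(result)

# Example wiring with stand-in callables:
route_result(LabResult("123", "potassium", 6.8),
             notify_physician=lambda r: print("PAGE: critical", r.test, r.value),
             file_to_chart=lambda r: print("filed", r.test, r.value))
```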

Step 3a: Abstract workflows to a very high level first
In other words, they are as follows: 
1) read data
2) interpret data
3) input data

There’s really not much else to it. Every workflow is a permutation of those three. For example: a physician orders a lab, and it’s performed. The result is read by the tech who provides the input to the system, where it is then read and interpreted by the physician so they can go from there. Figure out how each workflow revolves around these three abstractions.
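
One way to picture that abstraction is to write the lab-order example as an explicit sequence of read/interpret/input steps and then ask, at design time, which user classes a given workflow actually touches. This is a toy sketch of my own, not something from Joh’s answer:

```python
from enum import Enum, auto

class Action(Enum):
    READ = auto()       # someone looks at data
    INTERPRET = auto()  # someone decides what it means
    INPUT = auto()      # someone puts data into the system

# The lab-order example above as (role, action, subject) steps.
lab_workflow = [
    ("physician", Action.INPUT, "lab order"),
    ("tech", Action.READ, "lab order"),
    ("tech", Action.INPUT, "result"),
    ("physician", Action.READ, "result"),
    ("physician", Action.INTERPRET, "result"),
]

def roles_involved(workflow):
    """Design-time check: which classes of user does this workflow touch?"""
    return {role for role, _, _ in workflow}

print(roles_involved(lab_workflow))  # {'physician', 'tech'}
```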

<excerpted>

Step 3c: Design for a 5-year-old
If a five-year old couldn’t use your UI, you screwed up. Period.

There’s a lot more to Joh’s answer, and I suggest you hit Quora yourself and read his entire piece. When it comes to usability, most EMRs have barely scratched the surface, and talking about these issues more is always a Good Thing.

EMR UX, Flagler Hospital EMR Video, and Antiquated EMR Screenshots

Posted on May 5, 2013 | Written By

John Lynn


I don’t think Nick was being a cynic with this comment. He’s right. Nothing has pushed EHR vendors to do a good UX and I don’t see that changing any time soon. The #EHRBacklash hasn’t reached a critical mass (yet?). As a side note, I recorded a video chat with Nick Dawson this week as well. He’s a smart man.


Kudos to Flagler Hospital for celebrating their EHR in a fun way for their staff. It’s interesting to see their list of EHR benefits in the video as well, although the dancing is absolutely the best part. Nice work, Flagler, and a great follow-up to your EHR launch video.


I’ll admit that this tweet is quite nostalgic for me. The Medical Quack is one of the first people I met in my online EMR world. I still remember an early Skype video chat we did just after Skype video came out. We were both surprised that it actually worked. It was almost midnight and for some reason we were both up doing EMR-related stuff. It was amazing for me to consider one lady on her own trying to develop an EMR. My, how far we’ve both come since then. It’s great to see the screenshots from her original EMR. Too bad I didn’t have my EHR screenshots website back then.

Are “User”- And “Process”-Centered EMR Design On A Collision Course?

Posted on April 3, 2012 | Written By

Anne Zieger

Most of the critiques I read of EMR design ding the EMR for being difficult to use or for failing to accommodate the workflow of the institution that bought it — and of course, sometimes both. What I’ve never heard suggested, however, is the following idea proposed by Chuck Webster, a guy who clearly doesn’t stop short when he decides to study something. (He holds an MD, an MSIE, and an MSIS in intelligent systems design, which is only one of the reasons I think he’s onto something here.)

In a thoughtful and nuanced blog entry, Dr. Webster outlines the work of a pioneer in usability design, Donald Norman, and comes away with the conclusion that the current trend toward “human-centered design” might actually be a mistake. What a pain — health IT limps along catching up with a trend from the 1980s, and now may be too late to catch the bus.

In any event, Dr. Webster argues that instead of focusing on human/user-centered design, EMR vendors should focus on activity- or process-centered design. I love what he says about one of the potential problems with human-centered UIs:

Optimization around a user, or user screen, risks the ultimate systems engineering sin: suboptimization. Individual EHR user screens are routinely optimized at the expense of total EHR system workflow usability…I’ve seen EHR screens, which, considered individually, are jewel-like in appearance and cognitive science-savvy in design philosophy, but which do not work together well.

It’s better, he suggests, to have EMRs model “interleaved and interacting sequences of task accomplishment” first and foremost. For example, he writes, key task collections that should be considered as a whole include workflow management systems, business process management, case management and process-aware information systems.
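
To make the contrast with screen-by-screen optimization a bit more concrete, here is a toy sketch of my own (not Dr. Webster’s model) in which the unit of design is the whole chain of tasks, and the thing you examine first is the set of handoffs between roles, exactly the seams that per-screen polish tends to miss:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    role: str                                            # who performs it
    next_tasks: list[str] = field(default_factory=list)  # handoffs to later steps

@dataclass
class Process:
    """Process-centered view: the design unit is the chain of tasks, not one screen."""
    tasks: dict[str, Task] = field(default_factory=dict)

    def add(self, task: Task) -> None:
        self.tasks[task.name] = task

    def handoffs(self):
        """Every point where work crosses from one role to another."""
        for task in self.tasks.values():
            for nxt in task.next_tasks:
                target = self.tasks[nxt]
                if target.role != task.role:
                    yield (task.name, task.role, target.name, target.role)

p = Process()
p.add(Task("order lab", "physician", ["perform lab"]))
p.add(Task("perform lab", "tech", ["review result"]))
p.add(Task("review result", "physician"))
print(list(p.handoffs()))
# [('order lab', 'physician', 'perform lab', 'tech'),
#  ('perform lab', 'tech', 'review result', 'physician')]
```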

While there’s much more to say here, of course, I’ll close with Dr. Webster’s own words, which make his point with wonderful clarity:

User-centered EHR design does help get to good EHRs. Good isn’t good enough. If EHRs and HIT are going to help transform healthcare they need to be better than world-class (compared to what?). They need to be stellar. Traditional user-centered design isn’t going to get us there.

The question I’m left with, readers, is whether you can have your cake and eat it too. Does one side of UI/UX design literally have to be jettisoned to support the other?