
Can AI Inspire Medical Creativity?

Posted on November 16, 2018 | Written By

Anne Zieger is a veteran healthcare consultant and analyst with 20 years of industry experience. Zieger formerly served as editor-in-chief of and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. Contact her at @ziegerhealth on Twitter or visit her site at Zieger Healthcare.

As the capabilities of healthcare AI tools grow, vendors continue to insist that as with previous generations of technology, AI will simply do the grunt work and free doctors’ minds up for higher uses.

For what it’s worth, the research strongly suggests that this is true.  By all accounts, we are incalculably far from creating technology that can think like a trained human or apply empathy and insight to complex problems.

In the meantime, we may see some unexpected benefits from watching AI tackle healthcare problems. According to Nick Peters, a professor of cardiology and head of cardiac electrophysiology at Imperial College London, observing AI at work may jog physicians’ creativity and push them in directions they never would’ve gone otherwise.

Peters, whose article appears on the World Economic Forum website, believes that because AIs, well, think different(ly), they can sometimes inspire their human partners to try new things. “Machines are beginning to challenge human imagination in a way that may not have been anticipated, and which could…unleash a revolution in creativity,” he asserts.

Among the first changes this revolution may bring is a shift in how we track health. Peters argues that while we currently assess a patient’s status by measuring phenomena like blood pressure, respiration, and pulse, AI will replace these measures with subtler approaches.

Over time, we will use machine learning to identify other signals derived from the use of consumer devices which serve the care process better, Peters argues. “It will enable entirely new fields of cheaper, better and more cost-effective clinical science to emerge that may supersede blunt measurements such as the likes of blood pressure,” he writes.

He predicts that the data which will identify these pathways will spring in part from devices like the Apple Watch 4, which incorporates an ECG. These smart consumer devices, in turn, will eventually be able to alert and recruit a nearby citizen who has registered their competence to deliver CPR, he notes. This could have a major impact on survival rates for time-sensitive problems like cardiac arrest, Peters writes.

As interesting as his observations are, the article is too short. I do wish Peters had extended his argument further and attempted to answer more questions about the impact of AI and analytics on medical practice.

For example, if we are poised to discover health measures which take the place of basic metrics like blood pressure checks, how will we determine whether these new measures deliver the kind of results the old-fashioned ones do? What other medical processes will be transformed, and how?  Also, should we focus AI development on finding alternative approaches to traditional care processes or are they just side benefits that might evolve out of other useful analysis?

Still, merely by envisioning AI as a spur to healthcare creativity, Peters has done us a service. Perhaps physicians will benefit from inevitable differences in how humans and AI software process information rather than working at cross-purposes.

Competition Heating Up For AI-Based Disease Management Players

Posted on May 21, 2018 | Written By Anne Zieger

Working in collaboration with a company offering personal electrocardiograms to consumers, researchers with the Mayo Clinic have developed a technology that detects a dangerous heart arrhythmia. In so doing, the two are joining the race to improve disease management using AI technology, a contest which should pay the winner off handsomely.

At the recent Heart Rhythm Scientific Sessions conference, Mayo and vendor AliveCor shared research showing that by augmenting AI with deep neural networks, they can successfully identify patients with congenital Long QT Syndrome even if their ECG is normal. The results were accomplished by applying AI to lead I of a 12-lead ECG.

While Mayo needs no introduction, AliveCor might. While it started out selling a heart rhythm product available to consumers, AliveCor describes itself as an AI company. Its products include KardiaMobile and KardiaBand, which are designed to detect atrial fibrillation and normal sinus rhythms on the spot.

In their statement, the partners noted that as many as 50% of patients with genetically confirmed LQTS have a normal QT interval on standard ECG. It’s important to recognize underlying LQTS, as such patients are at increased risk of arrhythmias and sudden cardiac death. They also note that the inherited form affects 160,000 people in the US and causes 3,000 to 4,000 sudden deaths in children and young adults every year. So obviously, if this technology works as promised, it could be a big deal.

Aside from its medical value, what’s interesting about this announcement is that Mayo and AliveCor’s efforts seem to be part of a growing trend. For example, the FDA recently approved a product known as IDx-DR, the first AI technology capable of independently detecting diabetic retinopathy. The software can make basic recommendations without any physician involvement, which sounds pretty neat.

Before approving the software, the FDA reviewed data from parent company IDx, which performed a clinical study of 900 patients with diabetes across 10 primary care sites. The software accurately identified the presence of diabetic retinopathy 87.4% of the time and correctly identified those without the disease 89.5% of the time. I imagine an experienced ophthalmologist could beat that performance, but even virtuosos can’t get much higher than 90%.
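To make those two accuracy figures concrete, here is a minimal sketch of how sensitivity and specificity are computed from confusion-matrix counts. The counts below are hypothetical, chosen only so the arithmetic reproduces the reported rates; they are not IDx’s actual study data.

```python
# Illustrative sensitivity/specificity computation for a screening tool.
# The patient counts are hypothetical, not IDx's real study numbers.

def sensitivity(tp, fn):
    """Fraction of diseased patients correctly flagged (true positive rate)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of healthy patients correctly cleared (true negative rate)."""
    return tn / (tn + fp)

# Hypothetical confusion-matrix counts:
tp, fn = 174, 25   # patients with retinopathy: flagged vs. missed
tn, fp = 179, 21   # patients without it: cleared vs. false alarms

print(f"sensitivity: {sensitivity(tp, fn):.1%}")  # 87.4%
print(f"specificity: {specificity(tn, fp):.1%}")  # 89.5%
```

Sensitivity and specificity trade off against each other, which is why a screening tool that runs without physician involvement reports both numbers rather than a single “accuracy.”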

And I shouldn’t forget the 1,000-ton presence of Google, which according to analyst firm CBInsights is making big bets that the future of healthcare will be structured data and AI. Among other things, Google is focusing on disease detection, including projects targeting diabetes, Parkinson’s disease and heart disease, among other conditions. (The research firm notes that Google has actually started a limited commercial rollout of its diabetes management program.)

I don’t know about you, but I find this stuff fascinating. That said, the AI future remains fuzzy. Clearly, it may do some great things for healthcare, but even Google is still in the experimental stage. Don’t worry, though. If you’re following AI developments in healthcare you’ll have something new to read every day.

Improving the EHR Interface and Topol Saves Patient’s Life on Flight Home

Posted on March 5, 2013 | Written By

John Lynn is the Founder of the blog network which currently consists of 10 blogs containing over 8,000 articles, with John having written over 4,000 of the articles himself. These EMR and healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of and is highly involved in social media; in addition to his blogs, he can also be found on Twitter: @techguy and @ehrandhit, and on LinkedIn.

As I thought through my day at HIMSS, a theme started to emerge from the dozens of meetings I’ve already had at the show (with many more still to come). The theme I saw coming out was ways to improve the EHR interface. This is a much-needed change in EHRs, so it was interesting to see a whole series of companies working on ways to make the EHR interface better. Here are some of the highlights from companies I talked to at HIMSS.

SwiftKey – While the SwiftKey product can be used in the consumer space as well, it was interesting to see the technology applied to healthcare. SwiftKey is basically a replacement for your mobile device keyboard. In fact, I’d call SwiftKey a smart keyboard for your mobile device. What does it do to make your mobile device keyboard smart?

First, it offers word suggestions you can easily choose as you start to type. Most people are familiar with this base functionality because it exists in some form in most mobile keyboards (or at least it does on my Android). However, they’ve taken it a couple steps further. They actually use the context of what you’ve typed to predict what word you may want to type next. For example, if you type, “nausea and” then it predicts that you’ll want to type vomiting. If you type “urinary” then it will predict tract and then infection. Plus, they told me their algorithm will also learn your own colloquial habits. Kind of reminds me of Dragon voice recognition that learns your voice over time. SwiftKey learns your language habits over time.
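The context-based prediction described above can be sketched with a toy model. To be clear, this is not SwiftKey’s actual algorithm (which is far more sophisticated and learns per-user habits); it’s just a minimal bigram model that counts which word tends to follow which, which is enough to show how “urinary” can suggest “tract.”

```python
# A toy sketch of context-aware next-word prediction, in the spirit of
# what the article describes. NOT SwiftKey's real algorithm -- just a
# bigram model: count which word follows which, suggest the most common.

from collections import Counter, defaultdict

class BigramPredictor:
    def __init__(self):
        # word -> counts of the words observed immediately after it
        self.following = defaultdict(Counter)

    def learn(self, text):
        words = text.lower().split()
        for prev, nxt in zip(words, words[1:]):
            self.following[prev][nxt] += 1

    def predict(self, word):
        counts = self.following[word.lower()]
        return counts.most_common(1)[0][0] if counts else None

# Train on a few clinical phrases; the model picks up the habits it sees.
p = BigramPredictor()
p.learn("nausea and vomiting")
p.learn("urinary tract infection")
p.learn("urinary tract infection resolved")

print(p.predict("and"))      # -> vomiting
print(p.predict("urinary"))  # -> tract
print(p.predict("tract"))    # -> infection
```

A real mobile keyboard would blend a much larger language model with the user’s own typing history, but the core idea, conditioning the suggestion on what came before, is the same.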

I’m sure some of these predictive suggestions could lead to some hilarious results, but it’s an interesting next step in the virtual keyboards we have on mobile devices. I’ll be interested to hear from doctors about what they think of the SwiftKey keyboard when it’s integrated with the various EHR iPad apps.

M*Modal and Intermountain – Thinking back on the demos and products I’ve seen at HIMSS 2013, I think that the app M*Modal has created for Intermountain might be the coolest I’ve seen so far. In this app, a doctor would say an order for a prescription, and the M*Modal technology would apply voice recognition and then parse the words into the appropriate CPOE order fields. It was pretty impressive to see it in action. Plus, the time difference between speaking the order and trying to manually select the various order fields on the mobile device was incredible.

I was a little disappointed it was only a demo system, but it sounds like Intermountain is still doing some work on their end to make the CPOE happen. I’m also quite interested to see if a simple mobile app like this will see broad adoption or if more features will need to be added to drive wide adoption. However, it was almost like magic to see it take a recorded voice and convert it into 5-7 fields on the screen. I’d be interested to see the accuracy of the implementation across a large set of doctors, but the possibilities are quite interesting for transforming the CPOE interface.
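The “parse the words into order fields” step is worth a rough sketch. M*Modal’s actual pipeline (speech recognition plus clinical NLP) is far more sophisticated; the regex below only illustrates the free-text-to-structured-fields idea on one hypothetical phrase shape, using field names I made up for the example.

```python
# A rough sketch of turning a dictated medication order into structured
# CPOE-style fields. This is NOT M*Modal's technology -- just a single
# illustrative regex over one common phrase pattern.

import re

ORDER_PATTERN = re.compile(
    r"(?P<drug>[a-z]+)\s+"
    r"(?P<dose>\d+)\s*(?P<unit>mg|mcg|g)\s+"
    r"(?P<frequency>once|twice|three times|four times)\s+daily"
    r"(?:\s+for\s+(?P<duration>\d+\s+days?))?",
    re.IGNORECASE,
)

def parse_order(transcript):
    """Return a dict of order fields, or None if no order is recognized."""
    m = ORDER_PATTERN.search(transcript)
    return m.groupdict() if m else None

fields = parse_order("amoxicillin 500 mg three times daily for 10 days")
print(fields)
# {'drug': 'amoxicillin', 'dose': '500', 'unit': 'mg',
#  'frequency': 'three times', 'duration': '10 days'}
```

Even this toy version hints at why the demo felt like magic: filling five to seven structured fields by tapping through dropdowns takes far longer than saying one sentence, as long as the parser gets the fields right.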

Cerner Mobile – One of the new Cerner ambulatory EHR features is an iPad interface for the doctor. I’m sure that many will think this is old news since so many other iPad EHR interfaces are out there. In some ways it is, but there was a slickness to their app that I hadn’t seen in a lot of places. In fact, the demo of their ambulatory EHR iPad app reminded me a lot of the features that I saw in this video Jonathan Dreyer from Nuance created (bottom video) that demonstrated some of the mobile voice capabilities. Plus, the app had a nice workflow and some crazy simple features like doing a refill. One swipe and the med was refilled. Almost makes it too easy.

Canon – This is a little different from some of the other EHR interface things I talk about above. In the case of Canon it was interesting to see the tight integration that’s possible between the Canon scanners and EHR software. Instead of the often laborious process of scanning to your EHR and assigning it to a patient, Canon has a scan-direct-to-EMR option, including analyzing the cover sheet to have the scanned document attached to the right patient and EHR chart location. While we’d all love to have paper gone, it will be a part of healthcare for the foreseeable future. The scan direct to EMR is a pretty awesome feature.

Those are a number of the EHR interface things that I’ve seen so far at HIMSS. I’m sure there are dozens of others out there as well. I think this is a great trend. Sure, each of these things is only a small incremental change, but with hundreds of EHR vendors all doing small incremental changes we’re going to see great things. That’s good, because many of the current EHR interfaces are terribly unusable.

On a related topic, Eric Topol gave a keynote address at HIMSS today. It drew glowing reviews from what I could tell. What’s an even more powerful story, though, is seeing the message he shared at HIMSS in action. On Topol’s flight home to San Diego, a passenger was having a medical issue. Topol performed an ECG right on the plane using his smartphone, and the passenger was able to make it safely to the destination. You can read the full story here. What’s even more amazing is that this is the second time something like this has happened to Topol. This probably means he flies too much, but it’s also an incredible illustration of mHealth technology at work. Truly amazing!

Full Disclosure: Cerner and Canon are advertisers on this site.