
Practice’s EMR Implementation Drove Up Costs For Six Months

Posted on September 28, 2018 | Written By

Anne Zieger is a veteran healthcare consultant and analyst with 20 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. Contact her at @ziegerhealth on Twitter or visit her site at Zieger Healthcare.

Everyone knows that providers continue to incur EMR-related costs well after the system is implemented. According to a new study, in fact, one medical practice incurred higher costs for six months after its implementation.

The study, which appeared recently in The Journal of Bone & Joint Surgery, calculated the impact of an EMR implementation on labor costs and productivity at an outpatient orthopedic clinic. The researchers conducting the study used time-driven activity-based costing to estimate EMR-related expenses.

To conduct the study, the research team timed 143 patients prospectively throughout their clinic visit, both before implementation of the hospital system-wide EMR and then again at two months, six months and two years after the implementation.

The researchers found that after the first two months, total labor costs per patient had shot up from $36.88 to $46.04.

One reason for the higher costs was a growth in the amount of time attending surgeons spent per patient, which went up from 9.38 to 10.97 minutes, increasing surgeon cost from $21 to $27.01. In addition, certified medical assistants were spending far more time assessing patients, with the time spent almost tripling from 3.42 to 9.1 minutes.
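For readers unfamiliar with the costing method, the arithmetic behind figures like these is straightforward: time-driven activity-based costing prices each activity as a per-minute capacity cost rate multiplied by the minutes consumed. The sketch below uses the study's reported timings, but the per-minute rates are hypothetical stand-ins of my own, not figures from the paper.

```python
# A minimal sketch of time-driven activity-based costing (TDABC),
# the method the researchers used. The per-minute RATES below are
# hypothetical illustrations, not numbers from the study.

def activity_cost(rate_per_minute, minutes):
    """TDABC: cost of an activity = capacity cost rate x time consumed."""
    return rate_per_minute * minutes

# Hypothetical per-minute capacity cost rates for clinic staff.
RATES = {"surgeon": 2.40, "medical_assistant": 0.55}

def visit_labor_cost(times_by_role):
    """Sum the cost of each role's time for one patient visit."""
    return sum(activity_cost(RATES[role], minutes)
               for role, minutes in times_by_role.items())

# Before vs. after the EMR rollout, using the study's timings.
before = visit_labor_cost({"surgeon": 9.38, "medical_assistant": 3.42})
after = visit_labor_cost({"surgeon": 10.97, "medical_assistant": 9.1})
print(f"before: ${before:.2f}, after: ${after:.2f}")
```

With any plausible rates, the longer surgeon and assistant times push the per-visit labor cost up, which is exactly the pattern the study measured.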

On top of all of this, providers were spending more than twice as much time documenting patient encounters as they had before, up to 7.6 minutes from 3.3 minutes prior to the implementation.

By the six-month mark, however, labor costs per patient had largely returned to their previous levels, settling at $38.75 compared with $36.88 prior to the installation, an expense which remained at the same level when calculated two years after the EMR implementation.

However, providers were spending even more time documenting encounters than they had before the rollout, with time climbing to 8.43 minutes, or roughly 5 minutes more than prior to the introduction of the EMR. Not only that, providers were spending less time interacting with patients, falling to 10.03 minutes as compared with 14.65 minutes in the past.

Sadly, we might have been able to predict this outcome. Clearly, the clinic’s EMR implementation burdened its providers and further cut into the time they spend with their patients. This, unfortunately, is more the rule than the exception.

So why did the ortho practice even bother? It’s hard to say. The study doesn’t say what the practice hoped to accomplish by putting the EMR in place, or whether it met those goals. Given that the system was still in place after two years, one would hope that it was providing some form of value.

Truthfully, I’d much rather have learned about what the clinic actually got for its investment than how long it took to get everyone trained up and using it. To be fair, though, this data might have some relevance to the hospital systems that manage a broad spectrum of medical practices, and that’s worth something.

AI-Based Tech Could Speed Patient Documentation Process

Posted on August 27, 2018 | Written By Anne Zieger

A researcher with a Google AI team, Google Brain, has published a paper describing how AI could help physicians complete patient documentation more quickly. The author, software engineer Peter Liu, contends that AI technology can speed up patient documentation considerably by predicting its content.

On my initial reading of the paper, it wasn’t clear to me what advantage this has over pre-filling templates or even allowing physicians to cut-and-paste text from previous patient encounters. Still, judge for yourself as I outline what author Liu has to say, and by all means, check out the write-up.

In its introduction, the paper notes that physicians spend a great deal of time and energy entering patient notes into EHRs, a process which is not only taxing but also demoralizing for many physicians. Choosing just one of countless data points underscoring this conclusion, Liu cites a 2016 study noting that physicians spend almost two hours on administrative work for every hour of patient contact.

However, it might be possible to reduce the number of hours doctors spend on this dreary task. Google Brain has been working on technologies which can speed up the process of documentation, including a new medical language modeling approach. Liu and his colleagues are also looking at how to represent an EHR’s mix of structured and unstructured text data.

The net of all of this? Google Brain has been able to create a set of systems which, by drawing on previous patient records, can predict most of the content a physician will use the next time they see that patient.

The heart of this effort is the MIMIC-III dataset, which contains the de-identified electronic health records of 39,597 patients from the ICU of a large tertiary care hospital. The dataset includes patient demographic data, medications, lab results, and notes written by providers. The system includes AI capabilities which are “trained” to predict the text physicians will use in their latest patient note.
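To make the "trained to predict" idea concrete, here is a deliberately tiny illustration: build a statistical model from a patient's prior notes, then suggest likely next words when drafting a new note. Google Brain's actual system uses large neural language models trained over the MIMIC-III records; this hand-rolled bigram counter, with made-up sample notes, is only a stand-in for the concept.

```python
# Toy sketch of next-word prediction from prior notes. This is NOT
# the paper's model (which is a neural language model); it's a simple
# bigram counter illustrating the prediction idea.
from collections import Counter, defaultdict

def train_bigrams(notes):
    """Count, for each word, which words tend to follow it."""
    model = defaultdict(Counter)
    for note in notes:
        words = note.lower().split()
        for prev, nxt in zip(words, words[1:]):
            model[prev][nxt] += 1
    return model

def predict_next(model, word):
    """Suggest the most likely next word, or None if the word is unseen."""
    followers = model.get(word.lower())
    return followers.most_common(1)[0][0] if followers else None

# Hypothetical prior notes for one patient.
prior_notes = [
    "patient reports chest pain radiating to left arm",
    "chest pain improved after nitroglycerin",
]
model = train_bigrams(prior_notes)
print(predict_next(model, "chest"))  # suggests "pain"
```

The real system works at a far larger scale and over structured data as well as free text, but the core bet is the same: a patient's history constrains what the next note is likely to say.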

In addition to making predictions, the Google Brain AI seems to have been able to pick out some forms of errors in existing notes, including patient ages and drug names, as well as providing autocorrect options for corrupted words.
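The autocorrect capability is easier to picture with a sketch: match a corrupted token against a vocabulary of known terms and substitute the closest one. The drug list below is a hypothetical stand-in, and `difflib` fuzzy matching is my illustration of the general idea, not the method the paper uses.

```python
# Sketch of autocorrecting a corrupted word (e.g., a drug name) by
# fuzzy-matching it against a known vocabulary. The vocabulary and
# the difflib approach are illustrative assumptions, not the paper's.
import difflib

DRUG_VOCAB = ["metformin", "lisinopril", "atorvastatin", "warfarin"]

def autocorrect(token, vocab=DRUG_VOCAB):
    """Return the closest known word, or the token unchanged if no match."""
    matches = difflib.get_close_matches(token.lower(), vocab, n=1, cutoff=0.7)
    return matches[0] if matches else token

print(autocorrect("metfromin"))  # -> "metformin"
```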

By way of caveats, the paper warns that the research used only data generated within 24 hours of the current note content. Liu points out that while this may be a wide enough range of information for ICU notes, as things happen fast there, it would be better to draw on data representing larger windows of time for non-ICU patients. In addition, Liu concedes that it won’t always be possible to predict the content of notes even if the system has absorbed all existing documentation.

However, none of these problems are insurmountable, and Liu understandably describes these results as “encouraging,” but that’s also a way of conceding that this is only an experimental conclusion. In other words, these predictive capabilities are not a done deal by any means. That being said, it seems likely that his approach could be valuable.

I am left with at least one question, though. If the Google Brain technology can predict physician notes with great fidelity, how does that differ from having the physician cut-and-paste previous notes on their own? I may be missing something here, because I’m not a software engineer, but I’d still like to know how these predictions improve on existing workarounds.