
Is EHR Use Causing Physician Burnout?

Posted on November 12, 2018 | Written By

The following is a guest blog post by Wayne Crandall, President & CEO of NoteSwift.

Over the past decade, numerous studies have been published with the same concerning conclusion – physicians are increasingly expressing feelings of burnout, frustration, and a lack of support from their employers and technology solutions. There is no single cause for this burnout, but there are plenty of signals pointing to a primary cause:

EHR use, requirements, and regulations are leading to incredibly high levels of physician burnout.

The data is increasingly clear on this issue. Consider this statistic: according to a 2015 survey, almost 90% of doctors feel moderately to severely stressed and burned out on an average workday.

And this one: A new study by the University of Wisconsin and the American Medical Association (AMA) found primary care physicians spend almost six hours (5.9) on EHR data entry during a typical 11.4-hour workday.

Because of this rapid rise in physician burnout and clear connection to EHR use and management, we decided to look more deeply into the causes, symptoms, and possible solutions to the physician burnout crisis. The result of this research is a newly published white paper we’ve created in partnership with Dr. Robert Van Demark, Jr., a leading voice on the issue of physician burnout.

In this paper, you’ll find the following:

  • A compilation of recent data and studies on the symptoms and causes of physician burnout
  • Research connecting physician burnout to employee retention
  • Examination of how EHR use contributes to the burnout crisis
  • A look ahead to emerging solutions to this crisis

There are many compelling reasons why this research is more timely and important than ever. At a time when many physicians are questioning whether the burnout, stress, and anxiety are worth it, health care systems are reporting massive costs for recruiting and replacing doctors who leave due to burnout and overwork. The stakes could not be higher for health systems, doctors, and patients who need access to expert care.

The paper also takes a closer look at the innovative world of artificial intelligence and how it holds much promise for improving health care and EHR entry through automation and understanding. At a time when physicians are looking for more ways to control their workflow and create better, more efficient care for patients, the world of artificial intelligence is leading the way toward better solutions and better care.

I was recently reading a helpful LinkedIn article on the topic of physician burnout, and the author noted how many practices and health care systems focus on treating the symptoms of physician burnout instead of treating the actual cause of this burnout. More meetings, more committees, more work for doctors, while the underlying causes go untreated. EHRs are a primary cause of this burnout, and we believe that finding a better way to handle our EHR work is a major way we can improve workflows and reduce physician burnout. Hopefully this white paper can lead the conversation in that direction.

To receive your complimentary copy of this white paper, “Physician Burnout By The Numbers,” click here. You’ll receive instant access to the paper as a resource for you and your team.

About Wayne Crandall
Wayne Crandall’s career in technology spans sales, marketing, product management, strategic development and operations. Wayne was a co-founder, executive officer, and senior vice president of sales, marketing and business development at Nuance Communications and was responsible for growing the company to over $120M following the acquisition of Dragon and SpeechWorks.

Prior to joining the NoteSwift team, Wayne was President and CEO of CYA Technologies and then took over as President of enChoice, which specialized in ECM systems and services, when they purchased CYA.

Wayne joined NoteSwift, Inc. at its inception, working with founder Dr. Chris Russell to build the team from the ground up. Wayne has continued to guide the company’s growth and evolution, resulting in the development of the industry’s first AI-powered EHR Virtual Assistant, Samantha(TM).

NoteSwift is the leading provider of EHR Virtual Assistants and a proud sponsor of Healthcare Scene.

Stanford Offers 10-Year Vision For EHRs

Posted on October 12, 2018 | Written By

Anne Zieger is a veteran healthcare consultant and analyst with 20 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. Contact her at @ziegerhealth on Twitter or visit her site at Zieger Healthcare.

Despite many efforts to improve EHRs, few physicians see them as adding value to the practice. Sadly, it’s little surprise given that many vendors don’t worry much about what physicians want, focusing instead on selling features to CIOs.

As a result, physicians still don’t like their EHRs much. In fact, a recent survey conducted by Stanford Medicine and the Harris Poll found that 44% of physicians said the top value of the EHR was to serve as digital storage, which isn’t a ringing endorsement. Just 8% saw the EHR as having clinical value, with 3% citing disease prevention, 2% clinical decision support and 3% patient engagement as top benefits.

Is it possible to create a new EHR model that physicians love? According to Stanford, we could build out an ideal EHR by the year 2028.

In Stanford’s vision, clinicians and other healthcare professionals simply take care of the patients without having to think about health records. Once examinations are complete, information would flow seamlessly to all parties involved, including payers, hospitals, physicians and the patient.

Meanwhile, it would be possible to populate the EHR with little or no effort. For example, an automated physician’s assistant would “listen” to interactions between the doctor and the patient and analyze what was said. Based on what is said in the room, along with verbal cues from the clinician, it would record all relevant information from the physical exam.

What’s more, the automated physician’s assistant would have AI capabilities, allowing it to synthesize medical literature, the patient’s history and relevant histories of other patients available in anonymized, aggregated form.

Having reviewed these factors, the system would then populate different possible diagnoses for the clinician to address. The analysis would take patient characteristics into account, including lifestyle, medication history, and genetic makeup.
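To make that concrete, here is a rough sketch in Python of the pipeline Stanford describes: transcribe the visit, extract clinically relevant findings, and rank candidate diagnoses against patient characteristics. Everything here is illustrative; the function names, the keyword matching, and the scoring are simple stand-ins for the speech recognition and AI models the report envisions.

```python
from dataclasses import dataclass, field

@dataclass
class PatientContext:
    medication_history: list = field(default_factory=list)
    lifestyle_flags: list = field(default_factory=list)
    genetic_markers: list = field(default_factory=list)

def extract_findings(transcript: str) -> list:
    """Toy stand-in for the NLP step: pull symptom keywords from the visit transcript."""
    symptom_terms = ["chest pain", "shortness of breath", "fatigue", "fever"]
    return [term for term in symptom_terms if term in transcript.lower()]

def rank_diagnoses(findings: list, context: PatientContext) -> list:
    """Toy stand-in for the AI model: score candidate diagnoses by overlapping findings."""
    knowledge = {
        "acute coronary syndrome": {"chest pain", "shortness of breath"},
        "viral infection": {"fever", "fatigue"},
    }
    scores = []
    for dx, expected in knowledge.items():
        score = len(expected & set(findings))
        if "smoker" in context.lifestyle_flags and dx == "acute coronary syndrome":
            score += 1  # patient characteristics shift the ranking
        scores.append((dx, score))
    return sorted(scores, key=lambda pair: pair[1], reverse=True)

transcript = "Patient reports chest pain and shortness of breath on exertion."
context = PatientContext(lifestyle_flags=["smoker"])
findings = extract_findings(transcript)
print(rank_diagnoses(findings, context))  # clinician reviews; the system never decides alone
```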

In addition to its vision, the survey report offered some short-term recommendations on how medical practices can support physician EHR use. They included:

  • Training physicians well on how to use the EHR when they’re coming on board, as well as when there are incremental changes to the system
  • Involving physicians in the development of clinical workflows that take advantage of EHR capabilities
  • Delivering EHR development projects as quickly as possible once physicians request them
  • Making data analytics abilities available to physicians in a manner that can be used intuitively at the point of care
  • Considering automated solutions to eliminate manual EHR documentation

Technologists, for their part, can also take immediate steps to support physician EHR use, including:

  • Developing systems and product updates in partnership with physicians
  • Limiting the use of manual EHR documentation by using AI, natural language processing and other emerging technologies
  • Using AI to perform several other functions, including synthesizing and summarizing relevant information in the EHR for each patient encounter and offering current and contextualized information to each member of the patient care team

In addition, to boost the value of EHRs over the long-term, 67% of physicians said making interoperability work was important, followed by improving predictive analytics capabilities (43%), and integrating financial information into the EHR to help patients understand care costs (32%).

Real-World Health AI Applications in 2018 and Beyond

Posted on August 29, 2018 | Written By

The following is a guest blog post by Inga Shugalo, Healthcare Industry Analyst at Itransition.

In contrast to legacy systems, which are simply algorithms performing fixed tasks, artificial intelligence can extend the task itself, creating new insights from the information fed to it. Current healthcare AI is powerful enough to take on complex challenges such as automated diagnosis, medical image analysis, virtual patient assistance, and risk analysis, supporting health specialists in making swifter and more informed decisions.

In 2016, Frost & Sullivan predicted that the healthcare AI market would reach $6.6 billion by 2021. Meanwhile, a 2017 Accenture report estimates that AI will save the U.S. healthcare economy $150 billion annually by 2026. “At hyper-speed, AI is re-wiring our modern conception of healthcare delivery,” researchers from Accenture say.

Midway through 2018, the industry is already hinting at its course for further AI expansion. Spoiler alert: as with blockchain, AR, VR, and any other kind of innovative custom medical software, the adoption challenges persist.

Current and prospective AI directions in healthcare

Diagnosis support

One of the most fascinating and valuable directions for AI is its ability to help providers diagnose patients more accurately and more quickly. We are thrilled to see 2018 erupt with healthcare organizations adopting artificial intelligence and creating unprecedented examples of assisted diagnostics.

Geisinger specialists applied AI to analyze CT scans of patients’ heads and detect intracranial hemorrhage early. Intracranial hemorrhage is a life-threatening form of internal bleeding, affecting about 50,000 patients per year, with 47% dying within 30 days.

Geisinger was able to automatically pinpoint and prioritize the cases of intracranial hemorrhage, focusing the attention of radiologists on them and thus allowing for timely interventions. This approach reduced the time to diagnosis by 96%.
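The Geisinger work is described here only at a high level: score each head CT and reorder the radiology worklist so suspected bleeds are read first. The sketch below shows that triage logic with a placeholder model call and made-up probabilities; it is not Geisinger’s actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class HeadCT:
    study_id: str
    arrival_order: int

def hemorrhage_probability(study: HeadCT) -> float:
    """Placeholder for the trained imaging model; returns P(intracranial hemorrhage)."""
    demo_scores = {"CT-1001": 0.93, "CT-1002": 0.04, "CT-1003": 0.61}
    return demo_scores.get(study.study_id, 0.0)

def prioritized_worklist(studies: list) -> list:
    """Read suspected bleeds first; otherwise keep first-come, first-served order."""
    return sorted(studies, key=lambda s: (-hemorrhage_probability(s), s.arrival_order))

worklist = [HeadCT("CT-1001", 3), HeadCT("CT-1002", 1), HeadCT("CT-1003", 2)]
for study in prioritized_worklist(worklist):
    print(study.study_id, hemorrhage_probability(study))
```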

Mayo Clinic currently uses IBM Watson’s superpowers to match patients with suitable clinical trials. The clinic’s officials stated that only 5% of U.S. patients enroll in trials, which significantly hinders clinical research and innovation in cancer therapies. At the same time, manual patient-trial matching is a time-consuming process.

Watson runs this process in the background, comparing patients’ conditions with available trials and suggesting appropriate trials for providers and patients to consider including in a treatment plan. Since its implementation in 2016, Watson has delivered about an 80% increase in enrollment in Mayo’s breast cancer trials.
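At its core, the matching step compares patient attributes against trial eligibility criteria. The snippet below is a deliberately simplified illustration of that idea; the trial fields, patient attributes, and sample data are invented for the example and are not IBM’s or Mayo’s actual schema.

```python
from dataclasses import dataclass

@dataclass
class Trial:
    name: str
    condition: str
    min_age: int
    max_age: int
    excluded_medications: set

@dataclass
class Patient:
    age: int
    condition: str
    medications: set

def eligible_trials(patient: Patient, trials: list) -> list:
    """Return trials whose basic criteria the patient satisfies, for clinician review."""
    matches = []
    for trial in trials:
        if trial.condition != patient.condition:
            continue
        if not (trial.min_age <= patient.age <= trial.max_age):
            continue
        if patient.medications & trial.excluded_medications:
            continue
        matches.append(trial.name)
    return matches

trials = [
    Trial("BC-017", "breast cancer", 18, 75, {"warfarin"}),
    Trial("BC-022", "breast cancer", 40, 80, set()),
]
patient = Patient(age=52, condition="breast cancer", medications={"metformin"})
print(eligible_trials(patient, trials))  # ['BC-017', 'BC-022']
```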

Patient risk analysis

“…Healthcare is one of the most important fields AI is going to transform,” Google CEO Sundar Pichai noted during the Google I/O 2018 keynote. Last year, the event presented Google AI, a “collection of our teams and efforts to bring the benefits of AI to everyone.”

In 2018, Google is using its AI to tap into critical patient risks, such as mortality, readmission, and prolonged length of stay (LOS). Cooperating with UC San Francisco, The University of Chicago Medicine, and Stanford Medicine, it analyzed over 46 billion anonymized retrospective EHR data points collected from over 216,000 adult patients hospitalized for at least 24 hours at two US academic medical centers.

The deep learning model built by researchers reviewed each patient’s chart as a timeline, from its creation to the point of hospitalization. This data allowed clinicians to make various predictions on patient health outcomes, including prolonged length of stay, 30-day unplanned readmission, upcoming in-hospital mortality, and even a patient’s final discharge diagnosis. Remarkably, the model achieved an accuracy level that significantly outperformed traditional predictive models.

According to Pichai, “If you go and analyze over 100,000 data points per patient, more than any single doctor could analyze, we can actually quantitatively predict the chance of readmission 24 to 48 hours earlier than traditional methods. It gives doctors time to act.”
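Conceptually, the approach treats the chart as an ordered sequence of timestamped events and feeds that sequence to a deep learning model. The toy sketch below shows the data shape and a stand-in scoring function rather than the real network; the event types and weights are invented for illustration.

```python
from dataclasses import dataclass
from datetime import datetime
from math import exp

@dataclass
class EHREvent:
    timestamp: datetime
    kind: str      # e.g. "diagnosis", "lab", "medication", "note"
    value: str

def readmission_risk(timeline: list) -> float:
    """Stand-in for the deep learning model: map a patient's event timeline to a risk score."""
    # Invented weights; the real model learns from tens of billions of data points.
    weights = {
        "diagnosis:heart failure": 1.2,
        "lab:elevated creatinine": 0.8,
        "medication:diuretic": 0.3,
    }
    ordered = sorted(timeline, key=lambda e: e.timestamp)
    score = sum(weights.get(f"{e.kind}:{e.value}", 0.0) for e in ordered)
    return 1.0 / (1.0 + exp(-(score - 1.0)))  # squash to a 0-1 probability

timeline = [
    EHREvent(datetime(2018, 3, 1, 9), "diagnosis", "heart failure"),
    EHREvent(datetime(2018, 3, 1, 14), "lab", "elevated creatinine"),
    EHREvent(datetime(2018, 3, 2, 8), "medication", "diuretic"),
]
print(f"30-day readmission risk: {readmission_risk(timeline):.2f}")
```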

Of course, researchers don’t claim that their approach is ready for implementation in clinical settings, but they are looking forward to collaborating with providers to test this model further. Hopefully, we will see field trials and, who knows, even early adoption in 2019.

EHRs “on steroids”

HIMSS18 was all about artificial intelligence and machine learning. Surprisingly, all major EHR vendors – Allscripts, Cerner, athenahealth, Epic, and eClinicalWorks – promised to include AI in upcoming iterations of their platforms.

At the event, Epic announced a new partnership with Nuance to integrate Nuance’s AI-powered conversational virtual assistant into the Epic EHR workflow. In particular, the assistant will enable health specialists to access patient information and lab results, record patient vitals, check schedules, and manage patient appointments using voice.

Similarly, eClinicalWorks is putting AI to work on voice control but also prioritizes telemedicine, pop health, and clinical decision support. According to the company’s CEO Girish Navani, “We spent the last decade putting data in EHRs. The next decade is about intelligence and creating inferences that improve care outcomes. We can have the computer do things for the clinician to make them aware of actions they can take.” The new EHR’s launch is expected in late 2018 or early 2019.

Athenahealth has also added a virtual assistant to its EHR to improve mobile connectivity and welcomes NoteSwift’s AI-based Samantha technology, which enhances clinical workflows by introducing robust automation. Samantha can parse free-text natural language, process and structure the information, assign ICD-10, SNOMED, or CPT codes, and prepare e-prescriptions and orders.
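Samantha’s internals aren’t described here, so the snippet below only illustrates the general task: scanning free-text narrative and attaching standard codes to the concepts it finds. The tiny term-to-code dictionary is an invented sample, not a real terminology service.

```python
import re

# Invented sample mappings; a production system would query full ICD-10/SNOMED/CPT terminologies.
TERM_TO_CODES = {
    "type 2 diabetes": {"icd10": "E11.9", "snomed": "44054006"},
    "hypertension": {"icd10": "I10", "snomed": "38341003"},
    "office visit": {"cpt": "99213"},
}

def structure_note(free_text: str) -> list:
    """Scan dictated free text and return the coded concepts found in it."""
    found = []
    lowered = free_text.lower()
    for term, codes in TERM_TO_CODES.items():
        if re.search(r"\b" + re.escape(term) + r"\b", lowered):
            found.append({"term": term, **codes})
    return found

note = "Established patient office visit for type 2 diabetes and hypertension, stable on current meds."
for concept in structure_note(note):
    print(concept)
```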

Pre-existing challenges for healthcare AI adoption

Gartner predicted that by 2020, 50% of organizations will lack the AI and data literacy skills needed to gain business value. Certainly, a lot of healthcare organizations will fall into this 50%, and there are two reasons for that.

Regulations and security concerns are the main pre-existing challenges that delay practically any technology adoption in healthcare and entail an array of new challenges along with them.

First, an AI application or device has to be approved by the FDA. The catch is that the existing process focuses on the hardware or the way the algorithms work, but not on the data they should or would interact with.

Speaking of data, another challenge is security breaches. Safeguarding sensitive information is a must for healthcare because patient data is a constant target for identity theft and reimbursement fraud. In Accenture’s new report, nearly 25% of healthcare execs admitted experiencing “adversarial AI behaviors, like falsified location data or bot fraud.” While this doesn’t mean AI threatens patient data, such claims do increase the concerns related to its adoption.

Still, artificial intelligence is growing in healthcare and will continue to do so. Maybe not at rocket speed, but the most recent cases show consistent progress in closing major care delivery gaps. Healthcare AI’s future appears bright.

About Inga Shugalo
Inga Shugalo is a Healthcare Industry Analyst at Itransition. She focuses on Healthcare IT, highlighting the industry challenges and the technology solutions that tackle them. Inga’s articles explore the diagnostic potential of healthcare IoT, the opportunities of precision medicine, robotics and VR in healthcare, and more.

AI-Based Tech Could Speed Patient Documentation Process

Posted on August 27, 2018 | Written By Anne Zieger

A researcher with a Google AI team, Google Brain, has published a paper describing how AI could help physicians complete patient documentation more quickly. The author, software engineer Peter Liu, contends that AI technology can speed up patient documentation considerably by predicting its content.

On my initial reading of the paper, it wasn’t clear to me what advantage this has over pre-filling templates or even allowing physicians to cut-and-paste text from previous patient encounters. Still, judge for yourself as I outline what author Liu has to say, and by all means, check out the write-up.

In its introduction, the paper notes that physicians spend a great deal of time and energy entering patient notes into EHRs, a process which is not only taxing but also demoralizing for many physicians. To choose just one of countless data points underscoring this conclusion, Liu cites a 2016 study noting that physicians spend almost 2 hours on administrative work for every hour of patient contact.

However, it might be possible to reduce the number of hours doctors spend on this dreary task. Google Brain has been working on technologies which can speed up the process of documentation, including a new medical language modeling approach. Liu and his colleagues are also looking at how to represent an EHR’s mix of structured and unstructured text data.

The net of all of this? Google Brain has been able to create a set of systems which, by drawing on previous patient records, can predict most of the content a physician will use the next time they see that patient.

The heart of this effort is the MIMIC-III dataset, which contains the de-identified electronic health records of 39,597 patients from the ICU of a large tertiary care hospital. The dataset includes patient demographic data, medications, lab results, and notes written by providers. The system includes AI capabilities which are “trained” to predict the text physicians will use in their latest patient note.
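Liu’s models are neural language models trained on MIMIC-III notes; the fragment below only illustrates the underlying idea of conditioning on a patient’s prior notes to suggest likely next-note text, using a trivial bigram frequency model in place of the real network.

```python
from collections import Counter, defaultdict

def train_bigram_model(prior_notes: list) -> dict:
    """Count word-to-next-word frequencies across a patient's earlier notes."""
    model = defaultdict(Counter)
    for note in prior_notes:
        words = note.lower().split()
        for current_word, next_word in zip(words, words[1:]):
            model[current_word][next_word] += 1
    return model

def suggest_continuation(model: dict, prompt: str, length: int = 5) -> str:
    """Greedily extend the physician's prompt with the most frequent next words."""
    words = prompt.lower().split()
    for _ in range(length):
        candidates = model.get(words[-1])
        if not candidates:
            break
        words.append(candidates.most_common(1)[0][0])
    return " ".join(words)

prior_notes = [
    "patient remains intubated and sedated overnight",
    "patient remains intubated with stable vitals",
]
model = train_bigram_model(prior_notes)
print(suggest_continuation(model, "patient remains"))
```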

In addition to making predictions, the Google Brain AI seems to have been able to pick out some kinds of errors in existing notes, including incorrect patient ages and drug names, as well as providing autocorrect options for corrupted words.

By way of caveats, the paper warns that the research used only data generated within 24 hours of the current note content. Liu points out that while this may be a wide enough range of information for ICU notes, as things happen fast there, it would be better to draw on data representing larger windows of time for non-ICU patients. In addition, Liu concedes that it won’t always be possible to predict the content of notes even if the system has absorbed all existing documentation.

However, none of these problems are insurmountable, and Liu understandably describes these results as “encouraging,” but that’s also a way of conceding that this is only an experimental conclusion. In other words, these predictive capabilities are not a done deal by any means. That being said, it seems likely that his approach could be valuable.

I am left with at least one question, though. If the Google Brain technology can predict physician notes with great fidelity, how does that differ from having the physician cut and paste previous notes on their own? I may be missing something here, because I’m not a software engineer, but I’d still like to know how these predictions improve on existing workarounds.

Let Vendors Lead The Way? Are You Nuts?

Posted on August 13, 2018 | Written By Anne Zieger

Every now and then, a vendor pops up and explains how the next-gen EHR should work. It’s easy to ask yourself why anyone should listen, given that you’re the one dishing out the care. But bear with me. I’ve got a theory working here.

First of all, let’s start with a basic assumption, that EHRs aren’t going to stay in their current form much longer. We’re seeing them grow to encompass virtually every form of medical data and just about every transaction, and nobody’s sure where this crazy process is going to end.

Who’s going to be our guide to this world? Vendors. Yup, the people who want to sell you stuff. I will go out on a limb and suggest that at this point in the health data revolution, they’re in a better position to predict the future.

Sure, that probably sounds obnoxious. While vendors may employ reputable, well-intentioned physicians, the vast majority of those physicians don’t provide care themselves anymore. They’re rusty. And unless they’re in charge of the company they serve, their recommendations may be overruled by people who have never touched a patient.

On the flip side, though, vendor teams have the time and money to explore emerging technologies, not just the hip stuff but the ones that will almost certainly be part of medical practice in the future. The reality is that few practicing physicians have time to keep up with their progress. Heck, I spend all day researching these things, and I’m going nuts trying to figure out which tech has gone from a nifty idea to a practical one.

Given that vendors have the research in hand, it may actually make sense to let them drive the car for a while. Honestly, they’re doing a decent job of riding the waves.

In fact, it seems to me that the current generation of health data management systems is coming closer to where it should be. For example, far more of what I’d call “enhanced EHR” systems include care management tools, integrated support for virtual visits and modules that help practices pull together MIPS data. As always, they aren’t perfect – for example, few ambulatory EHRs are flexible enough to add new functions easily – but they’re getting better.

I guess what I’m saying is that even if you have no intention of investing in a given product, you might want to see where developers’ ideas are headed. Health data platforms are at an especially fluid stage right now, tossing blockchain, big data analytics, AI and genomic data together and creating new things. Let’s give developers a bit of slack and see what they can do to tame these beasts.

Physicians Lack IT Tools Needed For Value-Based Care

Posted on July 23, 2018 | Written By Anne Zieger

A new study sponsored by Quest Diagnostics has concluded that progress toward value-based care has slowed because physicians lack the IT tools they need.  In fact, the survey of health plan executives and physicians found that both groups see the progress of VBC as backsliding, with 67% reporting that the U.S. still has a fee-for-service system in place.

The study, which was conducted by Regina Corso Consulting between April 26 and May 7, 2018, included 451 respondents, 300 of whom were primary care physicians in private practice. The other 151 were health plan executives holding director-level positions.

More than half (57%) of health plan respondents said that a lack of tools is preventing doctors from moving ahead with VBC, compared with 45% last year. Also, 72% of physicians and health plan leaders said that doctors don’t have all the information they need about their patients to proceed with VBC.

A minority of doctors (39%) reported that EHRs provide all the data they need to care for their patients, though 86% said they could provide better care if their EHRs were interoperable with other technologies. Eighty-eight percent of physicians and health plan execs said that such data can provide insights that prescribing and claims data typically can’t.

Survey respondents generally agreed that making do with existing health IT tools is better than spending more. Fifty-three percent said that optimizing existing health information technology made sense, compared with 25% recommending investing in some new information technology and just 11% suggesting that large information technology infrastructure investments were a good idea.

Survey respondents said that a lack of interoperability between health IT systems was the biggest barrier to investing in new technology, followed by the perception that it would create more work while producing little or no benefit.

On the other hand, respondents named several technologies which could help speed VBC adoption. They include bioinformatics (73%), AI (68%), SMART app platform (65%), FHIR (64%), machine learning (64%), augmented reality (51%) and blockchain (47%). In its commentary, the report noted that SMART app platform use and FHIR might offer near-term benefits, as they allow companies to plug new technologies into existing platforms.

Bottom line, new ideas and technologies can make a difference. Eighty-nine percent of health plan execs and physicians said that healthcare organizations need to be more innovative and integrate more options and tools that support patient care.

Some Alexa Health “Skills” Don’t Comply With Amazon Medical Policies

Posted on July 18, 2018 | Written By Anne Zieger

It’s becoming predictable: A company offering an AI assistant for scheduling medical appointments thinks that consumers want to use Amazon’s Alexa to schedule appointments with their doctor. The company, Nimblr, is just one of an expanding number of developers that see Alexa integration as an opportunity for growth.

However, Nimblr and its peers have stepped into an environment where the standards for health applications are a bit slippery. That’s no fault of theirs, but it might affect the future of Amazon Alexa health applications, which can ultimately affect every developer that works with the Alexa interface.

Nimblr’s Holly AI has recently begun to let patients book and reschedule appointments using Alexa voice commands. According to its prepared statement, Nimblr expects to integrate with other voice command platforms as well, but Alexa is clearly an important first step.

The medical appointment service is integrated with a range of EHRs, including athenahealth, Care Cloud and DrChrono.  To use the service, doctors sign up and let Holly access their calendar and EHR.

Patients who choose to use the Amazon interface go through a scripted dialogue allowing them to set, change or cancel an appointment with their doctor. The patient uses Alexa to summon Holly, then tells Holly the doctor with whom they’d like to book an appointment. A few commands later, the patient has booked a visit. No need to sit at a computer or peer at a smartphone screen.
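A voice booking flow like this boils down to collecting a few slots (doctor, date, time) across turns and then confirming. The Python sketch below mimics that scripted dialogue in simplified form; it does not use the actual Alexa Skills Kit or Nimblr’s API, and the slot names are invented.

```python
REQUIRED_SLOTS = ("doctor", "date", "time")

def handle_turn(session: dict, utterance_slots: dict) -> str:
    """Merge slots captured from the latest utterance and decide what to say next."""
    session.update({k: v for k, v in utterance_slots.items() if v})
    for slot in REQUIRED_SLOTS:
        if slot not in session:
            return f"Which {slot} would you like for your appointment?"
    return (f"Booking you with {session['doctor']} on {session['date']} "
            f"at {session['time']}. Should I confirm?")

session = {}
print(handle_turn(session, {"doctor": "Dr. Alvarez"}))  # asks for the date
print(handle_turn(session, {"date": "July 24"}))        # asks for the time
print(handle_turn(session, {"time": "10:30 AM"}))       # confirms the booking
```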

For Amazon, this kind of agreement is the culmination of a long-term strategy. According to an article featured in Quartz, Alexa is now in roughly 20 million American homes and owns more than 70% of the US market for voice-driven assistants. Recently it has made some power moves in healthcare, including the acquisition of online pharmacy PillPack. It has also worked to build connections with healthcare partners, including third-party developers that can enrich the healthcare options available to Alexa users.

Most of the activity that drives Alexa comes from “skills,” which resemble smartphone apps and are made available on the Alexa skills store by independent developers. According to Quartz, the store hosted roughly 900 skills in its “health and fitness” category as of mid-April.

In theory, externally developed health skills must meet three criteria: they may not collect personal information from customers, may not imply through their names or descriptions that they are life-saving, and must include a disclaimer stating that they are not medical devices and that users should consult their providers if they believe they need medical attention.

However, according to Quartz, as of mid-April there were 65 skills in the store that didn’t provide the required disclaimer. If that’s accurate, it raises questions as to how stringently Amazon supervises the skills uploaded by its third-party developers.
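In principle, the disclaimer requirement is the easiest of the three rules to check automatically: scan each skill’s listing for the required language. The few lines below illustrate that kind of screen; the phrases and sample listings are made up, and there’s no suggestion this is how Amazon actually reviews skills.

```python
DISCLAIMER_PHRASES = ("not a medical device", "not intended to provide medical advice")

def missing_disclaimer(skill_description: str) -> bool:
    """True if none of the expected disclaimer phrases appear in the skill listing."""
    text = skill_description.lower()
    return not any(phrase in text for phrase in DISCLAIMER_PHRASES)

sample_listings = {
    "Daily Med Reminder": "Reminds you to take medications. This skill is not a medical device.",
    "Symptom Buddy": "Tells you what your symptoms might mean.",
}
flagged = [name for name, desc in sample_listings.items() if missing_disclaimer(desc)]
print(flagged)  # ['Symptom Buddy']
```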

Let me be clear that I’m not criticizing Nimblr in any way. As far as I know, the company is doing everything the right way. My only critiques would be that it’s not clear to me why its Alexa tool is much more useful than a plain old portal, and that, if the demo video is any indication, the interactions between Alexa and the consumer are a trifle awkward. On the whole, it seems like a useful tool and will likely get better over time.

However, with a growing number of healthcare developers featuring apps in Alexa’s skills store, it will be worth watching to see if Amazon enforces its own rules. If not, reputable developers like Nimblr might not want to go there.

AMA Hopes To Drive Healthcare AI

Posted on July 6, 2018 | Written By Anne Zieger

Last month, the AMA adopted a new policy setting standards for its approach to the use of AI. Now, the question is how much leverage it will actually have on the use of AI in the practice of medicine.

In its policy statement, the trade group said it would work to set standards on how AI can improve patient outcomes and physicians’ professional satisfaction. It also hopes to see that physicians get a say in the development, design, validation and implementation of healthcare AI tools.

More specifically, the AMA said it would promote the development of well-designed, clinically-validated standards for healthcare AI, including that they:

  • Are designed and evaluated using best practices in user-centered design
  • Address bias and avoid introducing or exacerbating healthcare disparities when testing or deploying new AI tools
  • Safeguard patients’ and other individuals’ privacy and preserve the security and integrity of personal information

That being said, I find myself wondering whether the AMA will have the chance to play a significant role in the evolution of AI tools. It certainly has a fair amount of competition.

It’s certainly worth noting that the organization is knee-deep in the development of digital health solutions. Its ventures include the MATTER incubator, which brings physicians and entrepreneurs together to solve healthcare problems; biotech incubator Sling Health, which is run by medical students; Health2047, which helps healthcare organizations and entrepreneurs work together; and Xcertia, an AMA-backed non-profit which has developed a mobile health app framework.

On the other hand, the group certainly has a lot of competition for doctors’ attention. Over the last year or two, the use of AI in healthcare has gone from a nifty idea to a practical one, and many health systems are deploying platforms that integrate AI features. These platforms include tools helping doctors collaborate with care teams, avoid errors and identify oncoming crises within the patient population.

If you’re wondering why I’m bringing all this up, here’s why. Ordinarily, I wouldn’t bother to discuss an AMA policy statement — some of them are less interesting than watching grass grow — but in this case, it’s worth thinking about for a bit.

When you look at the big picture, it matters who drives the train when it comes to healthcare AI. If physicians take the lead, as the AMA would obviously prefer, we may be able to avoid the deployment of user-hostile platforms like many of the first-generation EHRs.

If hospitals end up dictating how physicians use AI technology, it might mean that we see another round of kludgy interfaces, lousy decision-support options and time-consuming documentation extras which will give physicians an unwanted feeling of deja-vu. Not to mention doctors who refuse to use it and try to upend efforts to use AI in healthcare.

Of course, some hospitals will have learned from their mistakes, but I’m guessing that many may not, and things could go downhill from there. Regardless, let’s hope that AI tools don’t become the next albatross hung around doctors’ necks.

This Futurist Says AI Will Never Replace Physicians

Posted on June 6, 2018 | Written By Anne Zieger

Most of us would agree that AI technology has amazing — almost frightening — potential to change the healthcare world. The thing is, no one is exactly sure what form those changes will take, and some fear that AI technologies will make their work obsolete. Doctors, in particular, worry that AI will undercut their decision-making process or even take their jobs.

Their fears are not entirely misplaced. Vendors in the healthcare AI world insist that their products are intended solely to support care, but of course, they need to say that. It’s not surprising that doctors fret as AI software starts to diagnose conditions, triage patients and perform radiology readings.

But according to medical futurist Bertalan Mesko, MD, Ph.D., physicians have nothing to worry about. “AI will transform the meaning of what it means to be a doctor; some tasks will disappear while others will be added to the work routine,” Mesko writes. “However, there will never be a situation where the embodiment of automation, either a robot or an algorithm, will take the place of a doctor.”

In the article, Mesko lists five reasons why he takes this position:

  1. Empathy is irreplaceable: “Even if the array of technologies will offer brilliant solutions, it would be difficult for them to mimic empathy,” he argues. “… We will need doctors holding our hands while telling us about life-changing diagnoses, their guide to therapy and their overall support.”
  2. Physicians think creatively: “Although data, measurements and quantitative analytics are a crucial part of a doctor’s work…setting up a diagnosis and treating a patient is not a linear process. It requires creativity and problem-solving skills that algorithms and robots will never have,” he says.
  3. Digital technologies are just tools: “It’s only doctors together with their patients who can choose [treatments], and only physicians can evaluate whether the smart algorithm came up with potentially useful suggestions,” Mesko writes.
  4. AI can’t do everything: “There are responsibilities and duties which technologies cannot perform,” he argues. “… There will always be tasks where humans will be faster, more reliable — or cheaper than technology.”
  5. AI tech isn’t competing with humans: “Technology will help bring medical professionals towards a more efficient, less error-prone and more seamless healthcare,” he says. “… The physician will have more time for the patient, the doctor can enjoy his work, and healthcare will move in an overall positive direction.”

I don’t have much to add to his analysis. I largely agree with what he has to say.

I do think he may be wrong about the world needing physicians to make all diagnoses – after all, a sophisticated AI tool could access millions of data points in making patient care recommendations. However, I don’t think the need for human contact will ever go away.

Recording Doctor-Patient Visits Shows Great Potential

Posted on June 1, 2018 | Written By Anne Zieger

Doctors, do you know how you would feel if a patient recorded their visit with you? Would you choose to record them if you could? You may soon find out.

A new story appearing in STAT suggests that both patients and physicians are increasingly recording visits, with some doctors sharing the audio recording and encouraging patients to check it out at home.

The idea behind this practice is to help patients recall their physician’s instructions and adhere to treatment plans. According to one source, patients forget between 40% and 80% of physician instructions immediately after leaving the doctor’s office. Sharing such recordings could increase patient recall substantially.

What’s more, STAT notes, emerging AI technologies are pushing this trend further. Using speech recognition and machine learning tools, physicians can automatically transcribe recordings, then upload the transcription to their EMR.

Then, health IT professionals can analyze the texts using natural language processing to gain more knowledge about specific diseases. Such analytics are likely to be even more helpful than processes focused on physician notes, as voice recordings offer more nuance and context.
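Strung together, those steps form a simple pipeline: transcribe the audio, attach the transcript to the record, then run text analytics over the result. The sketch below shows the shape of that pipeline with placeholder functions; no real speech recognition or EMR API is being called.

```python
def transcribe(audio_path: str) -> str:
    """Placeholder for a speech-recognition service call."""
    return "Doctor advises increasing metformin to 1000 mg twice daily and scheduling an A1c in three months."

def attach_to_emr(patient_id: str, transcript: str) -> None:
    """Placeholder for an EMR upload; here we just log it."""
    print(f"[EMR] patient {patient_id}: transcript stored ({len(transcript)} chars)")

def extract_instructions(transcript: str) -> list:
    """Very rough NLP stand-in: keep sentences that look like instructions."""
    cues = ("advises", "should", "schedul", "increas")
    return [s.strip() for s in transcript.split(".") if any(c in s.lower() for c in cues)]

transcript = transcribe("visit_2018_06_01.wav")
attach_to_emr("patient-042", transcript)
print(extract_instructions(transcript))
```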

The growth of such recordings is being driven not only by patients and their doctors, but also by researchers interested in how to best leverage the content found in these recordings.

For example, Paul Barr, a researcher and professor at the Dartmouth Institute for Health Policy and Clinical Practice, is leading a project focused on creating an artificial intelligence-enabled system that allows for routine audio recording of conversations between doctors and patients.

The project, known as ORALS (Open Recording Automated Logging System), will develop and test an interoperable system to support routine recording of patient medical visits. The fundamental assumption behind this effort is that storing such content on smartphones is inappropriate, because if the patient loses their phone, their private healthcare information could be exposed.

To avoid this potential privacy breach, researchers are storing voice information on a secure central server allowing both patients and caregivers to control the information. The ORALS software offers both a recording and playback application designed for recording patient-physician visits.

Using the system, patients record visits on their phone, have the recordings uploaded to a secure server, and then have them automatically removed from the phone. In addition, ORALS offers a web application allowing patients to view, annotate and organize their recordings.
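That privacy design, keep nothing on the phone and everything on a server the patient controls, maps to a simple upload-then-delete flow. The code below sketches that flow using local directories as stand-ins; the file layout and “server” are invented and are not the actual ORALS implementation.

```python
import os
import shutil
import tempfile

def upload_to_secure_server(local_path: str, server_dir: str) -> str:
    """Stand-in for an encrypted upload; copies the recording to the 'server' directory."""
    os.makedirs(server_dir, exist_ok=True)
    destination = os.path.join(server_dir, os.path.basename(local_path))
    shutil.copy2(local_path, destination)
    return destination

def record_visit(phone_dir: str, server_dir: str) -> str:
    """Simulate record -> upload -> remove-from-phone, so no audio lingers on the device."""
    local_path = os.path.join(phone_dir, "visit_recording.wav")
    with open(local_path, "wb") as f:
        f.write(b"FAKE-AUDIO-BYTES")  # stand-in for the actual recording
    stored_at = upload_to_secure_server(local_path, server_dir)
    os.remove(local_path)             # nothing stays on the phone
    return stored_at

phone = tempfile.mkdtemp(prefix="phone_")
server = tempfile.mkdtemp(prefix="server_")
print(record_visit(phone, server))
print(os.listdir(phone))  # [] -- recording no longer on the device
```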

As I see it, this is a natural outgrowth of the trailblazing Open Notes project, which was perhaps the first initiative to encourage doctors to share their visit notes with patients. What makes this different is that we now have the technology to make better use of what we learn. I think this is exciting.