Some Alexa Health “Skills” Don’t Comply With Amazon Medical Policies

Posted on July 18, 2018 | Written By

Anne Zieger is a veteran healthcare consultant and analyst with 20 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com, and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. Contact her at @ziegerhealth on Twitter or visit her site at Zieger Healthcare.

It’s becoming predictable: a company offering an AI assistant for scheduling medical appointments thinks that consumers want to use Amazon’s Alexa to schedule appointments with their doctor. The company, Nimblr, is just one of an expanding number of developers that see Alexa integration as an opportunity for growth.

However, Nimblr and its peers have stepped into an environment where the standards for health applications are a bit slippery. That’s no fault of theirs, but it might affect the future of Amazon Alexa health applications, which can ultimately affect every developer that works with the Alexa interface.

Nimblr’s Holly AI has recently begun to let patients book and reschedule appointments using Alexa voice commands. According to its prepared statement, Nimblr expects to integrate with other voice command platforms as well, but Alexa is clearly an important first step.

The medical appointment service is integrated with a range of EHRs, including athenahealth, CareCloud and DrChrono. To use the service, doctors sign up and let Holly access their calendar and EHR.

Patients who choose to use the Amazon interface go through a scripted dialogue allowing them to set, change or cancel an appointment with their doctor. The patient uses Alexa to summon Holly, then tells Holly the doctor with whom they’d like to book an appointment. A few commands later, the patient has booked a visit. No need to sit at a computer or peer at a smartphone screen.

For Amazon, this kind of agreement is the culmination of a long-term strategy. According to an article featured in Quartz, Alexa is now in roughly 20 million American homes and owns more than 70% of the US market for voice-driven assistants. Recently Amazon has made some power moves in healthcare — including the acquisition of online pharmacy PillPack. It has also worked to build connections with healthcare partners, including third-party developers that can enrich the healthcare options available to Alexa users.

Most of the activity that drives Alexa comes from “skills,” which resemble smartphone apps and are made available on the Alexa skills store by independent developers. According to Quartz, the store hosted roughly 900 skills in its “health and fitness” category as of mid-April.

In theory, externally-developed health skills must meet three criteria: they may not collect personal information from customers; they may not imply, through their names or descriptions, that they are life-saving; and they must include a disclaimer stating that they are not medical devices — and that users should consult their providers if they believe they need medical attention.

However, according to Quartz, as of mid-April there were 65 skills in the store that didn’t provide the required disclaimer. If accurate, this raises questions as to how stringently Amazon supervises the skills uploaded by its third-party developers.

Let me be clear that I’m not criticizing Nimblr in any way. As far as I know, the company is doing everything the right way. My only critiques would be that it’s not clear to me why its Alexa tool is much more useful than a plain old portal, and that, if the demo video is any indication, the interactions between Alexa and the consumer are a trifle awkward. On the whole, it seems like a useful tool and will likely get better over time.

However, with a growing number of healthcare developers featuring apps in Alexa’s skills store, it will be worth watching to see whether Amazon enforces its own rules. If not, reputable developers like Nimblr might not want to go there.