Alexa and Medical Practices

Today I was asked to do a webinar for Solutionreach on the topic of “What You Need to Know for 2018: From Government Regulations to New Technology.” It was a fun webinar to put together and I believe you can still register and get access to the recorded version of the webinar.

In my presentation, I covered a lot of ground, including the consumerization of healthcare and how different our retail experiences are from our healthcare experiences. In 2018, I see a wave of technology that can bring a medical practice’s patient experience much closer to a patient’s retail experience. That’s exciting.

One of the areas I mentioned is the move to voice-powered devices like Amazon Echo, Google Home, Siri, etc. Someone asked how quickly these devices were going to hit healthcare. No doubt they have experienced how amazing these devices are in their home (I have two at home and love them), but the idea of connecting with your doctor through Alexa is a little mind bending. It goes against our normal rational thoughts. However, it will absolutely happen.

Just to be clear, Alexa is not currently HIPAA compliant. However, many things we want to do in healthcare don’t require PHI. Plus, if the patient agrees to do it, then HIPAA is not an issue. It’s not very hard to see how patients could ask “Alexa, when is my next appointment?” or even “Alexa, please schedule an appointment with my OB/GYN on Friday in the afternoon.” The technology is almost there to do this, especially if you tie it into one of the patient self-scheduling tools. Pretty amazing to consider, no?
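To make that idea concrete, here’s a minimal sketch of how a voice skill might route those two requests. This is purely illustrative plain Python, not Amazon’s actual Alexa Skills Kit API — the intent names, patient ID, and toy schedule are all assumptions standing in for a real practice’s self-scheduling tool.

```python
# Hypothetical sketch of a voice skill routing appointment requests.
# Intent names, IDs, and the toy schedule are illustrative only.
from datetime import date

# Toy "practice schedule" standing in for a real self-scheduling tool.
APPOINTMENTS = {"patient-123": date(2018, 3, 9)}

def handle_utterance(patient_id, intent):
    """Map a recognized intent to a spoken response."""
    if intent == "NextAppointmentIntent":
        appt = APPOINTMENTS.get(patient_id)
        if appt is None:
            return "You have no upcoming appointments."
        return f"Your next appointment is on {appt:%A, %B %d}."
    if intent == "ScheduleAppointmentIntent":
        # A real skill would hand off to the practice's scheduling API here.
        return "Okay, I'll look for Friday afternoon openings with your OB/GYN."
    return "Sorry, I didn't catch that."

print(handle_utterance("patient-123", "NextAppointmentIntent"))
```

Notice that neither response needs to expose PHI beyond what the patient themselves asked for, which is the point above: plenty of useful voice interactions sit comfortably outside HIPAA’s toughest constraints.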

I also highlighted how the latest Amazon Echo Show includes a video screen as well. It’s easy to see how one could say, “Alexa, please connect me with my doctor.” Then, Alexa could connect you with a doctor for a telemedicine visit all through the Echo Show. Ideally, this would be your primary care doctor, but most patients will be ok with a doctor of any sort in order to make the experience easy and convenient for them.

Of course, we see a lot of other healthcare applications of Alexa. It can help with loneliness. It can help with Alzheimer’s patients who are asking the same question over and over again and driving their caregiver crazy. It could remind you of medications and track how well you’re doing at taking them, or handle other care plan tracking. And we’re just getting started.

It’s an exciting time to be in healthcare and it won’t be long until voice activated devices like Alexa are connecting us to our healthcare and improving our health.

What do you think of Alexa and other related solutions? Where do you see it having success in healthcare? How long will it take for us to get there?

Note: Solutionreach is a Healthcare Scene sponsor.

About the author

John Lynn

John Lynn is the Founder of HealthcareScene.com, a network of leading Healthcare IT resources. The flagship blog, Healthcare IT Today, contains over 13,000 articles with over half of the articles written by John. These EMR and Healthcare IT related articles have been viewed over 20 million times.

John manages Healthcare IT Central, the leading career Health IT job board. He also organizes the first of its kind conference and community focused on healthcare marketing, Healthcare and IT Marketing Conference, and a healthcare IT conference, EXPO.health, focused on practical healthcare IT innovation. John is an advisor to multiple healthcare IT companies. John is highly involved in social media, and in addition to his blogs can be found on Twitter: @techguy.

8 Comments

  • My Mom had a devastating stroke and lost control of the dominant left side of her body. She is unable to type, so she lost the use of her computer as well. We purchased her an Echo Dot for the holidays and it was an instant hit. She could ask it questions, the time and date, and play music. Unfortunately, the nursing facility where she now lives says having it in her room is a potential HIPAA liability and had it removed. Getting this technology approved would be a great patient quality-of-life boost as well.

  • Hi Mike,
    Thanks for sharing. I have to admit that I think the nursing facility is dead wrong about it being a potential HIPAA liability for your mother to use it. It might be a HIPAA liability for them to use it, but having it in her room is her choice. As a patient, she can do whatever she wants with her health information and it’s not a violation of HIPAA. I think they’re inappropriately using HIPAA to remove the device, but I’ll ask my lawyer friends to chime in here. I don’t think you should have to wait at all for her to be able to use it.

  • Based on the facts as described, there are likely a few issues going on here, though HIPAA should not necessarily be one of them. From the strict HIPAA perspective, the echo dot in this case is a device being used by the patient and is arguably no different than a cellphone or any other electronic device that could record information brought by a patient to an encounter with a provider. As a tool of the patient, the device is recording and accessing information for the patient, which is not a HIPAA concern of the provider. It is the recording aspect that could present an issue. If the echo dot passively records information, there could be objection on that front as facilities can (and should) have policies on recording in the facility. Any objection by the facility on that front would be more of a malpractice or other liability issue as opposed to a HIPAA one though. Turning back to HIPAA, it could be an issue if the echo dot could record information it overhears concerning another patient because a facility does have an obligation to protect the privacy of all patients there. While this is not necessarily a complete answer, it does show there are always nuances and different considerations in every situation.

  • I agree with Matt here. But I’d add a different way to look at it. Can the Echo be seen as an assistive device? Often computers are seen as assistive devices for those with differing communication needs. Because she cannot use a computer, one could see this as an assistive device that helps her communicate. But then as described above, it’s really mostly used for personal enjoyment. The case would need to be made that she has to have this for medical reasons.

    Not to complicate this by adding in disability law too much, but I think if we are going to accept that these devices can help in getting health care, that we should be able to see them as tools to help those with disabilities and think outside the box on why a patient may need such a tool.

    Overall, I would really recommend sitting down with the facility’s privacy and security officer and having a chat on this policy. Often they haven’t thought it through (especially with newer technology) and have just told staff it’s not allowed. I find that just being told “it’s a HIPAA issue” or a similar brush-off means that you need to talk to someone higher up and have a more in-depth conversation. If you can go in and talk to them about why it’s needed, or ask if they can provide other ideas on how to accommodate her given she cannot use a computer, that would be more helpful.

  • []
    Hi folks,

    I’m with a company called Orbita, and we work specifically with Voice experiences (everything from apps to smart speakers) within healthcare. The question of HIPAA compliance as it relates to smart speakers is incredibly common.

    The reality is that voice can be a more accessible medium for folks in senior care compared to smartphones or computers, and there are plenty of actions that smart speakers can facilitate that aren’t PHI related. Assisting a senior in booking a ride to the hospital for an appointment, sharing news in their community, reminding them it’s someone’s birthday, medication reminders – the expanse of Skills being developed resembles the diversity we saw when smartphone apps began to be released.

    Responding specifically to the question of how often Alexa is recording, it may be helpful to understand how the speakers are built. There are two different “modes” that use two separate sets of hardware. The first is always “on” and is ONLY listening for the distinct syllables of the wake word (i.e. Alexa, Echo). This chip contains almost no memory and does not transmit outside of the console. Its only purpose is to listen for the wake word, and then wake up the second mode.

    That second mode is what does the active listening, engaging and responding to the user. It is only on once it is notified the wake word was spoken, and it ignores background noise and focuses specifically on a single human voice. It’s impossible from a hardware standpoint for the device to always be actively listening (let alone recording and transmitting) for information outside of a few basic words.

    Happy to answer any questions, here or directly!
    Evan – evan.sutherland@orbita.ai

  • Thanks everyone. Lots to chew on here. My point is similar to Erin’s. If they blame it on HIPAA, they’re often just using it as an excuse to not do something. Most haven’t gone to the real effort of understanding the nuances and balancing the legal requirements with the desires of the patient.

  • Analysis should be similar to use of a camera (including a cellphone camera) in a healthcare facility. Do you just tell people to be careful about what’s in the background, or do you ban them? If you ban the Dot, you have to ban lots of other things. IMHO largely fear of the unknown.

  • Definitely not in any way a HIPAA concern. Totally bogus. It’s fear of an unknown device. FUD.
    Like the assistive device angle above. Staff shouldn’t use the device for any PHI, along with her cellphone, etc.
