
OpenUMA: New Privacy Tools for Health Care Data

Posted on August 10, 2015 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

The health care field, becoming more computer-savvy, is starting to take advantage of conveniences and flexibilities that were developed over the past decade for the Web and mobile platforms. A couple of weeks ago, a new open source project was announced to increase options for offering data over the Internet with proper controls–options with particular relevance for patient control over health data.

The User-Managed Access (UMA) standard supports privacy through a combination of encryption and network protocols with a thirty-year history. UMA reached a stable 1.0 release in April of this year. A number of implementations are being developed, some of them open source.

Before I try to navigate the complexities of privacy protocols and standards, let’s look at a few use cases (currently still hypothetical) for UMA:

  • A parent wants to share a child’s records from the doctor’s office just long enough for the school nurse to verify that the child has received the necessary vaccinations.

  • A traveler taking a temporary job in a foreign city wants to grant a local clinic access to the health records stored by her primary care physician for the six months during which the job lasts.

The open source implementation I’ll highlight in this article is OpenUMA from a company named ForgeRock. ForgeRock specializes in identity management online and creates a number of open source projects that can be found on their web page. They are also a leading participant in the non-profit Kantara Initiative, where they helped develop UMA as part of the UMA Developer Resources Work Group.

The advantage of open source libraries and tools for UMA is that the standard involves many different pieces of software run by different parts of the system: anyone with data to share, and anyone who wants access to it. The technology is not aimed at any one field, but health IT experts are among its greatest enthusiasts.

The fundamental technology behind UMA is OAuth, a well-tested means of authorizing people on web sites. When you want to leave a comment on a news article and see a button that says, “Log in using Facebook” or some other popular site, OAuth is in use.
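For readers who haven’t watched that button work from the developer’s side, here is a minimal sketch of the OAuth 2.0 authorization-code flow that sits behind it. The endpoint URLs, client ID, and redirect URI are placeholders for illustration, not any real provider’s values.

```python
# A minimal sketch of the OAuth 2.0 authorization-code flow behind a
# "Log in using Facebook" button. All URLs and credentials below are
# hypothetical placeholders.
from urllib.parse import urlencode
import requests

AUTH_ENDPOINT = "https://provider.example/oauth/authorize"   # hypothetical
TOKEN_ENDPOINT = "https://provider.example/oauth/token"      # hypothetical
CLIENT_ID = "my-client-id"
CLIENT_SECRET = "my-client-secret"
REDIRECT_URI = "https://news-site.example/callback"

def authorization_url(state: str) -> str:
    """Where the news site sends the user so the provider can ask for consent."""
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": "profile",
        "state": state,
    }
    return f"{AUTH_ENDPOINT}?{urlencode(params)}"

def exchange_code_for_token(code: str) -> dict:
    """After the user consents, the provider redirects back with a code
    that the site trades for an access token."""
    resp = requests.post(TOKEN_ENDPOINT, data={
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": REDIRECT_URI,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    })
    resp.raise_for_status()
    return resp.json()  # contains access_token, expires_in, and so on
```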

OAuth is an enabling technology, by which I mean that it opens up huge possibilities for more complex and feature-rich tools to be built on top. It provides hooks for such tools through its notion of profiles–new standards that anyone can create to work with it. UMA is one such profile.

What UMA contributes over and above OAuth was described to me by Eve Maler, a leading member of the UMA working group who wrote their work up in the specification I cited earlier, and who currently works for ForgeRock. OAuth lets you manage different services for yourself. When you run an app that posts to Twitter on your behalf, or log in to a new site through your Facebook account, OAuth lets your account on one service do something for your account on another service.

UMA, in contrast, lets you grant access to other people. It’s not your account at a doctor’s office that is accessing data, but the doctor himself.

UMA can take on some nitty-gritty real-life situations that are hard to handle with OAuth alone. OAuth provides a single yes/no decision: is a person authorized or not? It’s UMA that can handle the wide variety of conditions that affect whether you want information released. These vary from field to field, but the conditions of time and credentials mentioned earlier are important examples in health care. I covered one project using UMA in an earlier article.

With OAuth, you can grant access to an account and then revoke it later (with some technical dexterity). But UMA allows you to build a time limit into the original access. Of course, the recipient does not lose the data to which you granted access, but when the time expires he cannot return to get new data.

UMA also allows you to define resource sets to segment data. You could put vaccinations in a resource set that you share with others, withholding other kinds of data.
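To make the vaccination example concrete, here is a rough sketch of how a resource server might register a “vaccinations” resource set with an UMA authorization server. The endpoint path, the protection token, and the exact field names are placeholders that follow the general shape of UMA 1.0 resource set registration; they are not OpenUMA’s or any product’s actual interface.

```python
# Sketch: register a "vaccinations" resource set so it can be shared
# separately from the rest of the record. Endpoint and token are placeholders.
import requests

AS_RESOURCE_SET_ENDPOINT = "https://as.example/uma/resource_set"   # hypothetical
PAT = "protection-api-token-issued-to-the-resource-server"         # hypothetical

vaccination_resource_set = {
    "name": "Vaccination records",
    "scopes": ["view"],   # what a requester may be allowed to do with this set
}

resp = requests.post(
    AS_RESOURCE_SET_ENDPOINT,
    json=vaccination_resource_set,
    headers={"Authorization": f"Bearer {PAT}"},
)
resp.raise_for_status()
resource_set_id = resp.json()["_id"]  # the servers refer to this set by id later
```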

OpenUMA contains two crucial elements of a UMA implementation:

The authorization server

This server accepts a list of restrictions from the site holding the data and the credentials submitted by the person requesting access to the data. Its function is very generic: any UMA request can use any authorization server, and the server can run anywhere. Theoretically, you could run your own. But it would be more practical for a site that hosts data–Microsoft HealthVault, for instance, or some general cloud provider–to run an authorization server. In any case, the site publicizes a URL where it can be contacted by people with data or people requesting data.
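From the requesting side, the exchange with the authorization server looks roughly like the sketch below: present the permission ticket you were handed, plus whatever claims the server demands, and receive a requesting party token if policy allows. The endpoint, tokens, and response fields here are hypothetical stand-ins, not a particular implementation’s API.

```python
# Sketch: the school nurse's client asks the authorization server for a
# requesting party token (RPT) by presenting a permission ticket.
# Endpoint and tokens are hypothetical placeholders.
import requests

AS_AUTHORIZATION_ENDPOINT = "https://as.example/uma/authorize"   # hypothetical
AAT = "token-identifying-the-requesting-party"                   # hypothetical

def request_rpt(permission_ticket: str) -> str:
    resp = requests.post(
        AS_AUTHORIZATION_ENDPOINT,
        json={"ticket": permission_ticket},
        headers={"Authorization": f"Bearer {AAT}"},
    )
    if resp.status_code == 403:
        # The server may demand more claims (for example, proof that the
        # requester really is a school nurse) before granting access.
        raise PermissionError(resp.json())
    resp.raise_for_status()
    return resp.json()["rpt"]
```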

The resource server

This submits requests to the authorization server on behalf of the applications and servers that hold people’s data. The resource server handles the complex interactions with the authorization server so that application developers can focus on their core business.
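On the data-holding side, here is a sketch of the step a resource server takes when a request arrives without a valid token: it registers the attempted access with the authorization server and hands the resulting permission ticket back to the requester. Again, the endpoint and tokens are placeholders rather than the OpenUMA API itself.

```python
# Sketch: when someone tries to read a resource without a token, the
# resource server registers the attempt and gets back a permission ticket.
# Endpoint and token are hypothetical placeholders.
import requests

AS_PERMISSION_ENDPOINT = "https://as.example/uma/permission_request"  # hypothetical
PAT = "protection-api-token-issued-to-the-resource-server"            # hypothetical

def ticket_for(resource_set_id: str, scopes: list[str]) -> str:
    resp = requests.post(
        AS_PERMISSION_ENDPOINT,
        json={"resource_set_id": resource_set_id, "scopes": scopes},
        headers={"Authorization": f"Bearer {PAT}"},
    )
    resp.raise_for_status()
    return resp.json()["ticket"]

# The resource server would then answer the original request with an error
# status and this ticket, so the requester can take it to the
# authorization server and ask for access.
```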

Instead of the OpenUMA resource server, apps can link in libraries that provide the same functions. These libraries are being developed by the Kantara Initiative.

So before we can safely share and withhold data, what’s missing?

The UMA standard doesn’t offer any way to specify a condition, such as “Release my data only this week.” This gap is filled by policy languages, which standards groups will have to develop and code up in a compatible manner. A few exist already.
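As a purely hypothetical illustration of what such a policy might express, here is a “release my data only this week” condition written as plain data, along with the check an authorization server might run before issuing a token. No standard policy language is implied; the field names are my own.

```python
# Hypothetical illustration: a time-limited sharing policy and the check an
# authorization server might run against it before granting access.
from datetime import datetime, timezone

policy = {
    "resource_set": "Vaccination records",
    "permit_scopes": ["view"],
    "not_before": "2015-08-10T00:00:00+00:00",
    "not_after": "2015-08-17T00:00:00+00:00",  # access lapses after one week
}

def permitted(policy: dict, scope: str, now=None) -> bool:
    """Return True only if the requested scope is allowed and the current
    time falls inside the policy's window."""
    now = now or datetime.now(timezone.utc)
    start = datetime.fromisoformat(policy["not_before"])
    end = datetime.fromisoformat(policy["not_after"])
    return scope in policy["permit_scopes"] and start <= now <= end
```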

Maler points out that developers could also benefit from tools for editing and testing code, along with other supporting software that projects build up over time. The UMA resource working group is still at the beginning of their efforts, but we can look forward to a time when fine-grained patient control over access to data becomes as simple as using any of the other RESTful APIs that have filled the programmer’s toolbox.

EMR Data and Privacy

Posted on November 21, 2011 | Written By

Priya Ramachandran is a Maryland based freelance writer. In a former life, she wrote software code and managed Sarbanes Oxley related audits for IT departments. She now enjoys writing about healthcare, science and technology.

From MinnPost, a post on Sen. Al Franken’s second hearing as chairman of the Senate Subcommittee on Privacy, Technology and the Law. Franken’s take was that the federal agencies tasked with enforcing digital privacy are not doing so. While we might be aware on some subliminal level of the lack of enforcement, when presented in sheer numbers, the statistics are shocking.

According to the MinnPost article:

“Total, there have been 364 “major breaches” of 18 million patient’s private data since 2009, Franken said. Meanwhile, enforcement of data privacy laws have been lax — out of the 22,500 complaints the Health and Human Services Department has received since 2003, it’s levied only one fine and reached monetary settlements in six others. Of the 495 cases referred to the Department of Justice, only 16 have been prosecuted.”

Here on the HHS website, you can see all the breaches affecting 500 or more people (sort by Breach Date to see recent breaches). Even with all the rules around reporting, the lack of enforcement means hospitals and care organizations have little to lose from a breach. I’d be curious to know the process for fining and reaching settlements, and whether the penalty is proportional to the amount of data stolen or lost. More importantly, I’d like to know what organizations are doing differently once data thefts have been identified – the worst thing for an organization would be to pay the fine and continue with the same faulty processes that led to the breach in the first place.

Balancing Privacy and Security with Patient Care

Posted on December 23, 2009 | Written By

John Lynn is the Founder of the blog network which currently consists of 10 blogs containing over 8000 articles, with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is also a company co-founder and is highly involved in social media; in addition to his blogs, he can be found on Twitter (@techguy and @ehrandhit) and LinkedIn.

InformationWeek Healthcare has an article that discusses the challenges of EMR security and privacy. A lot of it is nothing new to those of us in the healthcare space, although it’s interesting to see how they summarize things like the goal of full EMR adoption by 2014 and the EMR stimulus money.

However, the article did include these interesting stats on the number of breaches that happen in healthcare and the focus IT managers put on privacy and data security in healthcare.

Healthcare providers and other health businesses aren’t stepping up to protect privacy, according to a recent study. Some 80% of healthcare organizations have experienced at least one incident of lost or stolen health information in the past year, according to the study, released this month from security management company LogLogic and the Ponemon Institute, which conducts privacy and information management research.

Also, some 70% of IT managers surveyed said senior management doesn’t view privacy and data security as a priority, and 53% say their organizations don’t take appropriate steps to protect patient privacy. Less than half judge their existing security measures as “effective or very effective.”

I was surprised that 80% of organizations have had an incident of lost or stolen health information. However, I honestly don’t see this ever changing. Stuff happens even with the very best efforts.

I also liked this quote from John Halamka about the challenge of balancing privacy and security with sharing patient information to provide better patient care.

“You want to protect the patient’s preferences for confidentiality,” Halamka said. But you also need to get information where it’s needed. “If you come to the emergency department in a coma, and you have a record that includes psychiatric treatment, HIV, drug abuse, and other information, would you share part of it or all of it? My preference would be all of it, with the hope that emergency workers would use it discreetly, to save my life.” But other people may feel differently, Halamka said, and healthcare policy needs to serve all those needs.

I’m a little surprised that Halamka has had psychiatric treatment, HIV, and drug abuse. He’s doing quite well considering that history. (That’s sarcasm, in case you didn’t note it.) His history aside, I’m totally with him on wanting that information available as well. However, he’s totally correct that many people wouldn’t want that stuff shared. Enabling the consumer to make that decision, though, is a hard nut to crack.