
Provider-Backed Health Data Interoperability Organization Launches

Posted on April 12, 2017 | Written By

Anne Zieger is a veteran healthcare consultant and analyst with 20 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com, and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. Contact her at @ziegerhealth on Twitter or visit her site at Zieger Healthcare.

In 1988, some members of the cable television industry got together to form CableLabs, a non-profit innovation center and R&D lab. Since then, it has been a driving force in bringing cable operators together, developing technologies and specifications for services as well as offering testing and certification facilities.

Among its accomplishments is the development of DOCSIS (Data-over-Cable Service Interface Specification), a standard used worldwide to provide Internet access via a cable modem. If your cable modem is DOCSIS compliant, it can be used on any modern cable network.

If you’re thinking this approach might work well in healthcare, you’re not the only one. In fact, a group of powerful healthcare providers has just launched a health data sharing-focused organization with a similar approach.

The Center for Medical Interoperability, which will be housed in a 16,000-square-foot location in Nashville, is a membership-based organization offering a testing and certification lab for devices and systems. The organization has been in the works since 2011, when the Gary and Mary West Health Institute began looking at standards-based approaches to medical device interoperability.

The Center brings together a group of top US healthcare systems – including HCA Healthcare, Vanderbilt University and Community Health Systems – to tackle interoperability issues collaboratively. Taken together, its board of directors represents more than 50 percent of the healthcare industry’s purchasing power, said Kerry McDermott, vice president of public policy and communications for the Center.

According to Health Data Management, the group will initially focus on acute care settings within a hospital, such as the ICU. In the ICU, patients are “surrounded by dozens of medical devices – each of which knows something valuable about the patient – but we don’t have a streamlined way to aggregate all that data and make it useful for clinicians,” said McDermott, who spoke with HDM.

Broadly speaking, the Center’s goal is to let providers share health information as seamlessly as ATMs pass banking data across their network. To achieve that goal, its leaders hope to serve as a force for collaboration and consensus between healthcare organizations.

The project’s initial $10M in funding, which came from the Gary and Mary West Foundation, will be used to develop, test and certify devices and software. The goal will be to develop vendor-neutral approaches that support health data sharing between and within health systems. Other goals include supporting real-time one-to-many communications, plug-and-play device and system integration and the use of standards, HDM reports.

It will also host a lab known as the Transformation Learning Center, which will help clinicians explore the impact of emerging technologies. Clinicians will develop use cases for new technologies there, as well as capturing clinical requirements for their projects. They’ll also participate in evaluating new technologies on their safety, usefulness, and ability to satisfy patients and care teams.

As part of its efforts, the Center is taking a close look at the FHIR API.  Still, while FHIR has great potential, it’s not mature yet, McDermott told the magazine.

Two Worth Reading

Posted on April 6, 2017 | Written By

When Carl Bergman isn't rooting for the Washington Nationals or searching for a Steeler bar, he’s Managing Partner of EHRSelector.com, a free service for matching users and EHRs. For the last dozen years, he’s concentrated on EHR consulting and writing. He spent the 80s and 90s as an itinerant project manager doing his small part for the dot com bubble. Prior to that, Bergman served a ten-year stretch in the District of Columbia government as a policy and fiscal analyst.

HIT is a relatively small world that generates no end of notices, promotions and commentaries. You can usually skim them, pick out what’s new or different and move on. Recently, I’ve run into two articles that deserve a slow, savored reading: Politico’s Arthur Allen’s history of VistA, the VA’s homegrown EHR, and Julia Adler-Milstein’s take on interoperability’s hard times.

VistA: An Old Soldier That May Just Fade Away – Maybe

The VA’s EHR is not only older than just about any other EHR, it’s older than just about any app you’ve used in the last ten years. It started when Jimmy Carter was in his first presidential year. It was a world of mainframes running TSO and 3270 terminals. Punch cards still abounded and dialup modems were rare. Even then, there were doctors and programmers who wanted to move vets’ hard copy files into a more usable, shareable form.

Arthur Allen has recounted their efforts, often clandestine, in tracking VistA’s history. It’s not only a history of one EHR and how it has fallen in and out of favor, but it’s also a history of how personal computing has grown, evolved and changed. Still a user favorite, it looks like its accumulated problems, often political as much as technical, may mean it will finally meet its end – or maybe not. In any event, Allen has written an effective, well-researched piece of technological history.

Adler-Milstein: Interoperability’s Not for the Faint of Heart

Adler-Milstein, a University of Michigan Associate Professor of Health Management and Policy, has two things going for her. She knows her stuff and she writes in clear, direct prose. It’s a powerful and sadly rare combination.

In this case, she probes the seemingly simple issue of HIE interoperability or the lack thereof. She first looks at the history of EHR adoption, noting that MU1 took a pass on I/O. This was a critical error, because it:

[A]llowed EHR systems to be designed and adopted in ways that did not take HIE into account, and there were no market forces to fill the void.

When stage two came along with HIE, it meant retrofitting thousands of systems. We’ve been playing catch-up, if that, ever since.

Her major point is simple. It’s in everyone’s interest to find ways of making I/O work, and that means abandoning fault-finding and figuring out what can work.

Health IT End of Year Loose Ends

Posted on December 13, 2016 | Written By

Carl Bergman

In that random scrap heap I refer to as my memory, I’ve compiled several items not worthy of a full post, but that keep nagging me for a mention. Here are the ones that’ve surfaced:

Patient Matching. Ideally, your doc should be able to pull your records from another system like pulling cash from an ATM. The hang-up is patient matching, which is record sharing’s last-mile problem. Patients don’t have a unique identifier, which means that, to make sure your records are really yours, your doctor’s practice has to use several cumbersome workarounds.
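To see why this is a last-mile problem, here’s a minimal sketch, with invented records, of the kind of fuzzy demographic matching those workarounds boil down to. Real master patient index software weighs many more fields (address, phone, SSN fragments) with probabilistic scoring; this toy version only shows how fragile name-plus-birthdate matching is.

```python
from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    """Crude string similarity in [0, 1]; a stand-in for real probabilistic scoring."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def likely_same_patient(rec_a: dict, rec_b: dict, threshold: float = 0.8) -> bool:
    """Without a unique patient ID, we can only guess from demographics.
    Typos, nicknames and name changes all defeat this kind of check."""
    if rec_a["dob"] != rec_b["dob"]:
        return False  # birth date mismatch: treat as different people
    return name_similarity(rec_a["name"], rec_b["name"]) >= threshold

# Hypothetical records: the same person, entered with a variant spelling.
smith_1 = {"name": "Katherine Smith", "dob": "1955-03-02"}
smith_2 = {"name": "Kathryn Smith", "dob": "1955-03-02"}
print(likely_same_patient(smith_1, smith_2))   # → True
```

Note how much is riding on an arbitrary threshold: nudge it up and the same patient becomes two people, nudge it down and two strangers merge. That is exactly the risk a national patient ID would remove.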

The 21st Century Cures Act calls for GAO to study ONC’s approach to patient matching and determine if there’s a need for a standard set of data elements, etc. With luck, GAO will cut to the chase and address the need for a national patient ID.

fEMR. In 2014, I noted Team fEMR, which developed an open source EHR for medical teams working on short-term – often crisis – projects. I’m pleased to report the project and its leaders, Sarah Diane Draugelis and Kevin Zurek, are going strong and recently got a grant from the Pollination Project. Bravo.

What’s What. I live in DC, read the Washington Post daily etc., but if I want to know what’s up with HIT in Congress, etc., my first source is Politico’s Morning EHealth. Recommended.

Practice Fusion. Five years ago, I wrote a post that was my note to PF about why I couldn’t be one of their consultants anymore. Since then the post has garnered almost 30,000 hits and just keeps going. As pleased as I am at its longevity, I think it’s only fair to say that it’s pretty long in the tooth, so read it with that in mind.

Ancestry Health. A year ago September, I wrote about Ancestry.com’s beta site Ancestry Health. It lets you document your parents’, grandparents’, and your own medical histories, which can be quite helpful. It also promised to use your family’s depersonalized data for medical research. As an example, I set up King Agamemnon’s family tree. The site is still in beta, which I assume means it’s not going anywhere. Too bad. It’s a thoughtful and useful idea. I also do enjoy getting their occasional “Dear Agamemnon” emails.

Jibo. I’d love to see an AI personal assistant for PCPs, etc., to bring up related information during exams, capture new data, make appointments and prepare scripts. One AI solution that looked promising was Jibo. The bad news is that it keeps missing its beta ship date. However, investors are closing in on $100 million. Stay tuned.

 

Health Data Sharing Varies Widely From State To State

Posted on November 4, 2016 | Written By

Anne Zieger

A new report from the CDC concludes that many physicians have interacted with shared health data, though only a small percentage of them had checked off all of the boxes by sending, receiving, integrating and searching for patient health data from other providers. The study also found that data sharing practices varied widely from state to state.

According to the CDC data, 38.2% of office-based physicians had sent data electronically to their peers in 2015. A nearly identical share (38.3%) had received data, 31.1% had integrated such data and 34% had searched for such data from other providers.

On the other hand, physicians’ data interactions seem to have been somewhat limited. The CDC indicated that just 8.7% of office-based doctors had performed all four of these data sharing activities, a level which suggests that few are completely comfortable with such exercises.

Another striking aspect to the data was that it laid out the extent to which physicians in different states had different levels of data sharing activity.

For example, it found that in 2015, physicians fell below the national average for sending patient data in Idaho (19.4%), Connecticut (22.7%) and New Jersey (24.3%). At the other end of the range, 56.3% of physicians in Arizona had sent information electronically to other providers, a figure well above the 38.2% national average.

Meanwhile, the percentage of physicians who had received information electronically from other providers fell below the national average of 38.3% in Louisiana (23.6%), Mississippi (23.6%), Missouri (24.2%) and Alabama (24.3%). States where physicians exceeded the average for receiving information included Massachusetts (52.9%), Minnesota (55%), Oregon (59.2%) and Wisconsin (66.5%).

Where things get particularly interesting is when we look at the states where physicians had integrated electronic patient information they received into their health data systems, a significantly more advanced step than sending or receiving data.

States that fell below the 31.1% national average for such integration include Alaska (18.4%), the District of Columbia (18.6%), Montana (18.6%), Alabama (18.8%) and Idaho (20.6%). States that performed above the national average included Indiana (44.2%) and Delaware (49.3%).

Also worth noting was the diverse levels to which physicians had searched for patient health information from other providers, a data point which might suggest how much confidence they had in finding data. (Physicians who felt interoperability wasn’t serving them might not bother to search, after all.)

The study found that while the average level of physicians who searched was 34%, several states fell below that average, including the District of Columbia (15.1%), Mississippi (19.7%), Pennsylvania (20.8%), Texas (21%), Missouri (21.6%) and Oklahoma (22.8%).

On the other hand, 10 states boasted a higher level of physicians who searched than the national average. These included Ohio (47.2%), Alaska (47.3%), Colorado (47.5%), Maryland (47.9%), Virginia (48.3%), North Carolina (48.8%), Delaware (53.9%), Wisconsin (54.1%), Washington (58%) and Oregon (61.2%).

If it’s true that integrating and searching for data indicate higher levels of faith in the ability to use shared data, this actually looks like an encouraging report. Clearly, we have a long way to go, but substantial numbers of physicians are engaging in shared data use. To me this looks like progress.

A Circular Chat On Healthcare Interoperability

Posted on September 6, 2016 | Written By

Anne Zieger

About a week ago, a press release on health data interoperability came into my inbox. I read it over and shook my head. Then I pinged a health tech buddy for some help. This guy has seen it all, and I felt pretty confident that he would know whether there was any real news there.

And this is how our chat went.

—-

“So you got another interoperability pitch from one of those groups. Is this the one that Cerner kicked off to spite Epic?” he asked me.

“No, this is the one that Epic and its buddies kicked off to spite Cerner,” I told him. “You know, health data exchange that can work for anyone that gets involved.”

“Do you mean a set of technical specs? Maybe that one that everyone seems to think is the next big hope for application-based data sharing? The one ONC seems to like,” he observed. “Or at least it did during the DeSalvo administration.”

“No, I mean the group working on a common technical approach to sharing health data securely,” I said. “You know, the one that lets doctors send data straight to another provider without digging into an EMR.”

“You mean that technology that supports underground currency trading? That one seems a little bit too raw to support health data trading,” he said.

“Maybe so. But I was talking about data-sharing standards adopted by an industry group trying to get everyone together under one roof,” I said. “It’s led by vendors but it claims to be serving the entire health IT world. Like a charity, though not very much.”

“Oh, I get it. You must be talking about the industry group that throws that humungous trade show each year,” he told me. “A friend wore through two pairs of wingtips on the trade show floor last year. And he hardly left his booth!”

“Actually, I was talking about a different industry group. You know, one that a few top vendors have created to promote their approach to interoperability,” I said. “Big footprint. Big hopes. Big claims about the future.”

“Oh yeah. You’re talking about that group Epic created to steal a move from Cerner,” he said.

“Um, sure. That must have been it,” I told him. “I’m sure that’s what I meant.”

—-

OK, I made most of this up. You’ve got me. But it is a pretty accurate representation of how most conversations go when I try to figure out who has a chance of actually making interoperability happen. (Of course, I added some snark for laughs, but not much, believe it or not.)

Does this exchange sound familiar to anyone else?

And if it does, is it any wonder we don’t have interoperability in healthcare?

Is Interoperability Worth Paying For?

Posted on August 18, 2016 | Written By

Carl Bergman

A member of our extended family is a nurse practitioner. Recently, we talked about her practice providing care for several homebound, older patients. She tracks their health with her employer’s proprietary EHR, which she quickly compared to a half-dozen others she’s used. If you want a good, quick EHR eval, ask a nurse.

What concerned her most, beyond usability, etc., was piecing together their medical records. She didn’t have an interoperability problem, she had several of them. Most of her patients had moved from their old homes to Florida, leaving a mixed trail of practitioners, hospitals, clinics, etc. She has to plow through paper and electronic files to put together a working record. She worries about being blindsided by important omissions, or by doctors who hold onto records for fear of losing patients.

Interop Problems: Not Just Your Doc and Hospital

She is not alone. Our remarkably decentralized healthcare system generates these glitches, omissions, ironies and hang-ups with amazing speed. However, when we talk about interoperability, we focus mainly on hospital-to-hospital or PCP-to-PCP relations. Doing so doesn’t fully cover the subject. For example, others who provide care include:

  • College Health Systems
  • Pharmacy and Lab Systems
  • Public Health Clinics
  • Travel and other Specialty Clinics
  • Urgent Care Clinics
  • Visiting Nurses
  • Walk in Clinics, etc., etc.

They may or may not pass their records back to a main provider, if there is one. When they do, it’s usually by fax, forcing the recipient to key in the data. None of this is a particularly new story. Indeed, the AHA did a study of interoperability that nails interoperability’s barriers:

Hospitals have tried to overcome interoperability barriers through the use of interfaces and HIEs but they are, at best, costly workarounds and, at worst, mechanisms that will never get the country to true interoperability. While standards are part of the solution, they are still not specified enough to make them truly work. Clearly, much work remains, including steps by the federal government to support advances in interoperability. Until that happens, patients across the country will be shortchanged from the benefits of truly connected care.

We’ve Tried Standards, We’ve Tried Matching, Now, Let’s Try Money

So, what do we do? Do we hope for some technical panacea that makes these problems seem as dated as dial-up modems? Perhaps. We could also put our hopes in the industry suddenly adopting an interop standard. Again, perhaps.

I think the answer lies not in technology or standards, but in paying for interop successes. For a long time, I’ve mulled over a conversation I had with Chandresh Shah at John’s first conference. I’d lamented to him that buying a Coke at a Las Vegas CVS brought up my DC buying record. Why couldn’t we have EHR systems like that? Chandresh instantly answered that CVS had an economic incentive to follow me, but my medical records didn’t. He was right. There’s no money to follow, as it were.

That leads to this question: why not redirect some MU funds and pay for interoperability? Would providers make interop – that is, data exchange, CCDs, etc. – work if they were paid? For example, what if we paid them $50 for each of their first 500 transfers and $25 for each of their first 500 receptions? This, of course, would need rules. I’m well aware of the human ability to game just about anything from soda machines to state lotteries.
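To make the thought experiment concrete, here’s the payout arithmetic under that hypothetical schedule. The function and its rates are purely the post’s proposal, not any actual program:

```python
def interop_incentive(transfers: int, receptions: int) -> int:
    """Hypothetical payout: $50 for each of the first 500 transfers,
    $25 for each of the first 500 receptions. Counts beyond the caps
    earn nothing, which blunts the obvious volume-gaming strategy."""
    return 50 * min(transfers, 500) + 25 * min(receptions, 500)

# A practice that maxes out both caps would collect $37,500.
print(interop_incentive(500, 500))   # → 37500
```

The caps are doing the anti-gaming work here: without them, a provider could simply churn out CCDs. Any real program would need auditing on top.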

If pay incentives were tried, they’d have to start slowly and in several different settings, but start they should. Progress, such as it is, is far too slow and isn’t getting us much of anywhere. My nurse practitioner’s patients can’t wait forever.

ONC’s Budget: A Closer Look

Posted on August 3, 2016 | Written By

Carl Bergman

When HHS released ONC’s proposed FY2017 budget last winter, almost all attention focused on one part: a $22 million increase for interoperability. While the increase is notable, I think ONC’s full $82 million budget deserves some attention.

ONC’s FY2017 Spending Plan

Table I summarizes ONC’s plan for Fiscal Year 2017, which runs from October 1, 2016 through September 30, 2017. The first thing to note is that ONC’s funding would change from general budget funds, known as Budget Authority or BA, to Public Health Service Evaluation funds. HHS’ Secretary may allocate up to 2.1 percent of HHS’ funds to these PHS funds. This change would not alter Congress’ funding role, but apparently signals HHS’ desire to put ONC fully in the public health sector.

Table I
ONC FY2017 Budget

[Table I image: ONC FY2017 budget justification]

What the ONC Budget Shows and What it Doesn’t

ONC’s budget follows the standard federal government budget presentation format. That is, it lists, by program, how many people and how much money are allocated. In this table, each fiscal year, beginning with FY2015, shows the staffing level and then spending.

Staffing is shown in FTEs, that is, full time equivalent positions. For example, if two persons work 20 hours each, then they are equivalent to one full time person or FTE.
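The FTE arithmetic is just total hours worked divided by a full-time schedule. A two-line sketch, assuming a 40-hour week:

```python
def fte(hours_worked: list, full_time_hours: float = 40.0) -> float:
    """Full-time-equivalent headcount: total hours divided by one full-time schedule."""
    return sum(hours_worked) / full_time_hours

# Two people at 20 hours each equal one full-time position, as in the text.
print(fte([20, 20]))   # → 1.0
```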

The spending definition for each fiscal year is a little different. Here’s how that works:

  • FY2015 – What was actually spent and how many people were actually hired.
  • FY2016 – The spending and hiring Congress set for ONC for the current year.
  • FY2017 – The spending and hiring in the President’s request to Congress for next year.

If you’re looking to see how well or how poorly ONC does its planning, you won’t see it here. As with most government budgets, federal and otherwise, you never see a comparison of plans vs. how they really did. For example, FY2015 was the last complete fiscal year. ONC’s budget doesn’t have a column showing its FY2015 budget and, next to it, what it actually did. If it did, you could see how well or how poorly it followed its plan.

You can’t see the amount budgeted for FY2015 in ONC’s budget, except for its total budget. However, if you look at the FY2016 ONC budget, you can see what was budgeted for each of its four programs. While the budget total and the corresponding actual are identical – $60,367,000 – the story at the division level is quite different.

Table II
ONC FY2015 Budget v Actual ($000s)

Division                                          Budget    Actuals    Diff
Policy Development and Coordination               12,474     13,112     638
Standards, Interoperability, and Certification    15,230     15,425     195
Adoption and Meaningful Use                       11,139     10,524    (615)
Agency-wide Support                               21,524     21,306    (218)
Total                                             60,367     60,367       0

 

Table II shows this by comparing the FY2015 enacted budget with ONC’s FY2015 actuals for its four major activities. While the total remained the same, Policy gained $638,000 and Standards gained $195,000, offset by Meaningful Use, which came in $615,000 under budget, and Agency-wide Support, which came in $218,000 under. These shifts could have been actual transfers, or they could have resulted from under- and over-spending by the divisions.
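The reallocation is easy to verify from Table II’s own figures (in thousands of dollars): the division-level diffs net to zero, confirming the total was merely reshuffled, not changed.

```python
# Budget vs. actual figures, in $000s, taken from Table II above.
table_ii = {
    "Policy Development and Coordination":            (12474, 13112),
    "Standards, Interoperability, and Certification": (15230, 15425),
    "Adoption and Meaningful Use":                    (11139, 10524),
    "Agency-wide Support":                            (21524, 21306),
}

for division, (budget, actual) in table_ii.items():
    print(f"{division}: {actual - budget:+d}")   # e.g. Policy: +638

total_budget = sum(budget for budget, _ in table_ii.values())
total_actual = sum(actual for _, actual in table_ii.values())
print(total_budget, total_actual)   # → 60367 60367: the total never moved
```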

Interestingly, Table III, which covers staffing, shows a different pattern. During FY2015, ONC dropped 25 FTEs: a dozen from Policy Development and the rest from Standards and Meaningful Use. That means, for example, that Policy Development had fewer people and more money during FY2015.

Table III
ONC FY2015 Budget v Actual Staffing (FTEs)

Division                                          Budget FTEs    Actual FTEs    Diff
Policy Development and Coordination                        49             37     (12)
Standards, Interoperability, and Certification             32             26      (6)
Adoption and Meaningful Use                                49             42      (7)
Agency-wide Support                                        55             55       0
Total                                                     185            160     (25)

 

To try to make sense of this, I looked at the current and past years’ budgets, but to no avail. As best I can tell, ONC made great use of contracts and other non-personnel services. For example, ONC spent $30 million on purchases and contracts, which is $8 million more than it spent on its payroll.

ONC’s budget, understandably, concentrates on its programs and plans. It puts little emphasis on measuring how well it hires and spends. It’s not alone: budgets, government and otherwise, are forecast and request documents. However, if we could see how plans turned out – without having to dig in last year’s weeds – we would know how well a program executed its plans, not just how it made them. That would be something worth knowing.

New ONC Scorecard Tool Grades C-CDA Documents

Posted on August 2, 2016 | Written By

Anne Zieger

The ONC has released a new scorecard tool which helps providers and developers find and resolve interoperability problems with C-CDA documents. According to Health Data Management, C-CDA docs that score well are coded with appropriate structure and semantics under HL7, and so have a better chance of being parseable by different systems.

The scorecard tool, which can be found here, actually offers two different types of scores for C-CDA documents, which must be uploaded to the site to be analyzed. One score diagnoses whether the document meets the requirements of the 2015 Edition Health IT Certification for Transitions of Care, granting a pass/fail grade. The other score, which is awarded as a letter grade ranging from A+ to D, is based on a set of enhanced interoperability rules developed by HL7.

The C-CDA scorecard takes advantage of the work done to develop SMART (Substitutable Medical Applications, Reusable Technologies). SMART leverages FHIR, which is intended to make it simpler for app developers to access data and for EMR vendors to develop an API for this purpose. The scorecard, which leverages open-source technology, focuses on C-CDA 2.1 documents.

The SMART C-CDA scorecard was designed to promote best practices in C-CDA implementation by helping creators figure out how well and how often they follow best practices. The idea is also to highlight improvements that can be made right away (a welcome approach in a world where improvement can be elusive and even hard to define).

As SMART backers note, existing C-CDA validation tools, like the Transport Testing Tool provided by NIST and Model-Driven Health Tools, offer a comprehensive analysis of syntactic conformance to C-CDA specs, but don’t promote higher-level best practices. The new scorecard is intended to close this gap.
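To give a feel for what even the most basic structural check involves, here’s a toy sketch. It assumes only that a C-CDA file is XML whose root element is ClinicalDocument in the HL7 v3 namespace (urn:hl7-org:v3); the ONC scorecard and the NIST validator go far beyond this, checking templateIds, vocabulary bindings and the best practices discussed above.

```python
import xml.etree.ElementTree as ET

HL7_V3_NS = "urn:hl7-org:v3"

def looks_like_cda(xml_text: str) -> bool:
    """First structural hurdle only: well-formed XML with a
    ClinicalDocument root in the HL7 v3 namespace."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError:
        return False
    return root.tag == f"{{{HL7_V3_NS}}}ClinicalDocument"

# Hypothetical minimal document for illustration; a real C-CDA carries
# header templateIds, a patient role, structured sections, etc.
sample = '<ClinicalDocument xmlns="urn:hl7-org:v3"><title>CCD</title></ClinicalDocument>'
print(looks_like_cda(sample))        # → True
print(looks_like_cda("<notacda/>"))  # → False
```

Passing a check like this says nothing about semantic quality, which is precisely the gap between syntactic validators and the scorecard’s letter grades.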

In case developers and providers have HIPAA concerns, the ONC makes a point of letting users know that the scorecard tool doesn’t retain submitted C-CDA files, and actually deletes them from the server after the files have been processed. That being said, ONC leaders still suggest that submitters not include any PHI or personally-identifiable information in the scorecards they have analyzed.

Checking up on C-CDA validity is becoming increasingly important, as this format is being used far more often than one might expect. For example, according to a story appearing last year in Modern Healthcare:

  • Epic customers shared 10.2 million C-CDA documents in March 2015, including 1.3 million outside the Epic ecosystem (non-Epic EMRs, HIEs and the health systems for the Defense and Veterans Affairs Departments)
  • Cerner customers sent 7.3 million C-CDA docs that month, more than half of which were consumed by non-Cerner systems.
  • Athenahealth customers sent about 117,000 C-CDA documents directly to other doctors during the first quarter of 2015.

Critics note that it’s still not clear how useful C-CDA information is to care, nor how often these documents are shared relative to the absolute number of patient visits. Still, even if the jury is still out on their benefits, it certainly makes sense to get C-CDA docs right if they’re going to be transmitted this often.

No, The Market Can’t Solve Health Data Interoperability Problems

Posted on July 6, 2016 | Written By

Anne Zieger

I seldom disagree with John Halamka, whose commentary on HIT generally strikes me as measured, sensible and well-grounded. But this time, Dr. Halamka, I’m afraid we’ll have to agree to disagree.

Dr. Halamka, chief information officer of Beth Israel Deaconess Medical Center and co-chair of the ONC’s Health IT Standards Committee, recently told Healthcare IT News that it’s time for ONC and other federal regulators to stop trying to regulate health data interoperability into existence.

“It’s time to return the agenda to the private sector, in which clinicians guide vendors to produce the products and services they want,” Halamka said. “We’re on the cusp of real breakthroughs in EHR usability and interoperability based on the new incentives for outcomes suggested by MACRA and MIPS. [T]he worst thing we could do at this time is to co-opt the private sector agenda with more prescriptive regulations about EHR functionality, usability and quality measurement.”

Government regs could backfire

Don’t get me wrong — I certainly appreciate the sentiment. Government regulation of a dynamic goal like interoperability could certainly backfire spectacularly, if for no other reason than that technology evolves far more quickly than policy. Regulations could easily set approaches to interoperability in stone that become outmoded far too quickly.

Not only that, I sympathize with Halamka’s desire to let independent clinical organizations come together to figure out what their priorities are for health data sharing. Even if regulators hire the best, most insightful clinicians on the planet, they still won’t have quite the same perspective as those still working on the front lines every day. Hospitals and medical professionals are in a much better position to identify what data should be shared, how it should be shared and most importantly what they can accomplish with this data.

Nonetheless, it’s worth asking what the “private sector agenda” Halamka cites actually is. Is he referring to the goals of health IT vendors? Hospitals? Medical practices? Health plans? The dozens of standards bodies and interoperability organizations that exist, ranging from HL7 and FHIR to the CommonWell Health Alliance? CHIME? HIMSS? HIEs? To me, it looks like the private sector agenda is to avoid having one. At best, we might achieve the United Nations version of unity as an industry, and like that body, be interesting but toothless.

Patients ready to snap

After many years of thought, I have come to believe that healthcare interoperability is far too important to leave to the undisciplined forces of the market. As things stand, patients like me are deeply affected by the inefficiencies and mistakes bred by the healthcare industry’s lack of interoperability — and we’re getting pretty tired of it. Readers, I guarantee that anyone who taps the healthcare system as frequently as I do feels the same way. We are on the verge of rebellion; every time someone tells me they can’t get my records from a sister facility, I’m ready to snap.

So do I believe that government regulation is a wonderful thing? Certainly not. But after watching the HIT industry struggle with health data sharing for about 20 years, I think it’s time for some central body to impose order on this chaos. And in a market as fractured as ours, no voluntary organization is going to have the clout to do so.

Sure, I’d love to think that providers could pressure vendors into coming up with solutions to this problem, but if they haven’t been able to do so yet, after spending a small nation’s GNP on EMRs, I doubt it’s going to happen. Rather than fighting it, let’s work together with the government and regulatory agencies to create a minimal data interoperability set everyone can live with. Any other way leads to madness.

Dallas Children’s Health and Sickle Cell Patients: Cobbling Together a Sound Solution

Posted on June 23, 2016 | Written By

When Carl Bergman isn’t rooting for the Washington Nationals or searching for a Steeler bar, he’s Managing Partner of EHRSelector.com, a free service for matching users and EHRs. For the last dozen years, he’s concentrated on EHR consulting and writing. He spent the ’80s and ’90s as an itinerant project manager, doing his small part for the dot-com bubble. Prior to that, Bergman served a ten-year stretch in the District of Columbia government as a policy and fiscal analyst.

Sickle cell anemia (SCA) is a genetic red blood cell condition that deforms the cells, impeding their passage through capillaries. Episodic and often extremely painful, it can damage organs and cause infections, strokes or joint problems. These episodes, or SCA crises, can be triggered by any number of environmental or personal factors.

In the US, African Americans are most commonly susceptible to SCA, but other groups can have it as well. SCA presents a variety of management problems in the best of circumstances, and as is often the case, management is made even more difficult when the patient is a child. That’s the challenge Children’s Health of Dallas, Texas, one of the nation’s oldest and largest pediatric treatment facilities, faced two years ago. Children’s Health, sixty-five percent of whose patients are on Medicaid, operates a large, intensive SCA management program as the anchor institution of the NIH-funded Southwestern Comprehensive Sickle Cell Center.

Children’s Health’s problem wasn’t with its inpatient care or its outpatient clinics. Rather, it was keeping a child’s parents and doctors up to date on developments. Along with the SCA clinical staff, Children’s Chief Information Officer, Pamela Arora, and Information Management and Exchange Director, Katherine Lusk, tackled the problem. They came up with a solution using entirely off-the-shelf technology.

Their solution? Provide each child’s caregiver with a free Verizon smartphone. Each night, they extracted the child’s information from EPIC and sent it to Microsoft’s free, vendor-neutral HealthVault PHR. This gave the child’s doctors and parents an easy way to stay current with the child’s treatment. Notably, Children’s was able to put the solution together quickly, with minimal staff and without extensive development.
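The nightly extract-and-push pattern Lusk describes is simple enough to sketch. To be clear, the post doesn’t describe Epic’s extract format or HealthVault’s payload schema, so the record fields, function names and push callback below are all illustrative assumptions, not the team’s actual implementation:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical shape of one row from the nightly EHR extract.
# The real Epic export fields are not described in the post.
@dataclass
class EhrRecord:
    patient_id: str
    visit_date: date
    medications: list
    notes: str

def to_phr_payload(record: EhrRecord) -> dict:
    """Map an extracted EHR record to a simple PHR-ready document
    (an assumed schema; HealthVault's real data types differed)."""
    return {
        "patient": record.patient_id,
        "date": record.visit_date.isoformat(),
        "medications": list(record.medications),
        "summary": record.notes,
    }

def nightly_sync(records, push):
    """Transform each extracted record and hand it to a push callback.
    In production, `push` would call the PHR vendor's upload API."""
    for rec in records:
        push(to_phr_payload(rec))

# Usage: collect the transformed payloads instead of uploading them.
sent = []
nightly_sync(
    [EhrRecord("p1", date(2016, 6, 1), ["hydroxyurea"], "stable")],
    sent.append,
)
```

The appeal of the pattern is exactly what the post highlights: a batch job, an existing export, and an existing free PHR — no custom interface engine required.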

That was two years ago. Since then, EPIC’s Lucy PHR has supplanted the project. However, Katherine Lusk, who described the project to me, is still proud of what they did. Even though the project has been replaced, it stands as an important example: not all HIE projects must be costly, time-consuming or resource-intensive to be successful.

Children’s SCA project points out the value of these system development factors:

  • Clear, understood goal
  • Precise understanding of users and their needs
  • Small, focused team
  • Searching for off-the-shelf solutions
  • Staying focused and preventing scope creep

Each of these proved critical to Children’s success. Not every project lends itself to this approach, but Children’s experience is worth keeping in mind as a useful and repeatable model of meeting an immediate need with a simple, direct approach.

Note: I first heard of Children’s project at John’s Atlanta conference. ONC’s Peter Ashkenaz mentioned it as a notable project that had not gained media attention. I owe him thanks for pointing me to Katherine Lusk.