
Taking a New Look at the Lamented Personal Health Record: Flow Health’s Debut

Posted on June 8, 2015 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

After the disappointing lack of adoption suffered by Google Health and Microsoft HealthVault, many observers declared personal health records (PHRs) a non-starter, while others predicted that any progress toward personal control over health data would require a radically new approach.

Several new stabs at a PHR are emerging, of which Flow Health shows several promising traits. The company tries to take advantage of, and boost the benefits of, advances in IT standards and payment models. This article is based on a conversation I had with their general counsel, David Harlow, who is widely recognized as a leading legal expert in health IT and health privacy and who consults with companies in those spaces through the Harlow Group.

Because records are collected by doctors, not patients, the chief hurdle any PHR has to overcome is persuading health care providers to relinquish sole control over the records they squirrel away in their local EHR silos. Harlow believes the shift to shared risk and coordinated care is creating the incentive for doctors to share. The Centers for Medicare & Medicaid Services is promising to greatly increase the role of pay-for-value, and a number of private insurers have promised to do so as well. In short, Flow Health can make headway if the tangible benefits of learning about a patient’s recent hospital discharge, or of treating chronic conditions while the patient remains at home, start to override the doctor’s perception that she can benefit by keeping the patient’s data away from competitors.

The next challenge is technically obtaining the records. This is facilitated first by the widespread move to electronic records (a legacy of Meaningful Use Stage 1) and by the partial standardization of those records in the C-CDA. Flow Health recognizes both the C-CDA and Blue Button, and uses the Direct protocol to obtain records. Harlow says that FHIR will be supported once the standard settles down.
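As a rough illustration of what FHIR support could look like once it arrives, here is a minimal Python sketch that pulls a display name out of a FHIR Patient resource. The JSON is invented sample data; only the field names follow the published FHIR Patient schema, and none of this reflects Flow Health’s actual implementation.

```python
import json

# A minimal FHIR Patient resource, as a FHIR server might return it
# (invented sample data; field names follow the FHIR Patient schema).
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
  "birthDate": "1974-12-25"
}
"""

def display_name(patient: dict) -> str:
    """Build a human-readable name from a FHIR Patient resource."""
    name = patient["name"][0]  # take the first recorded name
    return " ".join(name.get("given", []) + [name["family"]])

patient = json.loads(patient_json)
assert patient["resourceType"] == "Patient"
print(display_name(patient))  # Peter James Chalmers
```

In a live exchange the resource would come from a REST call such as `GET [base]/Patient/example` rather than a literal string; the point is that the payload is small, self-describing JSON rather than a monolithic document.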

But none of that is enough to offer what doctors and patients really want, which is a unified health record containing all the information from different providers. Therefore, like other companies trying to broaden access to patient data, Flow Health must deal with the problem that Dr. Eric Topol recently termed the Tower of EMR Babel. Its engineers study the format produced by each popular EHR (each one using the C-CDA in slightly incompatible ways) and convert the data into a harmonized format. This allows Flow Health to reconcile records when a diagnosis, a medication list, or some other aspect of the patient’s health is represented differently in different records.
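To make the harmonization problem concrete, here is a minimal sketch (my own illustration, not Flow Health’s code) that reconciles two C-CDA-style fragments encoding the same SNOMED CT problem in slightly different shapes, one with an inline displayName and one with an embedded originalText:

```python
import xml.etree.ElementTree as ET

NS = {"hl7": "urn:hl7-org:v3"}  # the C-CDA (HL7 v3) XML namespace

# Two fragments encoding the same problem, structured slightly
# differently, as different EHRs' C-CDA exports often are.
doc_a = """<observation xmlns="urn:hl7-org:v3">
  <value code="44054006" codeSystem="2.16.840.1.113883.6.96"
         displayName="Diabetes mellitus type 2"/>
</observation>"""

doc_b = """<observation xmlns="urn:hl7-org:v3">
  <value code="44054006" codeSystem="2.16.840.1.113883.6.96">
    <originalText>Type 2 diabetes</originalText>
  </value>
</observation>"""

def harmonize(xml_text: str) -> dict:
    """Reduce either variant to one canonical shape keyed by code system + code."""
    value = ET.fromstring(xml_text).find("hl7:value", NS)
    text = value.get("displayName")
    if text is None:  # fall back to the embedded originalText element
        node = value.find("hl7:originalText", NS)
        text = node.text if node is not None else ""
    return {"system": value.get("codeSystem"),
            "code": value.get("code"),
            "text": text}

a, b = harmonize(doc_a), harmonize(doc_b)
assert a["code"] == b["code"]  # same SNOMED CT code, so the records reconcile
```

Once both documents are reduced to the same canonical shape, reconciliation becomes a matter of comparing codes rather than comparing XML trees.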

What’s next for Flow Health? Harlow said they are preparing an API to let third parties add powerful functionality, such as care coordination and patient access from any app of their choice. Flow Health is already working closely with payers and providers to address workflow challenges, thus accelerating the aggregation of patient health record data for access and use by clinicians and patients.

A relative of mine could have used something like Flow Health recently when her eye doctor referred her to the prestigious Lahey Clinic in the Boston area. First of all, the test that led to the referral had to be repeated at the Lahey Clinic, because the eye doctor did not forward the test results. Nor did anyone provide a medication list, so the Lahey Clinic printed out a five-year-old medication list that happened to be hanging around from a visit long ago and asked her to manually update it. There was also confusion about what her insurer would cover, but that’s a different matter. All this took place in 2015, in the country’s leading region for medical care.

It seems inevitable that, as Flow Health hopes, patients will come to demand access to their medical records. A slew of interesting experiments will proliferate, like Flow Health and the rather different vision of Medyear, which treats health information like a social network feed. Patient-generated data, such as the output from fitness devices and home sensors, will put yet more pressure on health care providers to take the patient seriously as a source of information. And I’ll continue to follow developments.

HHS’ $30B Interoperability Mistake

Posted on May 8, 2015 | Written By

Anne Zieger is a veteran healthcare consultant and analyst with 20 years of industry experience. Zieger formerly served as an editor-in-chief, and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. Contact her at @ziegerhealth on Twitter or visit her site at Zieger Healthcare.

Sometimes things are so ill-advised, in hindsight, that you wonder what people were thinking. That includes HHS’ willingness to give out $30 billion to date in Meaningful Use incentives without demanding that vendors offer some kind of interoperability. A staggering amount of money has been paid out under HITECH to incentivize providers to make EMR progress, but we still have countless situations where one EMR can’t talk to another one right across town.

When you ponder the wasted opportunity, it’s truly painful. While the Meaningful Use program may have been a good idea, it failed to bring the interoperability hammer down on vendors, and now that ship has sailed. While HHS might have been able to force the issue back in the day, demanding that vendors step up or be ineligible for certification, I doubt vendors could backward-engineer the necessary communications formats into their current systems, even if there was a straightforward standard to implement — at least not at a price anyone’s willing to pay.

Now, don’t get me wrong, I realize that “interoperability” is an elastic concept, and that the feds couldn’t just demand that vendors bolt on some kind of module and be done with it. Without a doubt, making EMRs universally interoperable is a grand challenge, perhaps on the order of getting the first plane to fly.

But you can bet your last dollars that vendors, especially giants like Cerner and Epic, would have found their Wilbur and Orville Wright if that was what it took to fill their buckets with incentive money. It’s amazing how technical problems get solved when powerful executives decide that it will get done.

But now, as things stand, all the government can do is throw its hands up in the air and complain. At a Senate hearing held in March, speakers emphasized the crying need for interoperability between providers, but none of the experts seemed to have any methods in their hip pocket for fixing the problem. And being legislators, not IT execs, the Senators probably didn’t grasp half of the technical stuff.

As the speakers noted, what it comes down to is that vendors have every reason to create silos and keep customers locked into their product.  So unless Congress passes legislation making it illegal to create a walled garden — something that would be nearly impossible unless we had a consensus definition of interoperability — EMR vendors will continue to merrily make hay on closed systems.  It’s not a pretty picture.

Should Healthcare Institutes Perform “Rip-and-Replace” to Achieve Interoperability? Less Disruption, Please!

Posted on October 7, 2014 | Written By

The following is a guest blog post by Dr. Donald Voltz, MD, Aultman Hospital, Department of Anesthesiology, Medical Director of the Main Operating Room, Assistant Professor of Anesthesiology, Case Western Reserve University and Northeast Ohio Medical University.
A KLAS Research Report on the EMR buying trends of 277 hospitals with at least 200 beds has identified that almost half will be making a new EMR purchase by 2016.  Of the providers considering a change, 34 percent have already selected a vendor and another 44 percent are strongly leaning toward a specific vendor. Driving factors include concerns over outdated technology and health system consolidation.

But is the technology really outdated and health system consolidation really necessary, or is the real issue a lack of interoperability? And if you are a hospital looking for a new EMR, remember the history of technology before jumping to the conclusion that the greatest market share means the best of breed.

When we look at EMR adoption over the past several years, we need to be careful with the data we use. Implementation, and now rip-and-replace switching to other vendors, has been the only choice offices, clinics, hospitals and health systems have had for addressing interoperability problems.

Most currently deployed EMRs are designed as one-size-fits-all, leading to a situation where the out-of-the-box functionality fits none of the care providers’ requirements. Beyond that, EMRs have been designed around proprietary data structures, so sharing (or exchanging) patient medical data becomes the biggest roadblock to the patient care continuum. Some hospitals take the rip-and-replace approach in order to reach interoperability between inpatient and outpatient data with a single integrated, consolidated database.

A 50 percent turnover of EMRs represents an incredibly high number of hospitals and clinics that have either replaced or are looking to replace their current EHRs. Given that the majority of the initial implementations were supported by the HITECH Act, one would think the government would press vendors to address this high turnover. There seems to be a general misperception that if our current systems do not meet the demands of providers, administrators, and the financial arms of a healthcare delivery system, ripping out the system and implementing a new one will solve the issues.

What is the True Total Cost of Ownership of an EMR?

Healthcare management must look beyond the price paid to an EHR vendor and consider the true total cost, well beyond the conventional Total Cost of Ownership (TCO). TCO includes only the initial license cost, maintenance cost, and IT support cost, but in healthcare there is another cost: the disruption of the care providers’ workflow. That disruption is directly correlated with healthcare system revenue and patient care outcomes.

Let’s stop this disruption and look for another solution: integrating the disparate systems we already have, since many of them are built on databases that can address healthcare’s needs. The cost to providers in time spent learning a new system, the migration and potential loss of patient data collected in the current systems, and the capital expense of system software, hardware, trainers, IT personnel, and so on all add to a burden that is currently treated as a necessary expense.

Interoperability Saves Resources

This need not be the case when platforms exist to connect systems and improve access for providers. A consistent display of data allows for more efficient and effective management of patients, and when coupled with a robust collaboration platform, we close many of the open loops that exist in medicine today, even with EHRs.

EMR 2.0 connectors like Zoeticx and others have taken the medical-information-bus, or middleware-platform, approach to solve the challenges that current EHRs have not. This connection of systems, and a uniform display of the information physicians depend on for managing patients, is crucial if hospitals want their new EMRs to succeed. In addition, a middleware platform allows patients to access their medical information across EMRs within a single institution or across institutions, a major issue for Meaningful Use.

Fragmentation Prevents Some EMRs From Connecting With Their Own Software

Large EMR vendors’ lack of healthcare interoperability reflects how they compete against each other. Patient medical data and its proprietary structure are the tools of that competition, and the outcome is not necessarily beneficial for the hospital, medical professionals or patients. There are plenty of examples where healthcare facilities with EHRs from the same vendor fail to interoperate with each other.

Such symptoms have little to do with data structure, since these EMRs share the same one; they stem from the fragmentation put in place over years of customization. Fragmentation is a case where deployments of the same software product have gone through so much customization that they diverge from the product baseline.

Ripping out the whole infrastructure, inpatient and outpatient, as the method to reach interoperability would cause a great deal of disruption, and the outcome would still be questionable down the road. Even appreciating the backlash against calling the first implementations of EMRs a beta release, we have much data to draw on in looking for the next solution in HIT.

As with much of medicine, we are constantly looking for the best way to take care of our patients. Like it or not, EMRs have become medical devices, and we need to start evaluating them as we would any device used to manage health and disease. As we move forward, there will be an expansion in the openness of patient data and, in my prediction, a migration away from a single EHR solution for all of the requirements of healthcare toward a system of interconnected applications and databases.

Once again, we have learned that massively engineered systems do not evolve into complex adaptive systems that respond to changing environmental pressures. Simple, interrelated and interdependent applications are more fluid and readily adaptable to the constantly changing healthcare environment. Currently, the only buffers for the stresses and changes to the healthcare system are the patients and the providers who depend on these systems to manage healthcare.

About Dr. Donald Voltz
Dr. Donald Voltz, MD, of Aultman Hospital’s Department of Anesthesiology, is Medical Director of the Main Operating Room and Assistant Professor of Anesthesiology at Case Western Reserve University and Northeast Ohio Medical University. A board-certified anesthesiologist, researcher, medical educator, and entrepreneur with more than 15 years of experience in healthcare, Dr. Voltz has been involved with many facets of medicine. He has performed basic science and clinical research and has experience in the translation of ideas into viable medical systems and devices.

Could Clinicians Create Better HIE Tools?

Posted on August 13, 2014 | Written By

The following is a guest blog post by Andy Oram. His post reminds me of when I asked, “Is Full Healthcare Interoperability a Pipe Dream?”

A tense and flustered discussion took place on Monday, August 11 during a routine meeting of the HIT Standards Committee Implementation Workgroup, a subcommittee set up by the Office of the National Coordinator (ONC), which takes responsibility for U.S. government efforts to support new IT initiatives in the health care field. The subject of their uncomfortable phone call was the interoperability of electronic health records (EHRs), the leading issue of health IT. A number of “user experience” reports from the field revealed that the situation is not good.

We have to look at the depth of the problem before hoping to shed light on a solution.

An interoperability showcase literally occupies the center of the major health IT conference each year, HIMSS. When I have attended, the sessions were physically arranged around a large pavilion filled with booths and computer screens. But the material on display at the showcase is not the whiz-bang features and glossy displays found at most IT conventions (those appear on the exhibition floor at HIMSS), but simply demonstrations of document exchange among EHR vendors.

The hoopla over interoperability at HIMSS suggests its importance to the health care industry. The ability to share coordination of care documents is the focus of current government incentives (Meaningful Use), anchoring Stage 2 and destined to be even more important (if Meaningful Use lasts) in Stage 3.

And for good reason: every time we see a specialist, or our parent moves from a hospital to a rehab facility, or our doctor even moves to another practice (an event that recently threw my wife’s medical records into exasperating limbo), we need record exchange. If we ever expect to track epidemics better or run analytics that can lower health care costs, interoperability will matter even more.

But take a look at extensive testing done by a team for the Journal of the American Medical Informatics Association, recently summarized in a posting by health IT expert Brian Ahier. When they dug into the documents being exchanged, researchers found that many vendors inserted the wrong codes for diagnoses or drugs, placed results in the wrong fields (leaving them inaccessible to recipients), and failed to include relevant data. You don’t have to be an XML programmer or standards expert to get the gist from a list of sample errors included with the study.

And that list covers only the problems found in the 19 organizations who showed enough politeness and concern for the public interest to submit samples–what about the many who ignored the researchers’ request?

A slightly different list of complaints came up at the HIT Standards Committee Implementation Workgroup meeting, although along similar lines. The participants in the call were concerned with errors, but also pointed out the woeful inadequacy of the EHR implementations in representing the complexities and variety of patient care. Some called for changes I find of questionable ethics (such as the ability to exclude certain information from the data exchange while leaving it in the doctor’s records) and complained that the documents exchanged were not easy for patients to read, a goal that was not part of the original requirements.

However, it’s worth pointing out that document exchange would fall far short of true coordinated care, even if everything worked as the standards called for. Continuity of care documents, the most common format in current health information exchange, have only a superficial sliver of diagnoses, treatments, and other immediate concerns, but do not have space for patient histories. Data that patients can now collect, either through fitness devices or self-reporting, has no place to be recorded. This is why many health reformers call for adopting an entirely new standard, FHIR, a suggestion recognized by the ONC as valid but postponed indefinitely because it’s such a big change. The failure to adopt current formats seems to become the justification for keeping on the same path.
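FHIR, by contrast, does give patient-generated data a defined home. As a hedged sketch, a pedometer reading could be expressed as a FHIR Observation resource; the structure below roughly follows the published Observation schema, and LOINC 55423-8 is the pedometer step-count code, but all the values are invented:

```python
# A FHIR Observation carrying a device-reported step count (invented values;
# field names roughly follow the published FHIR Observation schema).
step_count = {
    "resourceType": "Observation",
    "status": "final",
    "code": {"coding": [{"system": "http://loinc.org", "code": "55423-8",
                         "display": "Number of steps in unspecified time Pedometer"}]},
    "subject": {"reference": "Patient/example"},
    "effectiveDateTime": "2014-08-12",
    "valueQuantity": {"value": 8421, "unit": "steps"},
}

# Unlike a continuity of care document, the resource is a small,
# self-describing unit that a server can store and query on its own.
assert step_count["valueQuantity"]["value"] == 8421
```

The design choice matters: because each observation is its own resource rather than a line buried in a monolithic document, patient-generated data can accumulate without forcing a redesign of the document format.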

Let’s take a step back. After all those standards, all those certifications, all those interoperability showcases, why does document exchange still fail?

The JAMIA article indicated that failure can be widely spread around. There are rarely villains in health care, only people pursuing business as usual when that is insufficient. Thus:

  • The Consolidated CDA standard itself could have been more precisely defined, indicating, for instance, what to do when values are missing from the record.

  • Certification tests could look deeper into documents, testing for instance whether codes are recorded correctly. Although I don’t know why the interoperability showcase results don’t translate into real-world success, I would find it quite believable that vendors focus on superficial goals (such as using the Direct protocols to exchange data) without determining whether that data is actually usable.

  • Meaningful Use requirements (already hundreds of pages long) could specify more details. One caller in the HIT Standards Committee session mentioned medication reconciliation as one such area.

The HIT Standards Committee agonized over whether to pursue broad goals, necessarily at a slow pace, or to seek a few achievable improvements in the process right away. In either case, what we have to look forward to is more meetings of committees, longer and more mind-numbing documents, heavier and heavier tests–infrastructure galore.

Meanwhile, the structure facilitating all this bureaucracy is crumbling. Many criticisms of Meaningful Use Stage 2 have been publicly aired–some during the HIT Standards Committee call–and Stage 3 now looks like a faint hope. Some journalists predict a doctor’s revolt. Instead of continuing on a path hated by everybody, including the people laying it out, maybe we need a new approach.

Software developers over the past couple decades have adopted a range of ways to involve the users of software in its design. Sometimes called agile or lean methodologies, these strategies roll out prototypes and even production systems for realistic testing. The strategies call for a whole retooling of the software development process, a change that would not come easily to slow-moving proprietary companies such as those dominating the EHR industry. But how would agile programming look in health care?

Instead of bringing a doctor in from time to time to explain what a clinical workflow looks like or to approve the screens put up by a product, clinicians would be actively designing the screens and the transitions between them as they work. They would discover what needs to be in front of a resident’s eyes as she enters the intensive care ward and what needs to be conveyed to the nurses’ station when an alarm goes off sixty feet away.

Clinicians can ensure that the information transferred is complete and holds value. They would not tolerate, as the products tested by the JAMIA team do, a document that reports a medication without including its dose, timing, and route of administration.
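A completeness check of that kind is easy to automate once clinicians specify what “complete” means. The sketch below is only an illustration, with field names of my own choosing rather than from any exchange standard:

```python
# Fields clinicians would consider essential for a medication entry
# (illustrative names, not drawn from any exchange standard).
REQUIRED = ("name", "dose", "timing", "route")

def missing_fields(med: dict) -> list:
    """Return the clinically essential fields a medication entry omits."""
    return [field for field in REQUIRED if not med.get(field)]

complete = {"name": "metoprolol", "dose": "50 mg",
            "timing": "twice daily", "route": "oral"}
partial = {"name": "metoprolol"}  # the kind of entry the JAMIA team flagged

assert missing_fields(complete) == []
assert missing_fields(partial) == ["dose", "timing", "route"]
```

The point is not the code but who writes the rule: a clinician-driven process would define `REQUIRED` from bedside needs, and the test suite would then reject a document the moment it drops a dose or route.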

Not being software experts (for the most part), doctors can’t be expected to anticipate all problems, such as changes of data versions. They still need to work closely with standards experts and programmers.

It also should be mentioned that agile methods include rigorous testing, sometimes to the extent that programmers write tests before writing the code they are testing. So the process is by no means lax about programming errors and patient safety.

Finally, modern software teams maintain databases–often open to the users and even the general public–of reported errors. The health care field needs this kind of transparency. Clinicians need to be warned of possible problems with a software module.

What we’re talking about here is a design that creates a product intimately congruent with each site’s needs and workflow. The software is not imported into a clinical environment–much less imposed on one–but grows organically from it, as early developers of the VistA software at the Veterans Administration claimed to have done. Problems with document exchange would be caught immediately during such a process, and the programmers would work out a common format cooperatively–because that’s what the clinicians want them to do.

Data Ownership Disputes, Not Tech Challenges, Slow Interoperability

Posted on August 13, 2013 | Written By

Anne Zieger is a veteran healthcare consultant and analyst with 20 years of industry experience.

Most of the time, when we discuss obstacles to interoperability, we focus on the varied technical issues and expense involved in data sharing between hospitals and doctors. And without a doubt, there are formidable technical challenges ahead — as well as financial ones  — on the road to full-on, fluid, national data exchange between providers.

But those aren’t the only obstacles to widespread interoperability, according to one health IT leader. There’s another issue lurking in the background which is also slowing the adoption of HIEs and other data-sharing plans, according to HIMSS head H. Stephen Lieber, who recently spoke to MedCity News. According to Lieber, the idea that providers (not patients) own clinical data is one of the biggest barriers standing in the way of broad interoperability.

“There is still some fine-tuning needed around how technology is adopted, but fundamentally it’s not a technology barrier. It’s a cultural barrier and it’s also a lack of a compelling case,” Lieber told MedCity News.

In Lieber’s experience, few institutions actually admit that they believe they own the data. But the truth is that they want to hold on to their data for competitive reasons, he told MedCity News.

What’s more, there’s actually a business case for not sharing data. After all, if a doctor or hospital has no data on a patient, they end up retesting and re-doing things — and get paid for it, Lieber notes.

Over time, however, hospitals and doctors will eventually be pushed hard in the direction of interoperability by changes in reimbursement, Lieber said. “Work is already being done in Washington to redesign reimbursement. Once Medicare heads down that path, commercial insurers will follow,” Lieber told the publication.

Lieber’s comments make a great deal of sense, and what’s more, focus on an aspect of interoperability which is seldom discussed. If hospitals and doctors still cling to a culture in which they own the clinical data, it’s most definitely going to make the task of building out HIEs more difficult. Let’s see if CMS actually comes up with a reimbursement structure that directly rewards data sharing; if it does, then I imagine you will see real change.

Specialty EHR Speaks that Specialty

Posted on June 19, 2013 | Written By

John Lynn is the founder of this blog network, which currently consists of 10 blogs containing over 8,000 articles, with John having written over 4,000 of the articles himself. These EMR and healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career health IT job board and blog. John is a co-founder as well, and he is highly involved in social media; in addition to his blogs he can be found on Twitter (@techguy and @ehrandhit) and LinkedIn.

I’ve long been a proponent of the role of specialty specific EHRs. In fact, at one point I suggested that a really great EHR company could be a roll up of the top specialty specific EHRs. I still think this would be an extraordinary company that could really compete with the top EHR vendors out there. For now, I haven’t seen anyone take that strategy.

There are some really compelling reasons to focus your EHR on a specific specialty. In fact, what you find is that even an EHR vendor that claims to support every medical specialty is usually the best fit for one or a few specific specialties. Just ask for their client list and you’ll have a good idea of which specialty likes their system the most.

I was recently talking with a specialty EHR vendor and they made a good case for why specialists love working with them. The obvious one he didn’t mention was that the EHR functions are tailored to that specialty. Everyone sees and understands this.

What most people don’t think about is the experience of talking to the support or sales people at that company. This is particularly important with the support people. It’s a very different experience calling an EHR vendor call center that supports every medical specialty versus one that supports only your specialty. The latter understands your specialty’s unique needs, terminology, and language. Plus, any reference clients they give you are going to be in your specialty, so you can compare apples to apples.

Certainly there can be weaknesses in a specialty-specific EHR. For example, if you’re in a large multi-specialty organization, you really can’t go with a specialty-specific EHR. It’s just not going to happen. With so many practices being acquired by hospitals, this does put the specialty-specific EHR at risk (depending on the specialty).

Another weakness appears when you want to connect your EHR to an outside organization. Most of them can handle lab and prescription interfaces without too much pain. However, connecting to a hospital or HIE can often be a challenge, or cost you a lot of money to make happen. Certainly the Meaningful Use interoperability requirements and HL7 standards help some. We’ll see if that’s enough or whether the future of healthcare interoperability will need something more. For example, will specialty-specific EHRs be able to participate in CommonWell if it achieves its goals?
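Those lab and prescription interfaces generally ride on HL7 v2, whose pipe-delimited segments are simple enough to parse even without an interface engine. Here is a minimal sketch; the message content is invented, though LOINC 2345-7 really is the serum glucose code:

```python
# One OBX (observation result) segment from a hypothetical HL7 v2 lab feed.
segment = "OBX|1|NM|2345-7^GLUCOSE^LN||95|mg/dL|70-99|N|||F"

def parse_obx(seg: str) -> dict:
    """Pull the coded test, value, units, and abnormal flag out of an OBX segment."""
    fields = seg.split("|")
    code, name, system = fields[3].split("^")  # OBX-3: observation identifier
    return {"code": code, "name": name, "system": system,
            "value": fields[5], "units": fields[6], "flag": fields[8]}

result = parse_obx(segment)
assert result["code"] == "2345-7" and result["units"] == "mg/dL"
```

The parsing is the easy part; the pain vendors charge for lies in mapping each trading partner’s local codes and segment quirks, which is why hospital and HIE connections cost so much more than this snippet suggests.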

There’s a case to be made on both sides of the specialty specific EHR debate. As with most EHR decisions, you have to choose which things matter most to your clinic.

Telemedicine Not Connecting With EMRs

Posted on June 5, 2013 | Written By

Anne Zieger is a veteran healthcare consultant and analyst with 20 years of industry experience.

As smartphones and tablets become a standard part of healthcare as we know it, telemedicine is gaining a new foothold in medicine too. In some cases, we’re talking about off-the-cuff transactions in which, say, a patient e-mails a photo to a doctor who can then diagnose and prescribe. But telemedicine is also taking root on an institutional level, with health systems rolling out projects across the country.

The problem is, however, that these telemedicine projects simply don’t integrate with EMRs, according to an article in SearchHealthIT.  The piece’s writer, Don Fluckinger, recently attended American Telemedicine Association’s 2013 Annual International Meeting & Trade Show, where complaints were rife that EMRs and telemedicine don’t interoperate.

I really liked this summary of the situation one executive shared with Fluckinger:

For now, the executive (who asked not to be named) said, telemedicine providers need to keep away from the “blast radius” of EHR vendor conflicts, lest their budgets get consumed by building interfaces to the various non-interoperable EHR systems.

Not only are health systems struggling to integrate telemedicine data with EMRs, telemedicine providers are in a bit of a difficult spot too, Fluckinger notes. As an example, he tells the tale of Seattle-based Carena Inc., a provider of primary care services to patients via phone and video, which provides after-hours support to physicians at Franciscan Health System in Tacoma, Wash.

Carena itself has an EMR that can share searchable PDF documents for use in patient EMRs, but Franciscan’s seven hospitals are bringing up an Epic implementation that can’t support this trick.  Top execs at Franciscan want to connect Carena’s data to Epic, but that won’t happen right away.  So Franciscan may end up setting up Carena’s after-hours service within Franciscan’s Epic installation to work around the interoperability problem.

This is just one example of the interoperability obstacles healthcare organizations are encountering when they set out to create a telemedicine service. As telemedicine explodes with the use of portable devices, I can only imagine that this will impose one more pressure on vendors to conquer compatibility problems. (But sadly, I doubt it will force any real changes in the near future.)

EMR Vendors Want Meaningful Use Stage 3 Delay

Posted on January 29, 2013 I Written By

A group of EMR vendors has joined the chorus of industry organizations asking that the Meaningful Use Stage 3 deadlines be pushed back to a later date.  The vendors also want to see the nature of the Stage 3 requirements changed to put a greater emphasis on interoperability, Information Week reports.

The group, the HIMSS EHR Association (EHRA), represents 40 vendors pulled together by HIMSS.  Members include both enterprise and physician-oriented vendors, including athenahealth, Cerner, Epic, eClinicalWorks, Emdeon, Meditech, McKesson, Siemens, GE Healthcare IT and Practice Fusion.

In comments submitted to HHS, the vendors argue that MU Stage 3 requirements should not kick in until three years after a provider reaches Stage 2, and should start no earlier than 2017. But their larger, and more significant, request is that they’d like to see Meaningful Use Stage 3’s focus changed:

“The EHRA strongly recommends that Stage 3 focus primarily on encouraging and assisting providers to take advantage of the substantial capabilities established in Stage 1 and especially Stage 2, rather than adding new meaningful use requirements and product certification criteria. In particular, we believe that any meaningful use and functionality changes should focus primarily on interoperability and building on accelerated momentum and more extensive use of Stage 2 capabilities and clinical quality measurement.”

So, we’ve finally got vendors like walled-garden player Epic finding a reason to fight for interoperability. It took being clubbed by the development requirements of Stage 3, which seem to have EHRA members worried, but it happened nonetheless.

While there’s obviously self-interest in vendors asking not to strain their resources on new development, they still have a point that deserves consideration.  Does it really make sense to push the development curve as far as Stage 3 requires before providers have had the chance to leverage what they’ve got?  Maybe not.

Now, the question is whether the vendors will put their code where their mouth is. Will the highly proprietary approach taken by Epic and some of its peers become passe?

EHR Benefit, Goodhart’s Law, and EHR Interoperability

Posted on January 27, 2013 I Written By

John Lynn is the Founder of the blog network, which currently consists of 10 blogs containing over 8,000 articles, with John having written over 4,000 of the articles himself. These EMR and healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career health IT job board and blog. John is a co-founder, is highly involved in social media, and in addition to his blogs can also be found on Twitter (@techguy and @ehrandhit) and LinkedIn.

Thanks to Sherry for pointing us to this example of the benefits of EHR. I hope that Sherry’s dad does well in surgery and recovers quickly.

I think that Charles might be on to something here. The interesting thing to me is that, looking back on the HITECH Act, it’s very likely that its most valuable part will simply be having shone a spotlight on EHR. It’s woken up a lot of healthcare organizations that were in a cozy slumber to EHR and to what was happening in the EHR world. I think that’s the most important thing we can do to move healthcare IT forward.

I don’t see this getting better any time soon. Check out the entire Twitter thread for this message to get the full context of the discussion. I’m still baffled as to why, when we can all see the value of exchanging data and the technical details have been solved (see the HIMSS interoperability showcase), we’re still not sharing data.

Will Big EMR Vendors Use Healthcare Standards As A Weapon?

Posted on October 9, 2012 I Written By

Standards are a tricky thing. Sometimes they bring a technical niche to its senses and promote innovation; at other times, they’re well-intentioned academic efforts that gain no ground.  From what I’ve seen over the years, the difference between which standards gain acceptance and which end up in the trash bin of history has more to do with politics than technical merit.

But what if the EMR industry did neither? From the mind of my crafty colleague John, here’s a scenario to consider.  What if, rather than going with an industry-wide standard for interoperability, the big EMR vendors agreed on a standard they’d share among themselves and more or less shut out the smaller players?

Yeah, I already hear you asking: “Wouldn’t that be an antitrust violation?”  While I am not and probably never will be a lawyer, my guess is that if a bunch of big vendors deliberately and obviously shut the smaller players out, it would be. But standards are so slippery that I bet it’d be a while before anyone outside of our industry saw something funny going on.

Besides, the government is doing everything in its power to get EMR vendors to help providers achieve interoperability. Right now, ONC is not getting much cooperation — in fact, I’d characterize the big vendors’ stance as ‘passive aggressive’ at best.  So if Epic, Cerner, Siemens, MEDITECH and their brethren found a way to make their products work together, they might get a gold star rather than an FTC/DoJ slap on the wrist.

What’s more, it would be in the bigger firms’ interests to include a few smaller players in their interoperability effort (the ones in the big boys’ sweet spots), and then, “oops,” the smaller companies would get acquired and the knowledge would stay home.

Right now, as far as I can tell, it’s Epic versus the rest of the world, and the rest of the EMR world is not inclined to play nicely with anyone else either. But if John can imagine a big-EMR-company standards-based coup d’etat happening, rest assured the big vendors have as well.

John’s Comment: Since Anne mentions this as my idea, I thought I’d weigh in a little bit on the subject. While it’s possible that the big EHR vendors could adopt a different standard and shut out the small EHR vendors, I don’t think that’s likely. Instead of adopting a different standard, I could see the large EHR vendors basically prioritizing their interfaces with the small EHR vendors into oblivion.

In fact, in many ways the big EHR vendors could use the standard as a shield for what they’re doing. They’ll say that they can interface with any EHR vendor because they’re using the widely adopted standard. However, it’s one thing to have the technical capability to exchange healthcare information and a very different thing to actually create the trust relationship between EHR vendors to make the data sharing possible.

Think about it from a large EHR vendor perspective. Why do they want to be bothered with interoperability with 600+ EHR vendors? That’s a lot of work and is something that could actually hurt their business more than it helps.

My hope is that I’m completely wrong about this, but I’ve already seen the large EHR vendors getting together to make data sharing possible. The question is whether they’re doing this out of a sincere desire to connect as many health records as quickly as possible, or whether it’s simply good strategy. My gut feeling is that it’s probably both. It just works out that the first is the better thing to say in public, and the second is a nice result of doing the first.