Amazon Attacking Health IT Opportunities

Posted on August 17, 2017 | Written By

Anne Zieger is a veteran healthcare consultant and analyst with 20 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. Contact her at @ziegerhealth on Twitter or visit her site at Zieger Healthcare.

Getting a footing in the health IT industry is more challenging than it looks. After all, even tech giants like Microsoft, Apple, and Google haven’t managed to take over despite their evident interest in the field.

Apparently, that hasn’t daunted Amazon. The retail giant has pulled together a secret team dedicated to exploring new healthcare technology opportunities, according to a CNBC report. And unlike other companies attacking the space from outside, Amazon has a history of sliding its way into unexpected markets successfully.

According to CNBC, the new team, which is named 1492, is working to find an easier way to extract data from EMRs as well as push data into them. In doing so, Amazon is going up against a very wide field of competitors, ranging from small startups to the healthcare arms of giant tech vendors and consulting firms.

What distinguishes Amazon’s approach from its competitors is that the online retailer hopes to aggregate that data and make it available to consumers and their doctors, sources told CNBC. The story doesn’t say whether Amazon plans to sell this data, and I don’t know what’s legal and what isn’t here, but my bet is that if it can, Amazon will pitch the data to pharmaceutical companies. And where there’s a will there’s a way.

In addition to looking at data management opportunities, 1492 members are scouting out ways of repurposing Amazon’s existing technology for use in healthcare. As another article notes, some healthcare organizations have already begun experimenting with delivering routine medical information and even coaching surgeons on safety protocols using Amazon’s voice-based assistant Alexa. The new group, for its part, will be looking for healthcare applications for existing Amazon products like the Echo and Dash Wand.

The 1492 group is also preparing to build a telemedicine platform. Your first thought might be that the industry doesn’t need another telemedicine platform, and generally speaking, you would probably be right.  But if Amazon can get its healthcare IT bona fides in order, and manages to attract enough doctors to its platform, it could be in a strong position to market those services to consumers.

Make no mistake: We should take Amazon’s health IT effort seriously. At first glance, healthcare may seem like an odd arena for a company best known for selling frying pans and socks and discount beauty supplies. But Amazon has expanded its focus many times over the years and has typically done better than people expected. It may do so this time as well.

By the way, the retailer is apparently still hiring people for the 1492 initiative. I doubt it’s easy to find the hiring manager in question, but if I were you I’d inquire. These jobs could pose some interesting challenges.

ONC To Farm Out Certification Testing To Private Sector – MACRA Monday

Posted on August 14, 2017 | Written By Anne Zieger

This post is part of the MACRA Monday series of blog posts where we dive into the details of the MACRA Quality Payment Program (QPP) and related topics.

EHR certification has been a big part of the meaningful use program and is now part of MACRA as well. After several years of using health IT certification testing tools developed by government organizations, the ONC has announced plans to turn the development of these tools over to the private sector.

Since its inception, ONC has managed its health IT certification program internally, developing automated tools designed to measure health IT compliance with certification requirements in partnership with the CDC, CMS and NIST. However, in a new blog post, Office of Standards and Technology director Steven Posnack announced that ONC would be transitioning development of these tools to private industry over the next five years.

In the post, Posnack said that farming out tool development would bring diversity to the certification effort and help it perform optimally. “We have set a goal…to include as many industry-developed and maintained testing tools as possible in lieu of taxpayer financed testing tools,” Posnack wrote. “Achieving this goal will enable the Program to more efficiently focus its testing resources and better align with industry-developed testing tools.”

Readers, I don’t have any insider information on this, but I have to think this transition was spurred (or at least sped up) by the eClinicalWorks certification debacle. As we reported earlier this year, eCW settled a whistleblower lawsuit for $155 million a few months ago; in the suit, the federal government asserted that the vendor had gotten its EHR certified by faking its capabilities. Of course, the potential cuts to ONC’s budget could have spurred this as well.

I have no reason to believe that eCW was able to beat the system because ONC’s certification testing tools were inadequate. As we all know, any tool can be tricked if you throw the right people at the problem. On the other hand, it can’t hurt to turn tool development over to the private sector. Of course, I’m not suggesting that government coders are less skilled than private industry folks (and after all, lots of government technology work is done by private contractors), but perhaps the rhythms of private industry are better suited to this task.

It’s worth noting that this change is not just cosmetic. Posnack notes that with private industry at the helm, vendors may need to enter into new business arrangements and assume new fees, depending on who has invested in the testing tools, what it costs to administer them and how the tools are used.

However, I’d be surprised if the private sector companies that develop these certification tools stray terribly far from the existing model. Health IT vendors may want to get their products certified, but they’re likely to push back hard if private companies jack up the price for being evaluated or create business structures that don’t work.

Honestly, I’d like to see the ONC stay on this path. I think it works best as a sort of think tank focused on finding best practices for health IT across government and private industry, rather than sweating the smaller stuff as it has in recent times. Otherwise, it’s going to stay bogged down in detail and lose whatever thought leadership position it may have.

Before Investing In Health IT, Fix Your Processes

Posted on August 2, 2017 | Written By Anne Zieger

Recently, my colleague John Lynn conducted a video interview with healthcare consultant and “recovering CIO” Drex DeFord (@drexdeford) on patient engagement and care coordination. During the interview, DeFord made a very interesting observation: “When you finally have a process leaned out to the point where [tech] can make fewer mistakes than a human, that’s the time to make big technology investments.”

This makes a lot of sense. If a process is refined enough, even a robot may be able to maintain it, but if it remains fuzzy or arbitrary that’s far less likely. And by extension, we shouldn’t automate processes until they’re clearly defined and efficient.

Honestly, as I see it this is just common sense. If the way things are done doesn’t work well, who wants to embed those processes in their IT infrastructure? Doing so is arguably worse than keeping a manual process in place. It may be simpler — though not easy — to change how people work than to rewrite complicated enterprise software and then shift human routines.

Meanwhile, if you do rush ahead without refining your processes, you could be building dangerously flawed care into the system. Patients could suffer needless harm or even die. In fact, I can envision a situation in which a provider gets sued because their technology rollout perpetuated existing care management problems.

Unfortunately, CIOs have powerful incentives to roll ahead with their technology implementation plans whether they’ve optimized care processes or not.

Sometimes, they’re trying to satisfy CEOs pushing to get systems in gear no matter what. They can’t afford to alienate someone who could refuse to greenlight their plans for future investments, so they cross their fingers and plunge ahead. Other times, they might not be aware of serious care delivery problems and see no reason to let their implementation deadlines slip. Or perhaps they believe that they will be able to fix workflow problems during or after the rollout. But if they thought they could act first and deal with workflow later, they may be in for a nasty surprise.

Of course, the ultimate solution is for providers to invest in more flexible enterprise systems which support process improvements (including across mobile devices). To date, however, few big health IT platforms have strayed much from decades-old computing models that make change expensive and time-consuming. Such systems may be durable, but updating them to meet user needs is no picnic.

Eventually, you’ll be able to adjust health IT workflows without dispatching an army of developers. In the meantime, though, providers should do everything they can to perfect processes, especially those related to care delivery, before they’re fixed in place by technology rollouts. Doing so may be a bit disruptive, but it’s the kind of disruption that helps rather than hurts.

Is The ONC Still Relevant?

Posted on July 18, 2017 | Written By Anne Zieger

Today, I read an article in Healthcare IT News reporting on the latest word from the ONC. Apparently, during a recent press call, National Coordinator Donald Rucker, MD, gave an update on agency activities without sharing a single new idea.

Now, if I were the head of the ONC, I might do the same. I’m sure it played well with the wire services and daily newspaper reporters, most of whom don’t dig into tech issues like interoperability too deeply.

But if I were a wiseacre health IT blogger (and I am, of course) I’d react a bit differently. By which I mean that I would wonder aloud, very seriously, whether the ONC is even relevant anymore. To be fair, I can’t judge the agency’s current efforts by what it said at a press conference, but I’m not going to ignore what was said, either.

According to HIN, the ONC sees developing a clear definition of interoperability, improving EMR usability and getting a better understanding of information blocking as key objectives.

To address some of these issues, Dr. Rucker apparently suggested that using open APIs, notably RESTful APIs exchanging JSON, would be important to future EMR interoperability efforts. Reportedly, he’s also impressed with the FHIR standard, because it’s a modern API and because large vendors have already done some work with the SMART project.
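For readers who haven’t worked with these technologies, the sketch below shows what FHIR-style REST access boils down to in practice: an ordinary HTTPS GET that returns a JSON document. It’s a minimal illustration only; the server URL and patient ID are hypothetical placeholders, not any real system.

```python
import requests

# Hypothetical FHIR endpoint; real deployments publish their own base URLs
# and typically require authorization (e.g., OAuth2 via SMART on FHIR).
BASE_URL = "https://fhir.example-hospital.org/fhir"

def get_patient(patient_id: str) -> dict:
    """Fetch a single FHIR Patient resource over plain HTTPS."""
    response = requests.get(
        f"{BASE_URL}/Patient/{patient_id}",
        headers={"Accept": "application/fhir+json"},  # FHIR's JSON media type
    )
    response.raise_for_status()
    return response.json()

patient = get_patient("example-id")  # placeholder ID
print(patient.get("resourceType"), patient.get("id"))  # plain JSON throughout
```

Part of FHIR’s appeal is exactly this: any developer who has consumed a modern web API can work with it, with no HL7 v2 interface engine required.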

To put it kindly, I doubt any of this was news to the health IT press.

Now, I’m not saying that Dr. Rucker got anything wrong, exactly. It’s hard to deny that we’re far behind when it comes to EMR usability, embarrassingly so. In fact, if we don’t address that issue, many of our other EMR-related efforts won’t be worth much. That being said, much of the rest strikes me as, well, lacking originality and/or substance.

Addressing interoperability by using open APIs? I’m pretty sure someone in the health IT business has thought that through before. If Dr. Rucker knows this, why would he present it as a novel idea (as seems to be the case)? And if he doesn’t, is the agency really that far behind the curve?

Establishing full interoperability with FHIR? Maybe, someday. But at least as of a year ago, FHIR product director Grahame Grieve argued that people are “[making] wildly inflated claims about what is possible, [willfully] misunderstanding the limits of the technology and evangelizing the technology for all sorts of ill-judged applications.”  If Grieve thinks people are exaggerating FHIR’s capabilities, does ONC bring anything useful to the table by endorsing it?

Understanding information blocking? Well, perhaps, but I think we already know what’s going on. At its core, this is a straightforward business problem: EMR vendors and many of their customers have an incentive to make health data sharing tough. Until they have a stronger incentive to share data, they won’t play ball voluntarily. And studying a much-studied problem probably won’t help things much.

To be clear, I’m relying on HIN as a source of facts here. Also, I realize that Dr. Rucker may have been simplifying things in an effort to address a general audience.

But if my overall impression is right, the news from this press conference isn’t encouraging. I would have hoped that by 2017, ONC would be advancing the ball further, and telling us something we truly didn’t know already. If it’s not sharing new ideas by this point, what good can it do? Maybe that’s why, if the rumors are true, HHS budget cuts could hit ONC really hard.

Did Meaningful Use Really Turn EMRs Into A Commodity?

Posted on July 12, 2017 | Written By Anne Zieger

Not long ago, I had a nice email exchange with a sales manager with one of the top ambulatory EMR vendors.  He had written to comment on “The EMR Vendor’s Dilemma,” a piece I wrote about the difficult choices such vendors face in staying just slightly ahead of the market.

In our correspondence, he argued that Meaningful Use (MU) had led customers to see EMRs as commodities. I think he meant that MU sucked the innovation out of EMR development.

After reflecting on his comments, I realized that I didn’t quite agree that EMRs had become a commodity item. Though the MU program obviously relied on the use of commoditized, certified EMR technology, I’d argue that the industry has simply grown around that obstacle.

If anything, I’d argue that MU has actually sparked greater innovation in EMR development. Follow me for a minute here.

Consider the early stages of the EMR market. At the outset, say, in the 50s, there were a few innovators who figured out that medical processes could be automated, and built out versions of their ideas. However, there was essentially no market for such systems, so those who developed them had no incentive to keep reinventing them.

Over time, a few select healthcare providers developed platforms which had the general outline EMRs would later have, and vendors like Epic began selling packaged EMR systems. These emerging systems began to leverage powerful databases and connect with increasingly powerful front-end systems available to clinicians. The design for overall EMR architecture was still up for grabs, but some consensus was building on what its core was.

Eventually, the feds decided that it was time for mass EMR adoption, and the Meaningful Use program came along. MU certification set some baseline standards for EMR vendors, leaving little practical debate as to what an EMR’s working parts were. Sure, at least at first, these requirements bled a lot of experimentation out of the market, and certainly discouraged wide-ranging innovation to a degree. But they also set the stage for an explosion of ideas.

Because the truth is, having a dull, standardized baseline that defines a product can be liberating. Having a basic outline to work with frees up energy and resources for use in innovating at the edges. Who wants to keep figuring out what the product is? There’s far more upside in, say, creating modules that help providers tackle their unique problems.

In other words, while commoditization solves one (less interesting) set of problems, it also lets vendors focus on the high-level solutions that arguably have the most potential to help providers.

That’s certainly been the case when an industry agrees on a technology specification set such as, say, the 802.11 family of standards for wireless LANs. I doubt Wi-Fi tech would be ubiquitous today if the IEEE hadn’t codified these standards. Yes, working from technical specs is different from building complex systems to meet multi-layered requirements, but I’d argue that the principle still stands.

All told, I think the feds did EMR vendors a favor when they created Meaningful Use EMR certification standards. I doubt the vendors could have found common ground any other way.

The EMR Vendor’s Dilemma

Posted on June 6, 2017 | Written By Anne Zieger

Yesterday, I had a great conversation with an executive at one of the leading EMR vendors. During our conversation, she stressed that her company was focused on the future – not on shoring up its existing infrastructure, but rather, rebuilding its code into something “transformational.”

In describing her company’s next steps, she touched on many familiar bases, including population health, patient registries and mobile-first deployment to support clinicians. She told me that after several years of development, she felt her company was truly ready to take on operational challenges like delivering value-based care and conducting disease surveillance.

All that being said – with all due respect to the gracious exec with whom I spoke – I wouldn’t want to be a vendor trying to transform itself at the moment. As I see it, vendors who want to keep up with current EMR trends are stuck between a rock and a hard place.

On the one hand, such vendors need to support providers’ evolving health IT needs, which are changing rapidly as new models of care delivery are emerging. Not only do they need to provide the powerhouse infrastructure necessary to handle and route massive floods of data, they also need to help their customers reach and engage consumers in new ways.

To do so, however, they need to shoot at moving targets, or they won’t meet provider demand. Providers may not be sure what shape certain processes will take, but they still expect EMR vendors to keep up with their needs nonetheless. And that can certainly be tricky these days.

For example, while everybody is talking about population health management, as far as I know we still haven’t arrived at a widely-accepted model for it. Sure, people are arriving at many of the same conclusions about pop health, but their approaches to rolling it out vary widely. And that makes it very tough for vendors to create pop health technology.

And what about patient engagement solutions? At present, the tools providers use to engage patients with their care are all over the map, from portals to mobile apps to back-end systems using predictive analytics. Synchronizing and storing the data generated by these solutions is challenging enough. Figuring out what configuration of options actually produces results is even harder, and nobody, including the savviest EMR vendors, can be sure what the consensus model will be in the future.

Look, I’m aware that virtually all software vendors face this problem. It’s difficult as heck to decide when to lead the industry you serve and when to let the industry lead you. Straddling these two approaches successfully is what separates the men from the boys — or the girls from the women — and dictates who the winners and losers are in any technology market.

But arguably, health IT vendors face a particularly difficult challenge when it comes to keeping up with the times. Certainly, few industries are in a greater state of flux, and that’s not likely to change anytime soon.

It will take some very fancy footwork to dance gracefully with providers. Within a few years, we’ll look back and know which vendors adapted just enough.

Dogged By Privacy Concerns, Consumers Wonder If Using HIT Is Worthwhile

Posted on May 17, 2017 | Written By Anne Zieger

I just came across a survey suggesting that while we in the health IT world see a world of possibilities in emerging technologies, consumers aren’t so sure. The researchers found that consumers question the value of many tech platforms popular with health execs, apparently because they don’t trust providers to keep their personal health data secure.

The study was conducted between September and December 2016 by technology research firm Black Book, which surveyed 12,090 adult consumers across the United States.

The topline conclusion from the study was that 57 percent of consumers who had been exposed to HIT through physicians, hospitals or ancillary providers doubted its benefits. Their concerns extended not only to EHRs, but also to many commonly-deployed solutions such as patient portals and mobile apps. The survey also concluded that 70 percent of Americans distrusted HIT, up sharply from just 10 percent in 2014.

Black Book researchers tied consumers’ skepticism to their very substantial  privacy concerns. Survey data indicated that 87 percent of respondents weren’t willing to divulge all of their personal health data, even if it improved their care.

Some categories of health information were especially sensitive for consumers. Ninety-nine percent were worried about providers sharing their mental health data with anyone but payers, 90 percent didn’t want their prescription data shared and 81 percent didn’t want information on their chronic conditions shared.

And their data security worries go beyond clinical data. A full 93 percent responding said they were concerned about the security of their personal financial information, particularly as banking and credit card data are increasingly shared among providers.

As a result, at least some consumers said they weren’t disclosing all of their health information. Also, 69 percent of patients admitted that they were holding back information from their current primary care physicians because they doubted the PCPs knew enough about technology to protect patient data effectively.

One of the reasons patients are so protective of their data, the survey suggested, is that many don’t understand health IT. For example, Black Book found that 92 percent of nurse leaders in hospitals under 200 beds said they had no time during the discharge process to improve patient tech literacy. (In contrast, only 55 percent of nurse leaders working in large hospitals had this complaint, one of the few bright spots in Black Book’s data.)

When it comes to tech training, medical practices aren’t much help either. A whopping 96 percent of patients said that physicians and staff didn’t do a good job of explaining how to use the patient portal. About 40 percent of patients tried to use their medical practice’s portal, but 83 percent said they had trouble using it when they were at home.

All that being said, consumers seemed to feel much differently about data they generate on their own. In fact, 91 percent of consumers with wearables reported that they’d like to see their physician practice’s medical record system store any health data they request. Similarly, 91 percent of patients who felt that their apps and devices were important to improving their health were disappointed when providers wouldn’t store their personal data.

Using AI To Streamline EMR Workflow For Clinicians

Posted on May 10, 2017 | Written By Anne Zieger

Understandably, most of the discussion around AI use in healthcare focuses on data analytics for population health management and predictive analytics. Given the massive scale of the data we’re collecting, that’s no surprise.

In fact, one could argue that using AI technologies has gone from an interesting idea to an increasingly established part of the health IT mix. After all, few human beings can truly understand what’s revealed by terabytes of data on their own, even using well-designed dashboards, filters, scripting and what have you. I believe it takes a self-educating AI “persona,” if you will, to glean advanced insights from the eternity of information we have today.

That being said, I believe there are other compelling uses for AI-fueled technologies in healthcare organizations. If we use even a relatively simple form of interpretive intelligence, we can improve health IT workflows for clinicians.

As clinicians have pointed out over and over, most of what they do with EMRs is repetitive monkey work, varied only by the need to customize small but vital elements of the medical record. Tasks related to that work – such as sending copies of a CT scan to a referring doctor – usually have to be done in another application. (And that’s if they’re lucky. They might be forced to hunt down and mail a DVD disc loaded with the image.)

Then there’s documentation work which, though important enough, has to be done in a way that satisfies payers. I know some practice management systems that integrate with the office EMR can auto-populate the patient record with coding and billing information, but my sense is that this type of automation wouldn’t scale within a health system, given the data silos that still exist.

What if we used AI to make all of this easier for providers? I’m talking about using a predictive intelligence, integrated with the EMR, that personalizes the way data entry, documentation and follow-up needs are presented. The AI solution could automatically queue up or even execute some of the routine tasks on its own, leaving doctors to focus on the essence of their work. We all know Dr. Z doesn’t really want to chase down that imaging study and mail it to Albany. AI technology could also route patients to testing and scans in the most efficient manner, adjusted for acuity of course.
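To make this concrete, here’s a deliberately simple sketch of the idea: a layer that watches completed EMR events and queues the routine follow-up work itself, escalating anything that needs a human. Everything in it (the event names, the routing rules, the notion of “auto-executable” work) is hypothetical; a real system would use learned models and live EMR integration rather than a hard-coded table.

```python
from dataclasses import dataclass

@dataclass
class FollowUpTask:
    description: str
    auto_executable: bool  # safe to run without clinician review?

# Hypothetical routing table: EMR event -> routine follow-up it implies.
# A production system would learn these patterns per clinician instead.
ROUTINE_FOLLOW_UPS = {
    "ct_scan_finalized": FollowUpTask(
        "Send imaging study to referring physician", auto_executable=True),
    "visit_note_signed": FollowUpTask(
        "Draft billing codes from documentation for review", auto_executable=False),
}

def handle_event(event_type: str, auto_queue: list, review_queue: list) -> None:
    """Route the routine task an EMR event implies, if any."""
    task = ROUTINE_FOLLOW_UPS.get(event_type)
    if task is None:
        return  # nothing routine here; leave it to the clinician
    (auto_queue if task.auto_executable else review_queue).append(task)

auto_queue, review_queue = [], []
handle_event("ct_scan_finalized", auto_queue, review_queue)
handle_event("visit_note_signed", auto_queue, review_queue)
print([t.description for t in auto_queue])    # runs without bothering Dr. Z
print([t.description for t in review_queue])  # surfaces for a quick sign-off
```

The point isn’t the code, which any intern could write; it’s that the real intelligence lies in deciding which tasks belong in which queue for which clinician, and that’s where AI earns its keep.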

While AI development has been focused on enterprise issues for some time, it’s already moving beyond the back office into day-to-day care. In fact, always-ahead-of-the-curve Geisinger Health System is already doing a great deal to bring AI and predictive analytics to the bedside.

Geisinger, which has had a full-featured EMR in place since 1996, was struggling to aggregate and manage patient data, largely because its legacy analytics systems couldn’t handle the flood of new data types emerging today.

To address the problem, the system rolled out a unified data architecture which allowed it to integrate current data with its existing data analytics and management tools. This includes a program bringing together all sepsis-vulnerable patient information in one place as they travel through the hospital. The tool uses real-time data to track patients in septic shock, helping doctors to stick to protocols.
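To give a sense of what “sticking to protocols” means at the data level, here’s a miniature illustration of the kind of real-time rule such a tool might evaluate as vitals stream in. It’s a generic SIRS-style screen of my own construction, not Geisinger’s actual logic.

```python
def sirs_criteria_met(temp_c: float, heart_rate: int,
                      resp_rate: int, wbc_k: float) -> bool:
    """Return True if two or more SIRS criteria are met (a common
    first-pass sepsis screen, shown here purely as an illustration)."""
    criteria = [
        temp_c > 38.0 or temp_c < 36.0,  # abnormal temperature
        heart_rate > 90,                 # tachycardia
        resp_rate > 20,                  # tachypnea
        wbc_k > 12.0 or wbc_k < 4.0,     # abnormal white blood cell count
    ]
    return sum(criteria) >= 2

# Example: a febrile, tachycardic patient trips the screen, prompting
# the care team to check the next step in the sepsis protocol.
print(sirs_criteria_met(temp_c=38.6, heart_rate=104, resp_rate=18, wbc_k=9.5))
```

The hard part, of course, isn’t the rule itself but evaluating it continuously against live data for every at-risk patient in the house, which is exactly why Geisinger needed a unified data architecture first.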

As for me, I’d like to see AI tools pushed further. Let’s use them to lessen the administrative burden on overworked physicians, eliminating needless chores and simplifying documentation workflow. And it’s more than time to use AI capabilities to create a personalized, efficient EMR workflow for every clinician.

Think I’m dreaming here? I hope not! Using AI to eliminate physician hassles could be a very big deal.

Provider-Backed Health Data Interoperability Organization Launches

Posted on April 12, 2017 | Written By Anne Zieger

In 1988, some members of the cable television industry got together to form CableLabs, a non-profit innovation center and R&D lab. Since then, the non-profit has been a driving force in bringing cable operators together, developing technologies and specifications for services as well as offering testing and certification facilities.

Among its accomplishments is the development of DOCSIS (Data-over-Cable Service Interface Specification), a standard used worldwide to provide Internet access via a cable modem. If your cable modem is DOCSIS compliant, it can be used on any modern cable network.

If you’re thinking this approach might work well in healthcare, you’re not the only one. In fact, a group of powerful healthcare providers has just launched a health data sharing-focused organization with a similar approach.

The Center for Medical Interoperability, which will be housed in a 16,000-square-foot location in Nashville, is a membership-based organization offering a testing and certification lab for devices and systems. The organization has been in the works since 2011, when the Gary and Mary West Health Institute began looking at standards-based approaches to medical device interoperability.

The Center brings together a group of top US healthcare systems – including HCA Healthcare, Vanderbilt University and Community Health Systems — to tackle interoperability issues collaboratively. Taken together, the board of directors represents more than 50 percent of the healthcare industry’s purchasing power, said Kerry McDermott, vice president of public policy and communications for the Center.

According to Health Data Management, the group will initially focus on acute care settings within the hospital, such as the ICU. In the ICU, patients are “surrounded by dozens of medical devices – each of which knows something valuable about the patient — but we don’t have a streamlined way to aggregate all that data and make it useful for clinicians,” said McDermott, who spoke with HDM.

Broadly speaking, the Center’s goal is to let providers share health information as seamlessly as ATMs pass banking data across their network. To achieve that goal, its leaders hope to serve as a force for collaboration and consensus between healthcare organizations.

The project’s initial $10M in funding, which came from the Gary and Mary West Foundation, will be used to develop, test and certify devices and software. The goal will be to develop vendor-neutral approaches that support health data sharing between and within health systems. Other goals include supporting real-time one-to-many communications, plug-and-play device and system integration and the use of standards, HDM reports.

It will also host a lab known as the Transformation Learning Center, which will help clinicians explore the impact of emerging technologies. Clinicians will develop use cases for new technologies there, as well as capture clinical requirements for their projects. They’ll also participate in evaluating new technologies on their safety, usefulness, and ability to satisfy patients and care teams.

As part of its efforts, the Center is taking a close look at the FHIR API.  Still, while FHIR has great potential, it’s not mature yet, McDermott told the magazine.

Two Worth Reading

Posted on April 6, 2017 | Written By

When Carl Bergman isn't rooting for the Washington Nationals or searching for a Steeler bar, he’s Managing Partner of EHRSelector.com, a free service for matching users and EHRs. For the last dozen years, he’s concentrated on EHR consulting and writing. He spent the 80s and 90s as an itinerant project manager doing his small part for the dot com bubble. Prior to that, Bergman served a ten year stretch in the District of Columbia government as a policy and fiscal analyst.

HIT is a relatively small world that generates no end of notices, promotions and commentaries. You can usually skim them, pick out what’s new or different and move on. Recently, though, I’ve run into two articles that deserve a slow, savored reading: Politico’s Arthur Allen’s history of VistA, the VA’s homegrown EHR, and Julia Adler-Milstein’s take on interoperability’s hard times.

VistA: An Old Soldier That May Just Fade Away – Maybe

The VA’s EHR is not only older than just about any other EHR, it’s older than just about any app you’ve used in the last ten years. It started when Jimmy Carter was in his first presidential year, in a world of mainframes running TSO and 3270 terminals. Punch cards still abounded and dialup modems were rare. Even then, there were doctors and programmers who wanted to move veterans’ hard-copy files into a more usable, shareable form.

In tracking VistA’s history, Arthur Allen has recounted these often clandestine efforts. His piece is not only a history of one EHR and how it has fallen in and out of favor, but also a history of how personal computing has grown, evolved and changed. Though VistA is still a user favorite, it looks like its accumulated problems, often political as much as technical, may mean it will finally meet its end – or maybe not. In any event, Allen has written an effective, well-researched piece of technological history.

Adler-Milstein: Interoperability’s Not for the Faint of Heart

Adler-Milstein, a University of Michigan Associate Professor of Health Management and Policy, has two things going for her. She knows her stuff and she writes in clear, direct prose. It’s a powerful and sadly rare combination.

In this case, she probes the seemingly simple issue of HIE interoperability or the lack thereof. She first looks at the history of EHR adoption, noting that MU1 took a pass on I/O. This was a critical error, because it:

[A]llowed EHR systems to be designed and adopted in ways that did not take HIE into account, and there were no market forces to fill the void.

When stage two, with its HIE requirements, came along, it meant retrofitting thousands of systems. We’ve been playing catch-up ever since, if we’ve been playing at all.

Her major point is simple: it’s in everyone’s interest to find ways of making I/O work, and that means abandoning fault-finding and figuring out what can work.