
CommonWell and Healthcare Interoperability

Posted on June 27, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

UPDATE: In case you missed the live interview, you can watch the recorded interview on YouTube below:


For our next Healthcare Scene interview, we’ll be sitting down with Scott Stuewe, Director at Cerner Network, and Daniel Cane, CEO & Co-Founder at Modernizing Medicine, on Wednesday, June 29, 2016 at 3 PM ET (Noon PT). Cerner was one of the Founding Members of CommonWell, and Modernizing Medicine just announced they were joining CommonWell. No doubt these diverse perspectives will provide an engaging discussion about the work CommonWell is doing to improve healthcare data sharing.

You can join my live conversation with Scott Stuewe and Daniel Cane and even add your own comments to the discussion or ask them questions. All you need to do to watch live is visit this blog post on Wednesday, June 29, 2016 at 3 PM ET (Noon PT) and watch the video embed at the bottom of the post or you can subscribe to the blab directly. We’re hoping to include as many people in the conversation as possible. The discussion will be recorded as well and available on this post after the interview.

As we usually do with these interviews, we’ll be doing a more formal interview with Scott Stuewe and Daniel Cane for the first ~30 minutes of this conversation. Then, we’ll open up the floor for others to ask questions or join us on camera. CommonWell has become a big player in the healthcare interoperability space with most of the major EHR vendors involved, so we’re excited to learn more about what’s happening with CommonWell.

If you’d like to see the archives of Healthcare Scene’s past interviews, you can find and subscribe to all of Healthcare Scene’s interviews on YouTube.

When Did A Doctor Last Worry About Social Determinants of Health (SDOH)?

Posted on June 16, 2016 | Written By John Lynn

I’ve heard over and over the importance of social determinants of health (SDOH) and their impact on healthcare costs. The concept is fascinating and challenging. There are thousands of examples. A simple one to illustrate the challenge is the patient who arrives at the emergency room with a fever. The doctor treats the fever and then sends them back to their home where they have no heat and are likely to get sick again.

I ask all the doctors that read this blog, when was the last time you worried about these various social determinants of health (SDOH) in the care you provided a patient?

I’ll be interested to hear people’s responses to this question. I’m sure it would create some incredible stories from doctors who really care about their patients and go above and beyond their job duties. In fact, it would be amazing to hear and share some of these stories. We could learn a lot from them. However, I’m also quite sure that almost all of those stories would end with the doctor saying “I wasn’t paid to help the patient this way but it was the right thing to do.”

Let me be clear. I’m not blaming doctors for not doing more for their patients. If I were a doctor, I’m sure I’d have made similar decisions to most of the doctors out there. They do what they’re paid to do.

As I’ve been sitting through the AHIP Institute conference, I’m pondering on if this will change. Will value based reimbursement force doctors to understand SDOH or will they just leave that to their health system or their various software systems to figure it out for them?

I’m torn on the answer to that question. A part of me thinks that most doctors won’t want to dive into that area of health. Their training wasn’t designed for that type of thinking and it would be a tough transition of mindset for many. On the other hand, I think there’s a really important human component that’s going to be required in SDOH. Doctors have an inherent level of trust that is extremely valuable with patients.

What do you think of SDOH? Will doctors need to learn about it? Will the systems just take care of it for them?

When Providing a Health Service, the Infrastructure Behind the API is Equally Important

Posted on May 2, 2016 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

In my ongoing review of application programming interfaces (APIs) as a technical solution for offering rich and flexible services in health care, I recently ran into two companies who showed as much enthusiasm for their internal technologies behind the APIs as for the APIs themselves. APIs are no longer a novelty in health services, as they were just five years ago. As the field gets crowded, maintenance and performance take on more critical roles in offering a successful business–so let’s see how Orion Health and Mana Health back up their very different offerings.

Orion Health

This is a large analytics firm that has staked a major claim in the White House’s Precision Medicine Initiative. Orion Health’s data platform, Amadeus, addresses population health management as well as “considering how they can better tailor care for each chronically ill individual,” as put by Dave Bennett, executive vice president for Product & Strategy. “We like to say that population health is the who and precision medicine is the how.” Thus, Amadeus can harmonize a huge variety of inputs, such as how many steps a patient takes each day at home, to prevent readmissions.

Orion Health has a cloud service, a capacity for handling huge data sets such as genomes, and a selection of tools for handling such varied sources as clinical, claims, pharmacy, genetic, and consumer device or other patient-generated data. Environmental and social data are currently being added. It has more than 90 million patient records in its systems worldwide.

Patient matching links up data sets from different providers. All this data is ingested, normalized, and made accessible through APIs to authorized parties. Customers can write their own applications, visualizations, and SQL queries. Amadeus is used by the Centers for Disease Control and Prevention, and many hospitals join the chorus to submit data to the CDC.
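The patient-matching step can be pictured with a toy sketch. Everything below is invented for illustration (real matching engines use probabilistic scoring across many demographic fields), but it shows the core idea: normalize identifying fields into a key so that records from different providers link up.

```python
# Toy illustration of patient matching: linking records from two
# providers using a normalized (name, birth date) key. Real systems
# score many more fields probabilistically; this is the simplest
# possible deterministic version.
from collections import defaultdict

def match_key(record):
    """Normalize identifying fields into a simple match key."""
    name = record["name"].strip().lower()
    dob = record["dob"]  # assume ISO 8601 dates throughout
    return (name, dob)

def link_records(*sources):
    """Group records that appear to describe the same patient."""
    linked = defaultdict(list)
    for source in sources:
        for record in source:
            linked[match_key(record)].append(record)
    return linked

# Hypothetical feeds from two providers; note the inconsistent casing
# and whitespace that normalization smooths over.
provider_a = [{"name": "Ana Diaz ", "dob": "1956-03-02", "rx": "metformin"}]
provider_b = [{"name": "ana diaz", "dob": "1956-03-02", "a1c": 7.9}]

linked = link_records(provider_a, provider_b)
for key, records in linked.items():
    print(key, "->", len(records), "records")
```

Once records share a key, downstream normalization and API access can treat them as one longitudinal patient history.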

So far, Orion Health resembles some other big initiatives that major companies in the health care space are offering. I covered services from Philips in a recent article, and another site talks about GE. Bennett says that Orion Health really distinguishes itself through the computing infrastructure that drives the analytics and data access.

Many companies use a conventional relational database as their canonical data store. Relational databases are 1980s-era technology, unmatched in their robustness and sophistication in querying (through the SQL language), but becoming a bottleneck for the data sizes that health analytics deals with.

Over the past decade, every industry that needs to handle enormous, streaming sets of data has turned to a variety of data stores known collectively as NoSQL. Ironically, these are often conceptually simpler than SQL databases and have roots going much farther back in computing history (such as key/value stores). But these data stores let organizations run a critical subset of queries in real time over huge data sets. In addition, analytics are carried out by newer MapReduce algorithms and in-memory services such as Spark. As an added impetus for development, these new technologies are usually free and open source software.
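To make the "conceptually simpler" point concrete, here is a toy in-memory stand-in for a key/value store (all names and readings invented), followed by a map/reduce-style computation of the kind that MapReduce frameworks and Spark parallelize across a cluster:

```python
# Minimal sketch of the key/value idea behind many NoSQL stores:
# values are fetched by key in constant time, with no query planner
# or joins in the way. The map/reduce pair afterward shows the style
# of computation these systems distribute across nodes.
from functools import reduce

store = {}  # in-memory stand-in for a distributed key/value store

def put(key, value):
    store[key] = value

def get(key):
    return store.get(key)

# One reading per (patient, day) key -- similar in spirit to a
# wide-row time-series layout.
put(("p1", "2016-05-01"), {"steps": 4200})
put(("p1", "2016-05-02"), {"steps": 6100})
put(("p2", "2016-05-01"), {"steps": 900})

# Map: emit (patient, steps) pairs. Reduce: sum steps per patient.
mapped = [(patient, v["steps"]) for (patient, _), v in store.items()]
totals = reduce(
    lambda acc, kv: {**acc, kv[0]: acc.get(kv[0], 0) + kv[1]},
    mapped,
    {},
)
print(totals)
```

In a real deployment the `store` would be partitioned across machines by key, and the map and reduce steps would run where the data lives; the shape of the computation, though, is just this.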

Amadeus itself stores data in Cassandra, one of the most mature NoSQL data stores, and uses Spark for processing. According to Bennett, “Spark enables Amadeus to future proof healthcare organizations for long term innovation. Bringing data and analytics together in the cloud allows our customers to generate deeper insights efficiently and with increased relevancy, due to the rapidity of the analytics engine and the streaming of current data in Amadeus. All this can be done at a lower cost than traditional healthcare analytics that move the data from various data warehouses that are still siloed.” Elasticsearch is also used. In short, the third-party tools used within Orion Health are ordinary and commonly found. It is simply modern in the same way as computing facilities in other industries–così fan tutte (so do they all).

Mana Health

This company integrates device data into EHRs and other data stores. It achieved fame when it was chosen for the New York State patient portal. According to Raj Amin, co-founder and Executive Chairman, the company won over the judges with the convenient and slick tile concept in their user interface. Each tile could be clicked to reveal a deeper level of detail in the data. The company tries to serve clinicians, patients, and data analysts alike. Clients include HIEs, health systems, medical device manufacturers, insurers, and app developers.

Like Orion Health, Mana Health is very conscious of staying on the leading edge of technology. They are mobile-friendly and architect their solutions using microservices, a popular form of modular development that attempts to maximize flexibility in coding and deploying new services. On a lark, they developed a VR engine compatible with the Oculus Rift to showcase what can creatively be built on their API. Although this Rift project has no current uses, the development effort helps them stay flexible so that they can adapt to whatever new technologies come down the pike.

Because Mana Health developed their API some eighteen months ago, they pre-dated some newer approaches and standards. They plan to offer compatibility with emerging standards such as FHIR as those see industry adoption. The company was recently announced as a partner in the CommonWell Health Alliance, a project formed by a wide selection of major EHR vendors to pursue interoperability.
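For readers who haven't encountered FHIR, a resource is just structured JSON (or XML) with a standardized shape. The fragment below is a hand-written sketch of a Patient resource, roughly in the DSTU2-era form current when this was written; exact field layouts vary between FHIR versions, and the values are invented:

```python
# A minimal, hand-written FHIR Patient resource (DSTU2-era shape,
# where name.family is a list; later versions changed this). The
# point is only that a standard resource can be parsed with ordinary
# JSON tooling -- no proprietary interface required.
import json

patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": ["Diaz"], "given": ["Ana"]}],
  "birthDate": "1956-03-02"
}
"""

patient = json.loads(patient_json)
assert patient["resourceType"] == "Patient"

# Assemble a display name from the first recorded name.
display = "{} {}".format(patient["name"][0]["given"][0],
                         patient["name"][0]["family"][0])
print(display)  # Ana Diaz
```

That uniformity is the draw: once two systems agree on the resource shapes, exchanging a patient record stops being a custom integration project.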

To support machine learning, Mana Health stores data in Neo4j, an open source graph database, a still-unusual technology whose history and purposes I described two years ago.

Graphs are familiar to anyone who has seen airline maps showing the flights between cities. Graphs are also common for showing social connections, such as your friends-of-friends on Facebook. In health care, as well, graphs are very useful tools. They show relationships, but in a very different way from relational databases, and they are better at tracing connections between people or other entities. For instance, a team led by health IT expert Fred Trotter used Neo4j to store and query the data in DocGraph, linking primary care physicians to the specialists to whom they refer patients.
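The DocGraph idea can be sketched without any database at all. The toy referral graph below (physician names invented) shows the kind of multi-hop traversal that graph databases like Neo4j make natural, and that gets painful as chained self-joins in SQL:

```python
# A toy referral graph in the spirit of DocGraph: edges point from
# physicians to the specialists they refer patients to. The traversal
# answers a typically "graphy" question -- which specialists are
# reachable within two referral hops -- with a plain breadth-first
# expansion.
referrals = {
    "dr_pcp_a": ["dr_cardio", "dr_endo"],
    "dr_pcp_b": ["dr_cardio"],
    "dr_cardio": ["dr_surgeon"],
    "dr_endo": [],
    "dr_surgeon": [],
}

def reachable(graph, start, max_hops):
    """Collect nodes reachable from start within max_hops edges."""
    seen, frontier = set(), {start}
    for _ in range(max_hops):
        frontier = {n for node in frontier for n in graph.get(node, [])}
        frontier -= seen
        seen |= frontier
    return seen

print(sorted(reachable(referrals, "dr_pcp_a", 2)))
# ['dr_cardio', 'dr_endo', 'dr_surgeon']
```

A graph database adds indexing, persistence, and a query language (Cypher, in Neo4j's case) on top of exactly this traversal pattern.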

In their unique ways, Mana Health and Orion Health follow trends in the computing industry and judiciously choose tools that offer new forms of access to data, while being proven in the field. Although commenters in health IT emphasize the importance of good user interfaces, infrastructure matters too.

Another Quality Initiative Ahead of Its Time, From California

Posted on March 21, 2016 | Written By Andy Oram

When we go to get health care–or any other service–we evaluate it for both cost and quality. But health care regulators have to recognize when the ingredients for quality assessment are missing. Otherwise, assessing quality becomes like the drunk who famously looked for his key under the lamplight instead of where the key actually lay. And sadly, as I read a March 4 draft of a California initiative to rate health care insurance, I find that once again the foundations for assessing quality are not in place, and we are chasing lamplights rather than the keys that will unlock better care.

The initiative I’ll discuss in this article comes out of Covered California, one of the United States’ 13 state-based marketplaces for health insurance mandated by the ACA. (All the other states use a federal marketplace or some hybrid solution.) As the country’s biggest state–and one known for progressive experiments–California is worth following to see how adept they are at promoting the universally acknowledged Triple Aim of health care.

An overview of health care quality

There’s no dearth of quality measurement efforts in health care–I gave a partial overview in another article. The Covered California draft cites many of these efforts and advises insurers to hook up with them.

Alas–there are problems with all the quality control efforts:

  • Problems with gathering accurate data (and as we’ll see in California’s case, problems with the overhead and bureaucracy created by this gathering)

  • Problems finding measures that reflect actual improvements in outcomes

  • Problems separating things doctors can control (such as follow-up phone calls) from things they can’t (lack of social supports or means of getting treatment)

  • Problems turning insights into programs that improve care

But the biggest problem in health care quality, I believe, is the intractable variety of patients. How can you say that a particular patient with a particular combination of congestive heart failure, high blood pressure, and diabetes should improve by a certain amount over a certain period of time? How can you guess how many office visits it will take to achieve a change, how many pills, how many hospitalizations? How much should an insurer pay for this treatment?

The more sophisticated payers stratify patients, classifying them by the seriousness of their conditions. And of course, doctors have learned how to game that system. A cleverly designed study by the prestigious National Bureau of Economic Research has uncovered upcoding in the U.S.’s largest quality-based reimbursement program, Medicare Advantage. They demonstrate that doctors are gaming the system in two ways. First, as the use of Medicare Advantage goes up, so do the diagnosed risk levels of patients. Second, patients who transition from private insurance into Medicare Advantage show higher risk not seen in fee-for-service Medicare.
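To make the stratification idea concrete, here is a deliberately crude sketch. The condition weights and tier cutoffs are invented for illustration; real risk-adjustment models, such as the HCC coefficients behind Medicare Advantage, are calibrated against claims data for hundreds of conditions.

```python
# Crude risk stratification: score patients by diagnosed conditions
# and bucket them into tiers. Weights and cutoffs are invented for
# illustration only. Note how the score depends entirely on what is
# coded -- which is precisely the opening that upcoding exploits.
CONDITION_WEIGHTS = {
    "congestive_heart_failure": 3,
    "diabetes": 2,
    "hypertension": 1,
}

def risk_score(conditions):
    """Sum the weights of every recognized diagnosed condition."""
    return sum(CONDITION_WEIGHTS.get(c, 0) for c in conditions)

def tier(score):
    """Bucket a risk score into a payment/management tier."""
    if score >= 5:
        return "high"
    if score >= 2:
        return "medium"
    return "low"

patient = ["congestive_heart_failure", "hypertension", "diabetes"]
score = risk_score(patient)
print(score, tier(score))  # 6 high
```

Since reimbursement rises with the tier, every marginal diagnosis a clinician records moves the score in the provider's favor, which is what the NBER study detected at population scale.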

I don’t see any fixes in the Covered California draft to the problem of upcoding. Probably, like most government reimbursement programs, California will slap on some weighting factor that rewards hospitals with higher numbers of poor and underprivileged patients. But this is a crude measure and is often suspected of underestimating the extra costs these patients bring.

A look at the Covered California draft

Covered California certainly understands what the health care field needs, and one has to be impressed with the sheer reach and comprehensiveness of their quality plan. Among other things, they take on:

  • Patient involvement and access to records (how the providers hated that in the federal Meaningful Use requirements!)

  • Racial, ethnic, and gender disparities

  • Electronic record interoperability

  • Preventive health and wellness services

  • Mental and behavioral health

  • Pharmaceutical costs

  • Telemedicine

If there are any pet initiatives of healthcare reformers that didn’t make it into the Covered California plan, I certainly am having trouble finding them.

Being so extensive, the plan suffers from two more burdens. First, the reporting requirements are enormous–I would imagine that insurers and providers would balk simply at that. The requirements are burdensome partly because Covered California doesn’t seem to trust that the major thrust of health reform–paying for outcomes instead of for individual services–will provide an incentive for providers to do other good things. They haven’t forgotten value-based reimbursement (it’s in section 8.02, page 33), but they also insist on detailed reporting about patient engagement, identifying high-risk patients, and reducing overuse through choosing treatments wisely. All those things should happen on their own if insurers and clinicians adopt payments for outcomes.

Second, many of the mandates are vague. It’s not always clear what Covered California is looking for–let alone how the reporting requirements will contribute to positive change. For instance, how will insurers be evaluated in their use of behavioral health, and how will that use be mapped to meeting the goals of the Triple Aim?

Is rescue on the horizon?

According to a news report, the Covered California plan is “drawing heavy fire from medical providers and insurers.” I’m not surprised, given all the weaknesses I found, but I’m disappointed that their objections (as stated in the article) come from the worst possible motivation: they don’t like its call for transparent pricing. Hiding the padding of costs by major hospitals, the cozy payer/provider deals, and the widespread disparities unrelated to quality doesn’t put providers and insurers on the moral high ground.

To me, the true problem is that the health care field has not learned yet how to measure quality and cost effectiveness. There’s hope, though, with the Precision Medicine initiative that recently celebrated its first anniversary. Although analytical firms seem to be focusing on processing genomic information from patients–a high-tech and lucrative undertaking, but one that offers small gains–the real benefit would come if we “correlate activity, physiological measures and environmental exposures with health outcomes.” Those sources of patient variation account for most of the variability in care and in outcomes. Capture that, and quality will be measurable.
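As a toy version of the correlation that last sentence calls for, consider relating a non-clinical signal (daily step counts) to an outcome measure (readmissions). The numbers are invented; the point is only that such signals become straightforward to analyze once they are captured at all.

```python
# Toy correlation of a patient-generated signal with an outcome.
# Data is invented for illustration. Pearson's r is computed by hand
# from its definition to keep the sketch dependency-free.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

avg_daily_steps = [1200, 3500, 5200, 8000, 10100]   # per patient
readmissions = [3, 2, 2, 1, 0]                      # over a year

r = pearson(avg_daily_steps, readmissions)
print(round(r, 2))  # strongly negative: more activity, fewer readmissions
```

The hard part, as the article argues, is not the arithmetic but capturing activity, physiological, and environmental data reliably enough to feed it.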

#HIMSS16: Some Questions I Plan To Ask

Posted on February 1, 2016 | Written By

Anne Zieger is a veteran healthcare consultant and analyst with 20 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com, and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. Contact her at @ziegerhealth on Twitter or visit her site at Zieger Healthcare.

As most readers know, health IT’s biggest annual event is just around the corner, and the interwebz are heating up with discussions about what #HIMSS16 will bring. The show, which will take place in Las Vegas from February 29 to March 4, offers a ludicrously rich opportunity to learn about new HIT developments — and to mingle with more than 40,000 of the industry’s best and brightest (You may want to check out the session Healthcare Scene is taking part in and the New Media Meetup).

While you can learn virtually anything healthcare IT related at HIMSS, it helps to have an idea of what you want to take away from the big event. In that spirit, I’d like to offer some questions that I plan to ask, as follows:

  • How do you plan to support the shift to value-based healthcare over the next 12 months? The move to value-based payment is inevitable now, be it via ACOs or Medicare incentive programs under the Medicare Access and CHIP Reauthorization Act. But succeeding with value-based payment is no easy task. And one of the biggest challenges is building a health IT infrastructure that supports data use to manage the cost of care. So how do health systems and practices plan to meet this technical challenge, and what vendor solutions are they considering? And how do key vendors — especially those providing widely-used EMRs — expect to help?
  • What factors are you considering when you upgrade your EMR? Signs increasingly suggest that this may be the year of the forklift upgrade for many hospitals and health systems. Those that have already invested in massiveware EMRs like Cerner and Epic may be set, but others are ripping out their existing systems (notably McKesson). While in previous years the obvious blue-chip choice was Epic, it seems that some health systems are going with other big-iron vendors based on factors like usability and lower long-term cost of ownership. So, given these trends, how are health systems’ HIT buying decisions shaping up this year, and why?
  • How much progress can we realistically expect to make with leveraging population health technology over the next 12 months? I’m sure that when I travel the exhibit hall at HIMSS16, vendor banners will be peppered with references to their population health tools. In the past, when I’ve asked concrete questions about how they could actually impact population health management, vendor reps got vague quickly. Health system leaders, for their part, generally admit that PHM is still more a goal than a concrete plan.  My question: Is there likely to be any measurable progress in leveraging population health tech this year? If so, what can be done, and how will it help?
  • How much impact will mobile health have on health organizations this year? Mobile health is at a fascinating moment in its evolution. Most health systems are experimenting with rolling out their own apps, and some are working to integrate those apps with their enterprise infrastructure. But to date, it seems that few (if any) mobile health efforts have made a real impact on key areas like management of chronic conditions, wellness promotion and clinical quality improvement. Will 2016 be the year mobile health begins to deliver large-scale, tangible health results? If so, what do vendors and health leaders see as the most promising mHealth models?

Of course, these questions reflect my interests and prejudices. What are some of the questions that you hope to answer when you go to Vegas?

Significant Articles in the Health IT Community in 2015

Posted on December 15, 2015 | Written By Andy Oram

Have you kept current with changes in device connectivity, Meaningful Use, analytics in healthcare, and other health IT topics during 2015? Here are some of the articles I find significant that came out over the past year.

The year kicked off with an ominous poll about Stage 2 Meaningful Use, with implications that came to a head later with the release of Stage 3 requirements. Out of 1800 physicians polled around the beginning of the year, more than half were throwing in the towel–they were not even going to try to qualify for Stage 2 payments. Negotiations over Stage 3 of Meaningful Use were intense and fierce. A January 2015 letter from medical associations to ONC asked for more certainty around testing and certification, and mentioned the need for better data exchange (which the health field likes to call interoperability) in the C-CDA, the most popular document exchange format.

A number of expert panels asked ONC to cut back on some requirements, including public health measures and patient view-download-transmit. One major industry group asked for a delay of Stage 3 till 2019, essentially tolerating a lack of communication among EHRs. The final rules, absurdly described as a simplification, backed down on nothing from patient data access to quality measure reporting. Beth Israel CIO John Halamka–who has shuttled back and forth between his Massachusetts home and Washington, DC to advise ONC on how to achieve health IT reform–took aim at Meaningful Use and several other federal initiatives.

Another harbinger of emerging issues in health IT came in January with a speech about privacy risks in connected devices by the head of the Federal Trade Commission (not an organization we hear from often in the health IT space). The FTC is concerned about the security of recent trends in what industry analysts like to call the Internet of Things, and medical devices rank high in these risks. The speech was a lead-up to a major report issued by the FTC on protecting devices in the Internet of Things. Articles in WIRED and Bloomberg described serious security flaws. In August, John Halamka wrote his own warning about medical devices, which have not yet started taking security really seriously. Smart watches are just as vulnerable as other devices.

Because so much medical innovation is happening in fast-moving software, and low-budget developers are hankering for quick and cheap ways to release their applications, in February the FDA started to chip away at its bureaucratic gamut by releasing guidelines that exempt from FDA regulation medical apps that have no impact on treatment, along with apps used just to transfer data or do similarly non-transformative operations. They also released a rule for unique IDs on medical devices, a long-overdue measure that helps hospitals and researchers integrate devices into monitoring systems. Without clear and unambiguous IDs, one cannot trace which safety problems are associated with which devices. Other forms of automation may also now become possible. In September, the FDA announced a public advisory committee on devices.

Another FDA decision with a potential long-range impact was allowing 23andMe to market its genetic testing to consumers.

The Department of Health and Human Services has taken on exceedingly ambitious goals during 2015. In addition to the daunting Stage 3 of Meaningful Use, they announced a substantial increase in the use of fee-for-value, although they would still leave half of providers on the old system of doling out individual payments for individual procedures. In December, National Coordinator Karen DeSalvo announced that Health Information Exchanges (which limit themselves only to a small geographic area, or sometimes one state) would be able to exchange data throughout the country within one year. Observers immediately pointed out that the state of interoperability is not ready for this transition (and they could well have added the need for better analytics as well). HHS’s five-year plan includes the use of patient-generated and non-clinical data.

The poor state of interoperability was highlighted in an article about fees charged by EHR vendors just for setting up a connection and for each data transfer.

In the perennial search for why doctors are not exchanging patient information, attention has turned to rumors of deliberate information blocking. It’s a difficult accusation to pin down. Is information blocked by health care providers or by vendors? Does charging a fee, refusing to support a particular form of information exchange, or using a unique data format constitute information blocking? On the positive side, unnecessary imaging procedures can be reduced through information exchange.

Accountable Care Organizations are also having trouble, both because they are information-poor and because the CMS version of fee-for-value is too timid; they have suffered other financial blows as well, along with, perhaps, an inability to retain patients. An August article analyzed the positives and negatives in a CMS announcement. On a large scale, fee-for-value may work. But a key component of improvement in chronic conditions is behavioral health, which EHRs are also ill-suited to support.

Pricing and consumer choice have become a major battleground in the current health insurance business. The steep rise in health insurance deductibles and copays has been justified (somewhat retroactively) by claiming that patients should have more responsibility to control health care costs. But the reality of health care shopping points in the other direction. A report card on state price transparency laws found the situation “bleak.” Another article shows that efforts to list prices are hampered by interoperability and other problems. One personal account of a billing disaster shows the state of price transparency today, and may be dangerous to read because it could trigger traumatic memories of your own interactions with health providers and insurers. Narrow and confusing insurance networks as well as fragmented delivery of services hamper doctor shopping. You may go to a doctor who your insurance plan assures you is in their network, only to be charged outrageous out-of-network costs. Tools are often out of date or overly simplistic.

In regard to the quality ratings that are supposed to allow patients to make intelligent choices, a study found that four hospital rating sites give very different ratings to the same hospitals. The criteria used to rate them are inconsistent. Quality measures provided by government databases are marred by incorrect data. The American Medical Association, always disturbed by public ratings of doctors for obvious reasons, recently complained of incorrect numbers from the Centers for Medicare & Medicaid Services. In July, the ProPublica site offered a search service called the Surgeon Scorecard. One article summarized the many positive and negative reactions. The New England Journal of Medicine has called ratings of surgeons unreliable.

2015 was the year of the intensely watched Department of Defense upgrade to its health care system. One long article offered an in-depth examination of DoD options and their implications for the evolution of health care. Another article promoted the advantages of open-source VistA, an argument that was not persuasive enough for the DoD. Still, openness was one of the criteria sought by the DoD.

The remote delivery of information, monitoring, and treatment (which goes by the quaint term “telemedicine”) has been the subject of much discussion. Those concerned with this development can follow the links in a summary article to see the various positions of major industry players. One advocate of patient empowerment interviewed doctors to find that, contrary to common fears, they can offer email access to patients without becoming overwhelmed. In fact, they think it leads to better outcomes. (However, it still isn’t reimbursed.)

Laws permitting reimbursement for telemedicine continued to spread among the states. But a major battle shaped up around a ruling in Texas requiring doctors to have a pre-existing face-to-face meeting with any patient whom they want to treat remotely. The spread of telemedicine also depends on reform of state licensing laws to permit practice across state lines.

Much wailing and tears welled up over the required transition from ICD-9 to ICD-10. The AMA, with some good arguments, suggested just waiting for ICD-11. But the transition cost much less than anticipated, making ICD-10 much less of a hot button, although it may be harmful to diagnosis.

Formal studies of EHR strengths and weaknesses are rare, so I’ll mention this survey finding that EHRs aid with public health but are ungainly for the sophisticated uses required for long-term, accountable patient care. Meanwhile, half of hospitals surveyed are unhappy with their EHRs’ usability and functionality, and doctors are increasingly frustrated with EHRs. Nurses complained about the technology’s time demands and the eternal lack of interoperability. A HIMSS survey turned up somewhat more positive feelings.

EHRs are also expensive enough to hurt hospital balance sheets and force them to forgo other important expenditures.

Electronic health records also took a hit from ONC’s Sentinel Events program. To err, it seems, is not only human but now computer-aided. A Sentinel Event Alert indicated that more errors in health IT products should be reported, claiming that many go unreported because patient harm was avoided. The FDA started checking self-reported problems on PatientsLikeMe for adverse drug events.

The ONC reported gains in patient ability to view, download, and transmit their health information online, but found patient portals still limited. Although one article praised patient portals by Epic, Allscripts, and NextGen, an overview of studies found that patient portals are disappointing, partly because elderly patients have trouble with them. A literature review highlighted where patient portals fall short. In contrast, giving patients full access to doctors’ notes increases compliance and reduces errors. HHS’s Office of Civil Rights released rules underlining patients’ rights to access their data.

While we’re wallowing in downers, review a study questioning the value of patient-centered medical homes.

Reuters published a warning about employee wellness programs, which are nowhere near as fair or accurate as they claim to be. They are turning into just another expression of unequal power between employer and employee, with tendencies to punish sick people.

An interesting article questioned the industry narrative about the medical device tax in the Affordable Care Act, saying that the industry is expanding robustly in the face of the tax. However, this tax is still a hot political issue.

Does anyone remember that Republican congressmen published an alternative health care reform plan to replace the ACA? An analysis finds both good and bad points in its approach to mandates, malpractice, and insurance coverage.

Early reports on use of Apple’s open ResearchKit suggested problems with selection bias and diversity.

An in-depth look at the use of devices to enhance mental activity examined where they might be useful or harmful.

A major genetic data mining effort by pharma companies and Britain’s National Health Service was announced. The FDA announced a site called precisionFDA for sharing resources related to genetic testing. A recent site invites people to upload health and fitness data to support research.

As data becomes more liquid and is collected by more entities, patient privacy suffers. An analysis of web sites turned up shocking practices, even at supposedly reputable sites like WebMD. Lax security in health care networks was addressed in a Forbes article.

Of minor interest to health IT workers, but eagerly awaited by doctors, was Congress’s “doc fix” to Medicare’s sustainable growth rate formula. The bill did contain additional clauses that were called significant by a number of observers, including former National Coordinator Farzad Mostashari no less, for opening up new initiatives in interoperability, telehealth, patient monitoring, and especially fee-for-value.

Connected health took a step forward when CMS issued reimbursement guidelines for patient monitoring in the community.

A wonky but important dispute concerned whether self-insured employers should be required to report public health measures, because public health by definition needs to draw information from as wide a population as possible.

Data breaches always make lurid news, sometimes under surprising circumstances, and not always caused by health care providers. The 2015 security news was dominated by a massive breach at the Anthem health insurer.

Along with great fanfare in Scientific American for “precision medicine,” another Scientific American article covered its privacy risks.

A blog posting promoted early and intensive interactions with end users during app design.

A study found that HIT implementations hamper clinicians, but could not identify the reasons.

Natural language processing was praised for its potential for simplifying data entry, and to discover useful side effects and treatment issues.

CVS’s refusal to stock tobacco products was called “a major sea-change for public health” and part of a general trend of pharmacies toward whole care of the patient.

A long interview with FHIR leader Grahame Grieve described the progress of the project and the need for clinicians to take data exchange seriously. A quiet milestone was reached in October with a production version from Cerner.

Given the frequent invocation of Uber (even more than the Cheesecake Factory) as a model for health IT innovation, it’s worth seeing the reasons that model is inapplicable.

A number of hot new sensors and devices were announced, including a tiny sensor from Intel, a device from Google to measure blood sugar and another for multiple vital signs, enhancements to Microsoft products, a temperature monitor for babies, a headset for detecting epilepsy, cheap cameras from New Zealand and MIT for doing retinal scans, a smart phone app for recognizing respiratory illnesses, a smart-phone connected device for detecting brain injuries and one for detecting cancer, a sleep-tracking ring, bed sensors, ultrasound-guided needle placement, a device for detecting pneumonia, and a pill that can track heartbeats.

Although the medical field isn’t yet making extensive use of data collection and analysis (or uses analytics for financial gain rather than patient care), the potential is demonstrated by many isolated success stories, including one from a Johns Hopkins study using 25 patient measures to study sepsis and another from an Ontario hospital. In an intriguing peek at our possible future, IBM Watson has started to integrate patient data with its base of clinical research studies.

Frustrated enough with 2015? To end on an upbeat note, envision a future made bright by predictive analytics.

Using APIs at the Department of Health and Human Services to Expand Web Content

Posted on October 21, 2015 I Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

Application Programming Interfaces (APIs) appeal mostly to statisticians and researchers whose careers depend on access to data. But these programming tools are also a useful part of a Web that is becoming increasingly supple and sophisticated. I have written a series of articles about the use of APIs to share and run analytics on patient data, but today I’ll cover a cool use of an API developed by the Department of Health and Human Services for disseminating educational material.

The locus of this activity is the wealth of information created by the Centers for Disease Control for doctors, public health workers, and the general public. Striving to help the public understand vaccinations, West Nile fever, Ebola (when that was a major public issue), and even everyday conditions such as diabetes, the CDC realized they had to make their content simple to embed in web sites for all those audiences.

The CDC also realized that it would be helpful to let outsiders quickly choose content along a number of dimensions. Not only would a particular web site be interested in a particular topic (diabetes, for instance), but they would want to filter the content to offer information to a particular audience in a particular language. One Web page might offer content aimed at doctors in English, while another might offer content for the general public in English and yet another offer content in Spanish. To allow all these distinctions, a RESTful API called from JavaScript allows each Web page to bring in just what is needed. Topics and languages are offered now, and filtering by audience will be supported soon. At some point, the API will even recognize ICD-10 codes and find any content related to those disease conditions.

We are all familiar with Web pages that embed dynamic content from other sites, such as videos from YouTube or Vimeo. Web developers embed the content by visiting the desired page, clicking on an Embed button, and copying some dense HTML to their own pages. The CDC offers several ways for visitors to syndicate content in this manner to their own web sites. If they are using a popular content management system (WordPress, Drupal, or Joomla!) they can install a plug-in that uses familiar practices to embed the content. Mobile app support is also provided. But the API developed by the CDC takes the process to a much more advanced level.

First, as already described, the API lets each page specify filters that extract content on the desired topic for the desired audience. Second, if a new video, e-card, or microsite is added to the CDC site, the API automatically picks it up when a user revisits the embedding page. Thus, without fussing with HTML, a site can integrate CDC content that’s tailored pretty precisely to its needs.
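To make the filtering idea concrete, here is a minimal sketch of how an embedding page might construct such a filtered request. The base URL and parameter names (`topic`, `language`, `audience`, `max`) are illustrative assumptions, not the CDC API’s documented contract; consult the agency’s documentation page for the real endpoint and filters.

```python
from urllib.parse import urlencode

# Hypothetical base URL for the syndication API (check the CDC docs
# for the real endpoint before using this in production).
BASE_URL = "https://tools.cdc.gov/api/v2/resources/media"

def build_media_query(topic=None, language=None, audience=None, max_items=10):
    """Build a query URL that filters syndicated content by topic,
    language, and (eventually) audience -- parameter names assumed."""
    params = {"max": max_items}
    if topic:
        params["topic"] = topic
    if language:
        params["language"] = language
    if audience:  # audience filtering was announced but not yet live
        params["audience"] = audience
    return BASE_URL + "?" + urlencode(params)

# A page offering diabetes content to the general public in Spanish:
url = build_media_query(topic="diabetes", language="spanish")
print(url)
```

In practice the CDC’s JavaScript snippet performs an equivalent call from the embedding page, so the site owner never touches the URL directly; the point of the sketch is simply that each page’s filters are ordinary query parameters.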

This API is also in use at the FDA–see for instance their Center for Tobacco Products–and at HHS more broadly. A community is starting to build around the code, which is open source, and soon it will be on GitHub, the most popular site for code sharing. A terse documentation page is available.

The API from Health and Human Services offers several lessons for health IT. First, communications can be improved by using the advanced features provided by the Web. (In addition to the API, the CDC tools make sophisticated use of HTML5 and iFrames to offer dynamic content in ways that fit in smoothly with the sites that choose to embed it.) Second, sites need to consider the people at the other end of the transaction in order to design tools that deliver an easy-to-use and easy-to-understand experience. And finally, releasing code as open source maximizes its value to the health care community. These trends need to be more widely adopted.

Using Healthcare Analytics to Achieve Strong Financial Performance

Posted on September 25, 2015 I Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Everyone is talking about analytics, but I’ve been looking for the solutions that take analytics and package it nicely. This is what I hoped for when I found this whitepaper called How Healthcare Providers Can Leverage Advanced Analytics to Achieve Strong Financial Performance. This is a goal that I think most of us in healthcare IT would like to achieve. We want healthcare providers to be able to leverage analytics to improve their business.

However, this illustration from the whitepaper shows exactly why we’re not seeing the results we want from our healthcare analytics efforts:
Advanced Analytics Impact on Healthcare

That’s a complex beast if I’ve ever seen one. Most providers I talk to want the results that this chart espouses, but they want it just to happen. They want all the back end processing of data to happen inside a black box and they just want to feed in data like they’ve always done and have the results spit out to them in a format they can use.

This is the challenge of the next century of healthcare IT. EHR is just the first step in the process of getting data. Now we have the hard work of turning that data into something more useful than the paper chart provided.

The whitepaper does suggest these three steps we need to take to get value from our analytics efforts:
1. Data capture, storage, and access
2. Big data and analytics
3. Cognitive computing

If you read the whitepaper, it discusses all three of these in more detail. However, it’s very clear that most organizations are still at step 1, with only a few starting to dabble in step 2. Some might see this as frustrating or depressing. I see it as exciting, since it means that the best uses of healthcare IT are still to come. However, we’re going to need these solutions to come in a really easy-to-use package. Otherwise no one will adopt them.

Providers Still Have Hope For HIEs

Posted on July 10, 2015 I Written By

Anne Zieger is veteran healthcare consultant and analyst with 20 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. Contact her at @ziegerhealth on Twitter or visit her site at Zieger Healthcare.

Sometimes, interoperability alone doesn’t cut it.  Increasingly, providers are expecting HIEs to go beyond linking up different organizations to delivering “actionable” data, according to a new report from NORC at the University of Chicago. The intriguing follow-on to the researchers’ conclusions is that HIEs aren’t obsolete, though their obsolescence seemed all but certain in the past.

The study, which was written up by Healthcare Informatics, conducted a series of site visits and 37 discussions with providers in Iowa, Mississippi, New Hampshire, Vermont, Utah and Wyoming. The researchers, who conducted their study in early 2014, hoped to understand how providers looked at HIEs generally and their state HIE program specifically. (The research was funded by ONC.)

One major lesson for the health IT types reading this article is that providers want data sharing models to reflect new care realities.  With Meaningful Use requirements and changes in payment models bearing down on providers, and triggering changes in how care is delivered, health IT-enabled data exchange needs to support new models of care.

According to the study, providers are intent on having HIEs deliver admission, discharge, and transfer alerts, interstate data exchange and data services that assist in coordinating care. While I don’t have comprehensive HIE services research to hand, maybe you do, readers. Are HIEs typically meeting these criteria? I doubt it, though I could be wrong.

That being said, providers seem to be willing to pay for HIE services if the vendor can meet their more stringent criteria.  While this may be tough to swallow for existing HIE technology sellers, it’s good news for the HIE model generally, as getting providers to pay for any form of community data exchange has been somewhat difficult historically.

Some of the biggest challenges in managing HIE connectivity identified by the study include getting good support from both HIE and EMR vendors, as well as a lack of internal staff qualified to manage data exchange, competing priorities and problems managing multiple funding streams. But vendors can work to overcome at least some of these problems.

As I noted previously, hospitals in particular have had many beliefs which have discouraged them from participating in HIEs. As one HIE leader quoted in my previous post noted, many have assumed that HIE connection costs would be in the same range as EMR adoption expenses; they’ve been afraid that HIEs would not put strong enough data security in place to meet HIPAA obligations; and they assumed that HIE participation wasn’t that important.

Today, the growing importance of sophisticated data management has come to the forefront, and most providers know that they need to have the big picture widespread data sharing can provide. Without the comprehensive data set cutting across the patient care environment — something few organizations are integrated enough to develop on their own — they’re unlikely to mount a successful population health management initiative or control costs sufficiently. So it’s interesting to see providers see a future for HIEs.

EMRs Should Include Telemedicine Capabilities

Posted on May 22, 2015 I Written By

Anne Zieger is veteran healthcare consultant and analyst with 20 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. Contact her at @ziegerhealth on Twitter or visit her site at Zieger Healthcare.

The volume of telemedicine visits is growing at a staggering pace, and they seem to have nowhere to go but up. In fact, a study released by Deloitte last August predicted that there would be 75 million virtual visits in 2014 and that there was room for 300 million visits a year going forward.

These telemedicine visits are generating a flood of medical data, some in familiar text formats and some in voice and video form. But since the entire encounter takes place outside of any EMR environment, huge volumes of such data are being left on the table.

Given the growing importance of telemedicine, the time has come for telemedicine providers to begin integrating virtual visit results into EMRs.  This might involve adopting specialized EMRs designed to capture video and voice, or EMR vendors might move with the times and develop ways of categorizing and integrating the full spectrum of telemedical contacts.

And as virtual visit data becomes increasingly important, providers and health plans will begin to demand that they get copies of telemedical encounter data.  It may not be clear yet how a provider or payer can effectively leverage video or voice content, which they’ve never had to do before, but if enough care is taking place in virtual environments they’ll have to figure out how to do so.

Ultimately, both enterprise and ambulatory EMRs will include technology allowing providers to search video, voice and text records from virtual consults.  These newest-gen EMRs may include software which can identify critical words spoken during a telemedical visit, such as “pain,” or “chest” which could be correlated with specific conditions.
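The keyword-flagging idea above can be sketched very simply once a visit has been transcribed to text. Everything here is a hypothetical illustration: the term list, the function name, and the assumption that a reliable transcript exists at all (real systems would need speech-to-text plus clinical NLP, not plain string matching).

```python
import re

# Illustrative list of clinically significant terms a next-gen EMR
# might flag in a virtual-visit transcript (hypothetical, not a
# validated clinical vocabulary).
CRITICAL_TERMS = {"pain", "chest", "dizziness", "shortness of breath"}

def flag_critical_terms(transcript):
    """Return the critical terms that appear as whole words/phrases
    in a visit transcript, so the record can be searched later."""
    text = transcript.lower()
    return {term for term in CRITICAL_TERMS
            if re.search(r"\b" + re.escape(term) + r"\b", text)}

notes = "Patient reports chest tightness and mild pain after exertion."
print(sorted(flag_critical_terms(notes)))  # ['chest', 'pain']
```

Correlating such flags with specific conditions, as the article envisions, would of course require far more context than bare keyword matching provides.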

It may be years before data gathered during virtual visits will stand on equal footing with traditional text-based EMR data and digital laboratory results.  As things stand today, telemedicine consults are used as a cheaper form of urgent care, and like an urgent care visit, the results are not usually considered a critical part of the patient’s long-term history.

But the more time patients spend getting their treatment from digital doctors on a screen, the more important the mass of medical data generated becomes. Now is the time to develop data structures and tools allowing clinicians and facilities to mine virtual visit data.  We’re entering a new era of medicine, one in which patients get better even when they can’t make it to a doctor’s office, so it’s critical that we develop the tools to learn from such encounters.