
Value Based Care: We Need a Better Health IT System to Measure It

Posted on April 16, 2018 | Written By

Healthcare as a Human Right. Physician Suicide Loss Survivor. Janae writes about Artificial Intelligence, Virtual Reality, Data Analytics, Engagement and Investing in Healthcare. twitter: @coherencemed

At HIMSS this year in Las Vegas I looked at the nature of the EHR and whether we have the computing and data infrastructure to enable better value based care. Our data capabilities are failing to let providers align reimbursement with great care delivery.

Under the premise of “what gets watched gets done”, we understand that improving care delivery will require us to align incentives with desired outcomes. The challenge is that, among the many problems with the version of the truth mined from electronic health record data, reimbursement-driven documentation is the core issue for informatics departments across the country. To resolve this, we need documentation that reflects the care we are delivering, and we need care delivery to center on patient care. Health information management should be heavily involved in data capture. To truly improve care, we need better tools to measure it, and healthcare data is expanding to answer difficult questions about care delivery and cost.

Our first challenge is stemming the proliferation of extraneous documentation, and healthcare is still addressing this issue. What used to be written on a 3-by-5 index card (sometimes in illegible doctor’s notes) is now a single point in a huge electronic record that is, surprisingly, not portable. Central to our issues around the cost of care, we have also seen that quantity is valued more than quality in care delivery.

Duplicated testing or unnecessary procedures are grimly accepted as standard practice within the business of medicine. Meaningless and siloed care delivery only helps this issue proliferate across the health of a population. To resolve these issues, our workflow and records need to capture the outcomes we are trying to obtain and must be customized for the incentives of every party.

Incentives for providers and hospital administrators should center around value: delivering the best outcomes, rather than doing more tests. Carefully mapping the processes of healthcare delivery and looking at the resource costs at the medical condition level, from the personnel costs of everyone involved to perform a medical procedure to the cost of the medical device itself, moves organizations closer to understanding total actual costs of care.  Maximizing value in healthcare–higher quality care at lower costs–involves a closer look and better understanding of costs at the medical condition level. Value and incentives alignment should provide the framework for health records infrastructure.

When you walk into Starbucks, your app will tell you what song is playing and offer options to get extra points based on what you usually order. Starbucks understands their value to the customer and the cost of the products they serve. From the type of bean, to the seasonal paper cup, to the amount of time it takes to make the perfect pumpkin spice latte, Starbucks develops products with their audience in mind–and they know both how much this production costs and how much the user is willing to pay. The cost of each experience starts well before the purchase of the beverage. Starbucks knows its role is more than how many lattes it sells; it is to deliver a holistic experience and delight the customer each time.

Healthcare has much to learn about careful cost analysis from the food and beverage retail industry, including how to use personalized medicine to deliver the best care. Value-based healthcare reporting will help the healthcare industry as a whole move beyond the catch-up game we currently play and be proactive in promoting health with a precise knowledge of individual needs and cost of care. Quantifying healthcare delivery precisely and defining personalized treatment will attract massive investment in the coming years and deliver better care at a lower cost. Do current healthcare information systems and analytics have the capacity to record this type of cost analysis?

“Doctors want to deliver the best outcomes for their patients. They’re highly trained professionals. Value Based Healthcare allows you to implement a framework so every member of the care team operates at the top of his or her license.”

- Mahek Shah, MD, Harvard Business School

These outcomes should be based on the population a given hospital serves, the group of people being treated, or at the medical condition level. Measures of good outcomes are dynamic and personalized to a population. One of the difficulties in healthcare is that while providers are working hard for the patient, healthcare systems are also working to make a profit.

It is possible to do well while doing good, but these two goals are seemingly in conflict within the billion-dollar healthcare field. Providing as many services as possible in a fee-for-service system can obfuscate the goal of providing great healthcare. Many patients have undergone duplicate tests and unnecessary procedures that seem aligned with the incentive of recording more billing codes rather than achieving better health outcomes.

Time-Driven Activity-Based Costing (TDABC) can improve personalized care delivery for underserved populations as well as for affluent populations. The World Health Organization (WHO) published work on improving care delivery in Haiti. This picture of the care delivery team is population-specific. A young person after an accident will have different standards for what constitutes “right care, right time, right place” than a veteran with PTSD. Veterans might need different coverage than members of the general public, so value based care for a specific group of veterans might incorporate more mental health and behavioral health treatment than value based care serving the frail elderly, which could incorporate more palliative care and social (SDoH) care. Measuring costs with TDABC for that specific population would include not just the cost of specialists specific to each segment of the population, but of the entire team (social worker, nursing, nutritionist, psychologists) needed to deliver the right care, achieve the best outcomes, and meet the needs of the patient segment.

Healthcare systems are bombarding providers and decision makers with information while trying to ferret out what that information really means. Where is it meaningful? Actionable? Process improvement teams in healthcare should look carefully at data with a solid strategy. This can start with cost analysis specific to given target populations. Frequently, the total cost of care delivery is not well understood, from the time spent at the clinic to prescribe a hip replacement, to the time in the OR, to recovery time; capturing a better view means accounting for every stage of care. Viewed through the lens of an entire care cycle, surgeons with better outcomes also have a lower total long-term cost of care. If you are a great surgeon–meaning your outcomes are better than others–you should get paid for it. The best care should be facilitated and compensated, rather than the greatest number of billing codes recorded. Capturing information about outcomes and care across multiple delivery areas means data must be more usable and more fluid than before.
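To make the TDABC idea concrete, here is a minimal sketch in Python. Every resource, rate, step, and time below is a made-up number for illustration, not real cost data:

```python
# Illustrative TDABC sketch: all rates and times are hypothetical.
# A capacity cost rate is the cost per minute of making a resource available.
CAPACITY_COST_RATES = {   # dollars per minute (invented figures)
    "surgeon": 8.00,
    "nurse": 1.20,
    "social_worker": 0.90,
    "or_room": 4.50,
}

# A care-cycle process map: (resource, minutes) for each step of one episode.
hip_replacement_cycle = [
    ("nurse", 30),          # pre-op assessment
    ("surgeon", 120),       # procedure
    ("or_room", 150),       # room occupied before, during, and after
    ("nurse", 240),         # recovery monitoring
    ("social_worker", 45),  # discharge planning
]

def tdabc_cost(process_map, rates):
    """Total cost = sum over steps of (capacity cost rate x minutes used)."""
    return sum(rates[resource] * minutes for resource, minutes in process_map)

total = tdabc_cost(hip_replacement_cycle, CAPACITY_COST_RATES)
print(f"Estimated care-cycle cost: ${total:,.2f}")  # prints: Estimated care-cycle cost: $1,999.50
```

The arithmetic is the easy part; the hard part in practice is the process mapping itself: agreeing on the steps, who is involved at each one, and how long each step actually takes.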

Healthcare informatics systems should streamline the processes that are necessary to patient care and provider compensation. This streamlined delivery begins with capturing a picture of best care and mapping the cost of care processes. The initial investment TDABC requires to research these care costs at the patient level can be a huge barrier for healthcare systems with small margins and limited resources, but this alignment is an investment in long-term viability and success.

Once health systems understand their underlying costs to deliver care, they will be better prepared to negotiate value-based payment contracts with payers and directly with employers. Pair your measurement of costs with your outcomes. Integrating care delivery with outcomes standards has improved in recent years through ICHOM. Medical systems need to incentivize health if healthy patients are a priority. The analysis of a system’s specific costs needs a better reporting system than a chargemaster or a traditional EHR, which is designed primarily to record fee-for-service work. We must align our incentives and our health IT with our desired outcomes in healthcare. Today, the more billing codes I can create in an electronic health record, the more I am reimbursed. Reimbursement should instead match desired outcomes and physicians operating at the top of their license.

Under value-based care, health and well-being become the priority, whereas in the fee-for-service model sickness can be the priority, because you get paid for doing more interventions, which may not lead to the best outcomes. The careful measurement of care (i.e., TDABC) paired with standards of best care will improve care delivery and reduce its cost. Insights about improved models and standards of care for outcomes and healthcare delivery allow patients, providers, and administrators to align around the shared goal of healthier patient populations. I am looking forward to the data infrastructure catching up with these goals of better care delivery and a great patient experience.

 

Using Healthcare Analytics to Achieve Strong Financial Performance

Posted on September 25, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Everyone is talking about analytics, but I’ve been looking for the solutions that take analytics and package it nicely. This is what I hoped for when I found this whitepaper called How Healthcare Providers Can Leverage Advanced Analytics to Achieve Strong Financial Performance. This is a goal that I think most of us in healthcare IT would like to achieve. We want healthcare providers to be able to leverage analytics to improve their business.

However, this illustration from the whitepaper shows exactly why we’re not seeing the results we want from our healthcare analytics efforts:
[Illustration: Advanced Analytics Impact on Healthcare]

That’s a complex beast if I’ve ever seen one. Most providers I talk to want the results that this chart espouses, but they want it just to happen. They want all the back end processing of data to happen inside a black box and they just want to feed in data like they’ve always done and have the results spit out to them in a format they can use.

This is the challenge of the next century of healthcare IT. EHR is just the first step in the process of getting data. Now we have the hard work of turning that data into something more useful than the paper chart provided.

The whitepaper does suggest these three steps we need to take to get value from our analytics efforts:
1. Data capture, storage, and access
2. Big data and analytics
3. Cognitive computing

If you read the whitepaper, they talk more about all three of these things. However, it’s very clear that most organizations are still at step 1, with only a few starting to dabble in step 2. Some might see this as frustrating or depressing. I see it as exciting, since it means the best uses of healthcare IT are still to come. However, these solutions are going to need to be packaged in a really easy-to-use way. Otherwise no one will adopt them.

Apervita Creates Health Analytics for the Millions

Posted on January 9, 2015 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

Health officials are constantly talking up the importance of clinical decision support, more popularly known now as evidence-based medicine. We’re owning up to the awkward little fact–which really should embarrass nobody–that most doctors lack expertise on many of the conditions they encounter and can’t read the thousands of relevant studies published each year. New heuristics are developed all the time for things such as predicting cardiac arrest or preventing readmissions after surgery. But most never make their way into the clinic.

Let’s look at what has to happen before doctors and patients can benefit from a discovery:

  1. The researcher has to write a paper with enough detail to create a working program from the heuristic, and has to publish the paper, probably in an obscure journal.

  2. A clinician or administrator has to find the article and line up staff to write and thoroughly test a program.

  3. If the program is to be used outside the hospital where it was created, it has to be disseminated. The hospital is unlikely to have an organization set up to package and market the program. Even if it is simply put out for free use, other institutions have to learn about it and compile it to work on their systems, in order for it to spread widely. Neither the researcher nor the hospital is likely to be compensated for the development of the program.

  4. The program has to be integrated into the doctor’s workflow, by being put on a menu or generating an alert.

Evidence-based medicine, therefore, is failing to tap a lot of resources that could save lives. A commonly cited observation is that research findings take 17 years to go into widespread practice. That’s 17 years of unnecessary and costly suffering.

I have often advocated for better integration of analytics into everyday medical practice, and I found a company called Apervita (originally named Pervasive Health) that jumps off in the right direction. Apervita, which announced a Series A round of funding on January 7, also has potential users outside of clinical settings. Pharma companies can use it to track adverse drug events, while payers can use it to predict fraud and risks to patients. There is not much public health data in the platform yet, but they’re working on it. For instance, the Leapfrog Group has published hospital safety info through their platform, and Diameter Health provides an all-cause 30-day readmissions prediction for all non-maternal, non-pediatric hospitalizations.

Here’s how the sequence of events I laid out before would go using Apervita:

  1. The researcher implements her algorithm in Python, chosen because Python is easy for non-programmers to learn and is consequently one of the most popular programming languages, particularly in the sciences. Apervita adds functions to Python to make it easy, such as RangeCompute or tables to let you compute with coefficients, and presents these through an IDE.

  2. The researcher creates an analytic on the Apervita platform that describes and publishes the analytic, along with payment terms. Thus, the researcher derives some income from the research and has more motivation to offer the analytic publicly. Conversely, the provider pays only for usage of the analytic, and does not have to license or implement a new software package.

  3. Clinicians search for relevant analytics and upload data to generate reports at a patient or population level. Data in popular formats such as Excel or comma-separated value (CSV) files can be uploaded manually, while programmers can automate data exchange through a RESTful web service, currently the most popular way of exchanging data between cooperating programs. Rick Halton, co-founder and Chief Marketing Officer of Apervita, said they are working on support for HL7’s CCD and are interested in Blue Button+, although they are not ready yet to support it.

  4. Clinicians can also make the results easy to consume through personalized dashboards (web pages showing visualizations and current information) or by triggering alerts. A typical dashboard for a hospital administrator might show a graphical thermometer indicating safety rankings at the hospital, along with numbers indicating safety grades. Each department or user could create a dashboard showing exactly what a clinician cares about at the moment–a patient assessment during an admission, or statistics needed for surgical pre-op, for instance.

  5. Apervita builds in version control, and can automatically update user sites with corrections or new versions.
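To give a rough sense of what step 1 in the sequence above might look like, here is a simple threshold-based early-warning analytic written in plain Python. The vital-sign bands and weights are invented for illustration; this is not MEWS, eCART, or Apervita's actual helper API (such as the RangeCompute function mentioned earlier):

```python
# Illustrative early-warning analytic. The bands below are hypothetical,
# not a validated clinical score.

def range_score(value, bands):
    """Return the score of the first (low, high, score) band containing value."""
    for low, high, score in bands:
        if low <= value < high:
            return score
    return 0

def warning_score(heart_rate, resp_rate, systolic_bp):
    """Sum per-vital scores into a single early-warning number."""
    hr = range_score(heart_rate, [(0, 40, 2), (40, 51, 1), (51, 100, 0),
                                  (100, 111, 1), (111, 130, 2), (130, 999, 3)])
    rr = range_score(resp_rate, [(0, 9, 2), (9, 15, 0), (15, 21, 1),
                                 (21, 30, 2), (30, 999, 3)])
    bp = range_score(systolic_bp, [(0, 70, 3), (70, 81, 2), (81, 101, 1),
                                   (101, 200, 0), (200, 999, 2)])
    return hr + rr + bp

print(warning_score(heart_rate=118, resp_rate=24, systolic_bp=95))  # prints 5
```

The point is how little code a published heuristic can reduce to once the paper spells out its thresholds; what a platform adds is the publishing, payment, versioning, and integration machinery around it.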

I got a demo of Apervita and found the administration pretty complex, but this seems to be a result of its focus on security and the many options it offers large enterprises to break staff into groups or teams. The bottom line is that Apervita compresses the difficult processes required to turn research into practice and offers them as steps performed through a Web interface or easy programming. Apervita claims to have shown that one intern can create as many as 50 health analytics in one week on their platform, working just from the articles in journals and web resources.

The platform encrypts web requests and is HIPAA-compliant. It can be displayed off-platform, and has been integrated with at least one EHR (OpenMRS).

Always attuned to the technical difficulties of data use, I asked Halton how the users of Apervita analytics could make sure their data formats and types match the formats and types defined by the people who created the analytics. Halton said that the key was the recognition of different ontologies, and the ability to translate between them using easy-to-create “codesets.”

An ontology is, in general, a way of representing data and the relationships between pieces of data. SNOMED and ICD are examples of common ontologies in health care. An even simpler ontology might simply be a statement that units of a particular data field are measured in milliliters. Whether simple or complex, standard or custom-built, the ontology is specified by the creator of an analytic. If the user has data in a different ontology, a codeset can translate between the two.
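A codeset in this sense can be as small as a lookup table plus a unit conversion. Here is a minimal sketch; the local and target codes below are invented placeholders, not real SNOMED or ICD entries:

```python
# Hypothetical local lab codes mapped to a target ontology's codes
# (all codes invented for illustration).
CODESET = {
    "LOCAL-GLU": "TARGET-2345",   # glucose
    "LOCAL-HGB": "TARGET-7180",   # hemoglobin
}
# Unit conversions the codeset may also need to apply.
UNIT_FACTORS = {("mL", "L"): 0.001, ("L", "mL"): 1000.0}

def translate(record, codeset, unit_factors, target_unit=None):
    """Rewrite a record's code (and optionally its units) into the target ontology."""
    out = dict(record)  # don't mutate the caller's record
    out["code"] = codeset[record["code"]]
    if target_unit and record.get("unit") != target_unit:
        out["value"] = record["value"] * unit_factors[(record["unit"], target_unit)]
        out["unit"] = target_unit
    return out

rec = {"code": "LOCAL-GLU", "value": 250.0, "unit": "mL"}
print(translate(rec, CODESET, UNIT_FACTORS, target_unit="L"))
```

Real codesets between SNOMED, ICD, and local dictionaries are of course much larger and messier than this, but the translation step itself is mechanical once the mapping exists.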

As an example of Apervita’s use, a forward prediction algorithm developed by Dr. Dana Edelson and others from the University of Chicago Medical Center can predict cardiac arrests better than the commonly used VitalPAC Early Warning Score (ViEWS) or Modified Early Warning Score (MEWS). Developed from a dataset of over 250,000 patient admissions across five hospitals, “eCART” (electronic Cardiac Arrest Triage) can identify high-risk hospital ward patients and improve ICU triage decisions, often as much as 48 hours in advance.

The new funding will allow Apervita to make their interface even easier for end-users, and to solicit algorithms from leading research institutions such as the Mayo Clinic.

Halton heralds Apervita as a “community” for health care analytics for authors and providers. Not only can the creators of analytics share them, but providers can create dashboards or other tools of value to a wide range of colleagues, and share them. I believe that tools like Apervita can bridge the gap between the rare well-funded health clinic with the resources to develop tools, and the thousands of scattered institutions struggling to get the information that will provide better care.

Which Comes First in Accountable Care: Data or Patients?

Posted on September 30, 2014 | Written By

Andy Oram is an editor at O'Reilly Media.

The headlines are stark and accusatory. “ACOs’ health IT capabilities remain rudimentary.” “ACOs held back by poor interoperability.” But a recent 19-page survey released by the eHealth Initiative tells two stories about Accountable Care Organizations–and I find the story about interoperability less compelling than another one that focuses on patient empowerment.

Could Population Health Be Considered Discrimination?

Posted on August 19, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network.

Longtime reader of my site Lou Galterio, with the SunCoast RHIO, sent me a really great email with a fascinating question:

Are only the big hospitals who can afford the very expensive analytics pop health programs going to be allowed to play because only they can afford to and what does that do to the small hospital and clinic market?

I think this is a really challenging question. Let’s assume for a moment that population health programs are indeed a great way to improve the healthcare we provide a patient, and also an effective way to lower the cost of healthcare. Unfortunately, Lou is right that many of these population health programs require a big investment in technology and processes to make them a reality. Does that mean that, as these population health programs progress, they by their nature discriminate against the smaller hospitals that don’t have the money to invest in such programs?

I think the simple answer is that it depends. We’re quickly moving to a reimbursement model (ACOs) which I consider to be a form of population health management. Depending on how those programs evolve it could make it almost impossible for the small hospital or small practice to survive. Although, the laws could take this into account and make room for the smaller hospitals. Plus, most smaller hospitals and healthcare organizations can see this coming and realize that they need to align themselves to survive.

The other side of the discrimination coin comes when you start talking about the patient populations that organizations want to include as one of their “covered lives.” When the government talks about population health, they mean the entire population. When you start paying organizations based on the health of their patient population, it changes the dynamic of who you want to include in your patient population. Another possible opportunity for discrimination.

Certainly there are ways to avoid this discrimination. However, if we’re not thoughtful in our approach to how we design these population health and ACO programs, we could run into these problems. The first step is to realize the potential issues. Now, hopefully we can think about them going forward.

EMR for Analytics, HIT Marketing and PR Junkies, and Hospitals Without Walls

Posted on January 5, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network.


Jennifer is highlighting how challenging it is to get data out of an EHR in order to do healthcare analytics. This is certainly an issue and a challenge, though just as much of a challenge is the integrity of the data that’s entered into the EHR.


I love Beth’s description of the Health IT Marketing and PR conference we announced. It’s been interesting to see people’s reaction to the conference. So many marketing and PR people are used to going to conferences, but they’re always going there to sell their products. It seems that they’ve rarely gone to a conference where they go to learn. It’s such a change in what the word “conference” usually means to them. By the way, the conference is coming together very nicely. It’s going to be a great event.


I love the concept of a hospital without walls. This is happening. A little slower than I’d like, but we’re getting there in a lot of areas. Of course, this will never replace hospitals, but it will be a great complement to hospitals.

Healthcare Analytics Project Works To Predict Preterm Birth

Posted on August 12, 2013 | Written By

Anne Zieger is veteran healthcare consultant and analyst with 20 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. Contact her at @ziegerhealth on Twitter or visit her site at Zieger Healthcare.

A large northern Virginia hospital and a Massachusetts analytics firm are partnering to see if data mined from EMRs can better predict the risk of preterm live birth.

The Inova Translational Medicine Institute at Virginia’s Inova Fairfax Hospital is working with Cambridge, MA-based analytics firm GNS Healthcare to create and commercialize computer models to predict the risk of preterm birth, reports Healthcare IT News. The two are using next-generation genomic sequencing technology and EMR data to build the models.

The models will be built using ITMI’s large database, which is stocked with data on both normal and preterm family cohorts. GNS will then attempt to link genetic and molecular factors with clinical data and health outcomes, Healthcare IT News said.

Once created, GNS and ITMI will license the models and software — as well as optional access to underlying ITMI data — to academic researchers, health systems and pharma/biotech companies. The ITMI database includes whole genome sequencing (SNP, CNV, SV), RNAseq expression, CpG methylation, proteomic, metabolomic, imaging, EMR, clinical phenotypes and patient survey data for over 2,400 individuals, Healthcare IT News reports.

The two partners are attacking a large problem. As Healthcare IT News notes, 12 percent of babies born in the U.S. are delivered at less than 37 weeks gestation, which causes nearly 10,000 deaths per year and costs $28 billion annually.  Researchers suspect that genetic factors help to prompt preterm birth, though no specific genes have been identified to date.

But there are many more problems to take on using this approach, and translational medicine projects of this kind are popping up nationally. For example, New York’s Mount Sinai Medical Center recently launched a new program designed to link information stored in the EMR with genetic information provided by patients. As of May, 25,000 patients had signed up for the biobanking program.

I believe this is just the tip of the iceberg. Using EMR data with genomic information is a very logical way to move further in the direction of personalized medicine. I’m eager to see other academic medical centers and hospitals jump in!

Is Skinny Data Harder Than Big Data?

Posted on May 24, 2013 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network.

On my post about Improving the Quality of EHR data for Healthcare Analytics, Glenn made a really great comment that I think is worth highlighting.

Power to change outcomes starts with liberating the data. Then transforming all that data into information and finally into knowledge. Ok – Sorry, that’s probably blindingly obvious. But skinny-data is a good metaphor because you don’t need to liberate ALL the data. And in fact the skinny metaphor covers what I refer to as the data becoming information part (filter out the noise). Selective liberation and combination into a skinny warehouse or skinny data platform is also manageable. And then build on top of that the analytics that release the knowledge to enable better outcomes. Now …if only all those behemoth mandated products would loosen up on their data controls…

His simple comment “filter out the noise” made me realize that skinny data might actually be much harder to do than big data. If you ask someone to just aggregate all the data, that is generally a pretty easy task. Once you start selecting the data that really matters, it becomes much harder. This is likely why so many enterprise data warehouses sit there, basically idle. Knowing which data is useful, making sure it is collected in a useful way, and then putting that data to use is much harder than just aggregating everything.
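Glenn’s “filter out the noise” step is trivial to code and hard to decide. A toy sketch (every field name and value below is made up): the one line of selection logic hides the real work of knowing which fields actually answer the question.

```python
# "Big" records as they might come out of an EHR extract
# (fields and values invented for illustration).
big_rows = [
    {"mrn": "001", "a1c": 9.1, "last_visit": "2013-04-02",
     "fax_number": "555-0100", "template_id": 17, "note_text": "stable"},
    {"mrn": "002", "a1c": 6.4, "last_visit": "2013-05-11",
     "fax_number": "555-0101", "template_id": 9, "note_text": "improving"},
]

# The "skinny" step: keep only the fields that answer one specific question
# (here, diabetic control). Choosing this tuple is the hard, human part.
SKINNY_FIELDS = ("mrn", "a1c", "last_visit")

def skinny(rows, fields):
    """Project each record down to the chosen fields."""
    return [{f: row[f] for f in fields} for row in rows]

for row in skinny(big_rows, SKINNY_FIELDS):
    print(row)
```

The projection is one comprehension; deciding that a1c is signal and template_id is noise is the data governance problem the rest of this post describes.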

Dana Sellers commented on this in a Hospital EHR and Healthcare Analytics video interview I did (the whole video has some great insights). She said that data governance is going to be an important challenge going forward, defining data governance as making sure you’re collecting data in a way that you know what that data really means and how it can be used in the future. That’s a powerful concept, and one that most people haven’t dug into very much. They’re going to have to if they want to start using their data for good.