
Using Healthcare Analytics to Achieve Strong Financial Performance

Posted on September 25, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Everyone is talking about analytics, but I’ve been looking for the solutions that take analytics and package it nicely. This is what I hoped for when I found this whitepaper called How Healthcare Providers Can Leverage Advanced Analytics to Achieve Strong Financial Performance. This is a goal that I think most of us in healthcare IT would like to achieve. We want healthcare providers to be able to leverage analytics to improve their business.

However, this illustration from the whitepaper shows exactly why we’re not seeing the results we want from our healthcare analytics efforts:
Advanced Analytics Impact on Healthcare

That’s a complex beast if I’ve ever seen one. Most providers I talk to want the results this chart promises, but they want them to just happen. They want all the back-end processing of data to happen inside a black box: they just want to feed in data as they always have and get the results spit out to them in a format they can use.

This is the challenge of the next century of healthcare IT. EHR is just the first step in the process of getting data. Now we have the hard work of turning that data into something more useful than the paper chart provided.

The whitepaper does suggest these three steps we need to take to get value from our analytics efforts:
1. Data capture, storage, and access
2. Big data and analytics
3. Cognitive computing

The whitepaper discusses all three of these steps in more detail. However, it’s very clear that most organizations are still at step 1, with only a few starting to dabble in step 2. Some might see this as frustrating or depressing. I see it as exciting, since it means the best uses of healthcare IT are still to come. But we’re going to need these solutions to be packaged in a really easy-to-use way. Otherwise, no one will adopt them.

Apervita Creates Health Analytics for the Millions

Posted on January 9, 2015 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

Health officials are constantly talking up the importance of clinical decision support, more popularly known now as evidence-based medicine. We’re owning up to the awkward little fact–which really should embarrass nobody–that most doctors lack expertise on many of the conditions they encounter and can’t read the thousands of relevant studies published each year. New heuristics are developed all the time for things such as predicting cardiac arrest or preventing readmissions after surgery. But most never make their way into the clinic.

Let’s look at what has to happen before doctors and patients can benefit from a discovery:

  1. The researcher has to write a paper with enough detail to create a working program from the heuristic, and has to publish the paper, probably in an obscure journal.

  2. A clinician or administrator has to find the article and line up staff to write and thoroughly test a program.

  3. If the program is to be used outside the hospital where it was created, it has to be disseminated. The hospital is unlikely to have an organization set up to package and market the program. Even if it is simply put out for free use, other institutions have to learn about it and compile it to work on their systems, in order for it to spread widely. Neither the researcher nor the hospital is likely to be compensated for the development of the program.

  4. The program has to be integrated into the doctor’s workflow, by being put on a menu or generating an alert.

Evidence-based medicine, therefore, is failing to tap a lot of resources that could save lives. A commonly cited observation is that research findings take 17 years to go into widespread practice. That’s 17 years of unnecessary and costly suffering.

I have often advocated for better integration of analytics into everyday medical practice, and I found a company called Apervita (originally named Pervasive Health) that jumps off in the right direction. Apervita, which announced a Series A round of funding on January 7, also has potential users outside of clinical settings. Pharma companies can use it to track adverse drug events, while payers can use it to predict fraud and risks to patients. There is not much public health data in the platform yet, but they’re working on it. For instance, the Leapfrog Group has published hospital safety information through the platform, and Diameter Health provides an all-cause 30-day readmissions prediction for all non-maternal, non-pediatric hospitalizations.

Here’s how the sequence of events I laid out before would go using Apervita:

  1. The researcher implements her algorithm in Python, chosen because Python is easy for non-programmers to learn and is consequently one of the most popular programming languages, particularly in the sciences. Apervita extends Python with convenience functions, such as RangeCompute, and with tables that let you compute with coefficients, and presents these through an IDE. (A minimal sketch of the kind of analytic a researcher might publish appears after this list.)

  2. The researcher publishes the analytic on the Apervita platform, along with a description and payment terms. Thus, the researcher derives some income from the research and has more motivation to offer the analytic publicly. Conversely, the provider pays only for usage of the analytic, and does not have to license or implement a new software package.

  3. Clinicians search for relevant analytics and upload data to generate reports at a patient or population level. Data in popular formats such as Excel or comma-separated values (CSV) files can be uploaded manually, while programmers can automate data exchange through a RESTful web service, currently the most popular way of exchanging data between cooperating programs (a hypothetical sketch of such an upload appears below). Rick Halton, co-founder and Chief Marketing Officer of Apervita, said they are working on support for HL7’s CCD and are interested in Blue Button+, although they are not ready yet to support it.

  4. Clinicians can also make the results easy to consume through personalized dashboards (web pages showing visualizations and current information) or by triggering alerts. A typical dashboard for a hospital administrator might show a graphical thermometer indicating safety rankings at the hospital, along with numbers indicating safety grades. Each department or user could create a dashboard showing exactly what a clinician cares about at the moment–a patient assessment during an admission, or statistics needed for surgical pre-op, for instance.

  5. Apervita builds in version control, and can automatically update user sites with corrections or new versions.
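
To make step 1 a bit more concrete, here is a minimal sketch of the kind of heuristic a researcher might publish as an analytic: a Modified Early Warning Score (MEWS) style calculator written as a plain Python function. The thresholds are paraphrased from the MEWS literature and the function names and input format are my own illustration; Apervita’s actual helpers, such as RangeCompute and its coefficient tables, are not reproduced here.

```python
# Illustrative sketch only: a MEWS-style early warning score as a plain
# Python function, the kind of heuristic a researcher might publish as an
# analytic. Thresholds are paraphrased from the MEWS literature and should
# be verified against the original publication before any real use.

def score_band(value, bands):
    """Return the points for the first (low, high, points) band containing value; 0 if none match."""
    for low, high, points in bands:
        if low <= value <= high:
            return points
    return 0

def mews(respiratory_rate, heart_rate, systolic_bp, temperature_c, avpu):
    """Compute a MEWS-style score from routine vital signs.

    avpu: one of 'alert', 'voice', 'pain', 'unresponsive'.
    """
    score = 0
    score += score_band(respiratory_rate, [(0, 8, 2), (9, 14, 0), (15, 20, 1),
                                           (21, 29, 2), (30, 999, 3)])
    score += score_band(heart_rate, [(0, 39, 2), (40, 50, 1), (51, 100, 0),
                                     (101, 110, 1), (111, 129, 2), (130, 999, 3)])
    score += score_band(systolic_bp, [(0, 70, 3), (71, 80, 2), (81, 100, 1),
                                      (101, 199, 0), (200, 999, 2)])
    score += score_band(temperature_c, [(0, 34.9, 2), (35.0, 38.4, 0), (38.5, 99, 2)])
    score += {'alert': 0, 'voice': 1, 'pain': 2, 'unresponsive': 3}[avpu]
    return score

# Example: a patient with mildly abnormal vitals.
print(mews(respiratory_rate=22, heart_rate=105, systolic_bp=95,
           temperature_c=38.6, avpu='alert'))  # -> 2 + 1 + 1 + 2 + 0 = 6
```

In Apervita’s model, a function along these lines would be written in the platform’s IDE and published with a description and payment terms, as described in step 2.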

I got a demo of Apervita and found the administration pretty complex, but this seems to be a result of its focus on security and the many options it offers large enterprises to break staff into groups or teams. The bottom line is that Apervita compresses the difficult processes required to turn research into practice and offers them as steps performed through a Web interface or easy programming. Apervita claims to have shown that one intern can create as many as 50 health analytics in one week on their platform, working just from the articles in journals and web resources.
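
For step 3 above, automating the data upload through a RESTful web service might look roughly like the following. The endpoint URL, authorization header, and JSON field names here are hypothetical placeholders rather than Apervita’s documented API; only the general pattern of an authenticated HTTPS POST of patient records is the point.

```python
# Hypothetical sketch of pushing patient records to an analytic over REST.
# The URL, token header, and payload fields are illustrative placeholders,
# not Apervita's actual API.
import requests

API_BASE = "https://analytics.example.com/v1"   # placeholder endpoint
API_TOKEN = "..."                                # issued by the platform

records = [
    {"patient_id": "123", "respiratory_rate": 22, "heart_rate": 105,
     "systolic_bp": 95, "temperature_c": 38.6, "avpu": "alert"},
]

response = requests.post(
    f"{API_BASE}/analytics/mews/run",
    json={"records": records},
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=30,
)
response.raise_for_status()

# Print one score per uploaded record.
for result in response.json().get("results", []):
    print(result["patient_id"], result["score"])
```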

The platform encrypts web requests and is HIPAA-compliant. Analytics and dashboards can be displayed off-platform, and the platform has been integrated with at least one EHR (OpenMRS).

Always attuned to the technical difficulties of data use, I asked Halton how the users of Apervita analytics could make sure their data formats and types match the formats and types defined by the people who created the analytics. Halton said that the key was the recognition of different ontologies, and the ability to translate between them using easy-to-create “codesets.”

An ontology is, in general, a way of representing data and the relationships between pieces of data. SNOMED and ICD are examples of common ontologies in health care. An even simpler ontology might simply be a statement that units of a particular data field are measured in milliliters. Whether simple or complex, standard or custom-built, the ontology is specified by the creator of an analytic. If the user has data in a different ontology, a codeset can translate between the two.
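
As a rough illustration of the idea (mine, not Apervita’s implementation), a codeset can be thought of as a mapping from the user’s local codes and units to the codes and units the analytic declares. The code values and the liters-to-milliliters conversion below are made up for the example; real codesets would target ontologies such as SNOMED, ICD, or LOINC.

```python
# Illustrative codeset: translate local codes and units into the ontology
# an analytic declares. The code values and conversion are invented for
# this example only.

CODE_MAP = {
    "LOCAL_URINE_OUT": "URINE_OUTPUT_24H",    # local code -> analytic's code
}

UNIT_CONVERSIONS = {
    ("L", "mL"): lambda v: v * 1000.0,        # liters -> milliliters
    ("mL", "mL"): lambda v: v,                # already in the expected unit
}

def translate(record):
    """Rewrite one observation into the analytic's expected ontology."""
    code = CODE_MAP[record["code"]]
    convert = UNIT_CONVERSIONS[(record["unit"], "mL")]
    return {"code": code, "value": convert(record["value"]), "unit": "mL"}

print(translate({"code": "LOCAL_URINE_OUT", "value": 1.2, "unit": "L"}))
# -> {'code': 'URINE_OUTPUT_24H', 'value': 1200.0, 'unit': 'mL'}
```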

As an example of Apervita’s use, a forward prediction algorithm developed by Dr. Dana Edelson and others from the University of Chicago Medical Center can predict cardiac arrests better than the commonly used VitalPAC Early Warning Score (ViEWS) or Modified Early Warning Score (MEWS). Developed from a dataset of over 250,000 patient admissions across five hospitals, “eCART” (electronic Cardiac Arrest Triage) can identify high-risk hospital ward patients and improve ICU triage decisions, often as much as 48 hours in advance.

The new funding will allow Apervita to make its interface even easier for end-users, and to solicit algorithms from leading research institutions such as the Mayo Clinic.

Halton heralds Apervita as a “community” for health care analytics for authors and providers. Not only can the creators of analytics share them, but providers can create dashboards or other tools of value to a wide range of colleagues, and share them. I believe that tools like Apervita can bridge the gap between the rare well-funded health clinic with the resources to develop tools, and the thousands of scattered institutions struggling to get the information that will provide better care.

Which Comes First in Accountable Care: Data or Patients?

Posted on September 30, 2014 | Written By

Andy Oram is an editor at O'Reilly Media.

The headlines are stark and accusatory. “ACOs’ health IT capabilities remain rudimentary.” “ACOs held back by poor interoperability.” But a recent 19-page survey released by the eHealth Initiative tells two stories about Accountable Care Organizations–and I find the story about interoperability less compelling than another one that focuses on patient empowerment.
Read more…

Could Population Health Be Considered Discrimination?

Posted on August 19, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network.

Long-time reader of my site, Lou Galterio with the SunCoast RHIO, sent me a really great email with a fascinating question:

Are only the big hospitals who can afford the very expensive analytics pop health programs going to be allowed to play because only they can afford to and what does that do to the small hospital and clinic market?

I think this is a really challenging question. Let’s assume for a moment that population health programs are indeed a great way to improve the healthcare we provide to patients and also an effective way to lower the cost of that care. Unfortunately, Lou is right that many of these programs require a big investment in technology and processes to make them a reality. Does that mean that, as these population health programs progress, they will by their nature discriminate against the smaller hospitals that don’t have the money to invest in them?

I think the simple answer is that it depends. We’re quickly moving to a reimbursement model (ACOs) which I consider to be a form of population health management. Depending on how those programs evolve, it could become almost impossible for the small hospital or small practice to survive. The rules could, however, take this into account and make room for smaller hospitals. Plus, most smaller hospitals and healthcare organizations can see this coming and realize that they need to align themselves to survive.

The other side of the discrimination coin comes when you start talking about the patient populations that organizations want to include as one of their “covered lives.” When the government talks about population health, they mean the entire population. When you start paying organizations based on the health of their patient population, it changes the dynamic of who you want to include in your patient population. Another possible opportunity for discrimination.

Certainly there are ways to avoid this discrimination. However, if we’re not thoughtful in our approach to how we design these population health and ACO programs, we could run into these problems. The first step is to realize the potential issues. Now, hopefully we can think about them going forward.

EMR for Analytics, HIT Marketing and PR Junkies, and Hospitals Without Walls

Posted on January 5, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network.


Jennifer is highlighting how challenging it is to get data out of an EHR in order to do healthcare analytics. This is certainly an issue and a challenge. Just as much of a challenge, though, is the integrity of the data that’s entered into the EHR.


I love Beth’s description of the Health IT Marketing and PR conference we announced. It’s been interesting to see people’s reactions to the conference. So many marketing and PR people are used to going to conferences, but they’re always going there to sell their products. It seems they’ve rarely gone to a conference in order to learn. It’s such a change in what the word “conference” usually means to them. By the way, the conference is coming together very nicely. It’s going to be a great event.


I love the concept of a hospital without walls. This is happening, a little slower than I’d like, but we’re getting there in a lot of areas. Of course, this will never replace hospitals, but it will be a great complement to them.

Healthcare Analytics Project Works To Predict Preterm Birth

Posted on August 12, 2013 | Written By

Anne Zieger is a veteran healthcare consultant and analyst with 20 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. Contact her at @ziegerhealth on Twitter or visit her site at Zieger Healthcare.

A large northern Virginia hospital and a Massachusetts analytics firm are partnering to see if data mined from EMRs can better predict the risk of preterm live birth.

The Inova Translational Medicine Institute (ITMI) at Virginia’s Inova Fairfax Hospital is working with Cambridge, MA-based analytics firm GNS Healthcare to create and commercialize computer models to predict the risk of preterm birth, reports Healthcare IT News. The two are using next-generation genomic sequencing technology and EMR data to build the models.

The models will be built using ITMI’s large database, which is stocked with data on both normal and preterm family cohorts. GNS will then attempt to link genetic and molecular factors with clinical data and health outcomes, Healthcare IT News said.

Once created, GNS and ITMI will license the models and software — as well as optional access to underlying ITMI data — to academic researchers, health systems and pharma/biotech companies. The ITMI database includes whole genome sequencing (SNP, CNV, SV), RNAseq expression, CpG methylation, proteomic, metabolomic, imaging, EMR, clinical phenotypes and patient survey data for over 2,400 individuals, Healthcare IT News reports.

The two partners are attacking a large problem. As Healthcare IT News notes, 12 percent of babies born in the U.S. are delivered at less than 37 weeks gestation, which causes nearly 10,000 deaths per year and costs $28 billion annually.  Researchers suspect that genetic factors help to prompt preterm birth, though no specific genes have been identified to date.

But there are many more problems to take on using this approach, and translational medicine projects of this kind are popping up nationally. For example, New York’s Mount Sinai Medical Center recently launched a new program designed to link information stored in the EMR with genetic information provided by patients. As of May, 25,000 patients had signed up for the biobanking program.

I believe this is just the tip of the iceberg. Using EMR data with genomic information is a very logical way to move further in the direction of personalized medicine. I’m eager to see other academic medical centers and hospitals jump in!

Is Skinny Data Harder Than Big Data?

Posted on May 24, 2013 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network.

On my post about Improving the Quality of EHR data for Healthcare Analytics, Glenn made a really great comment that I think is worth highlighting.

Power to change outcomes starts with liberating the data. Then transforming all that data into information and finally into knowledge. Ok – Sorry, that’s probably blindingly obvious. But skinny-data is a good metaphor because you don’t need to liberate ALL the data. And in fact the skinny metaphor covers what I refer to as the data becoming information part (filter out the noise). Selective liberation and combination into a skinny warehouse or skinny data platform is also manageable. And then build on top of that the analytics that release the knowledge to enable better outcomes. Now …if only all those behemoth mandated products would loosen up on their data controls…

His simple comment “filter out the noise” made me realize that skinny data might actually be much harder to do than big data. If you ask someone to just aggregate all the data, that’s generally a pretty easy task. Once you start selecting the data that really matters, it becomes much harder. This is likely why so many Enterprise Data Warehouses sit there basically idle. Knowing which data is useful, making sure it is collected in a useful way, and then putting that data to use is much harder than just aggregating all of it.
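
As a rough sketch of what a “skinny” extract looks like in practice, the example below pulls only a handful of fields tied to one question (here, diabetes-related readmissions) out of a wide EHR export instead of warehousing every column. The file name and column names are hypothetical.

```python
# Sketch of a "skinny data" extract: keep only the fields tied to a known
# question instead of aggregating everything. File and column names are
# hypothetical examples, not a real EHR export format.
import pandas as pd

# A wide EHR export might have hundreds of columns; we only need a few.
KEEP = ["patient_id", "discharge_date", "primary_dx", "hba1c", "readmitted_30d"]

wide = pd.read_csv("ehr_export.csv", usecols=KEEP, parse_dates=["discharge_date"])

# Filter out the noise: drop rows where the fields we care about are missing
# or obviously invalid, rather than loading them "just in case".
skinny = wide.dropna(subset=["hba1c", "readmitted_30d"])
skinny = skinny[(skinny["hba1c"] > 3) & (skinny["hba1c"] < 20)]

skinny.to_csv("skinny_warehouse/diabetes_readmit.csv", index=False)
```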

Dana Sellers commented on this in the Hospital EHR and Healthcare Analytics video interview I did (the whole video has some great insights). She said that data governance is going to be an important challenge going forward, and she defined data governance as making sure you’re collecting data in a way that you know what it really means and how it can be used in the future. That’s a powerful concept, and one that most people haven’t dug into very much. They’re going to have to if they want to start using their data for good.