
How Quickly Can We Analyze Health IT Data?

Posted on October 9, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 15 blogs containing almost 6000 articles with John having written over 3000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 13 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

While at the AHIMA Annual convention, I had a chance to sit down with Dr. Jon Elion, President and CEO of ChartWise, where we had a really interesting discussion about healthcare data. You might remember this video interview of Dr. Elion that I did a few years back. He’s a smart man with some interesting insights.

In our discussion, Dr. Elion walked me through an oft-repeated point about data warehouses: most contain data that's a day (or more) old, because most warehouses run their data-load function as a nightly batch. While I think this is beginning to change, it's still true for many data warehouses. There's good reason why the export to a data warehouse needs to occur. An EHR (or other IT system) is a transactional system built on a transactional database, which makes it a poor fit for serious data analysis. Hence the need to move the data from the transactional system to a data store designed for crunching data. Plus, most hospitals also combine data from a wide variety of systems into their data warehouse.
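The batch model described above can be sketched in a few lines. This is a toy illustration, not any vendor's actual implementation; the table names and sample rows are invented for the example:

```python
import sqlite3

# Transactional (OLTP) side: normalized, optimized for many small writes.
ehr = sqlite3.connect(":memory:")
ehr.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, patient_id TEXT, drug TEXT, ordered_at TEXT)")
ehr.executemany(
    "INSERT INTO orders (patient_id, drug, ordered_at) VALUES (?, ?, ?)",
    [("p1", "aspirin", "2014-10-08T09:00"),
     ("p2", "heparin", "2014-10-08T21:30")],
)
ehr.commit()

# Analytical side: a denormalized copy, rebuilt by the batch job.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE fact_orders (patient_id TEXT, drug TEXT, ordered_at TEXT)")

def nightly_load():
    """Full reload: everything the warehouse knows is as of last night's run."""
    rows = ehr.execute("SELECT patient_id, drug, ordered_at FROM orders").fetchall()
    warehouse.execute("DELETE FROM fact_orders")
    warehouse.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", rows)
    warehouse.commit()

nightly_load()
loaded = warehouse.execute("SELECT COUNT(*) FROM fact_orders").fetchone()[0]
print(loaded)  # 2
```

The point of the sketch is the `nightly_load()` call: until it runs again, every report against `fact_orders` is answering yesterday's question.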

Dr. Elion then told me how they'd worked hard to change this model: their ChartWise system has been able to update a hospital's data warehouse (I think they may call it something different) every 5 minutes. Think about how much more you can do with 5-minute-old data than with day-old data. It makes a huge difference.
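One common way to get from nightly loads to 5-minute freshness is a micro-batch with a "high-water mark": each run copies only the rows modified since the previous run. Here's a minimal sketch of that pattern; the schema and timestamps are invented, and ChartWise's actual mechanism wasn't described in our conversation:

```python
from datetime import datetime

# Hypothetical change feed from a transactional EHR, keyed by last-modified time.
events = [
    {"patient_id": "p1", "vital": "heart_rate", "value": 88,
     "updated_at": datetime(2014, 10, 9, 12, 0)},
    {"patient_id": "p2", "vital": "heart_rate", "value": 131,
     "updated_at": datetime(2014, 10, 9, 12, 3)},
]

warehouse = []
watermark = datetime(2014, 10, 9, 11, 58)  # high-water mark from the previous sync

def micro_batch_sync(now):
    """Copy only rows modified since the last run, then advance the watermark."""
    global watermark
    fresh = [e for e in events if e["updated_at"] > watermark]
    warehouse.extend(fresh)
    watermark = now
    return len(fresh)

copied = micro_batch_sync(datetime(2014, 10, 9, 12, 5))
print(copied)  # 2
```

Because each run moves only the delta instead of reloading everything, the job is cheap enough to schedule every few minutes.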

Data that’s this fresh becomes actionable data. A hospital’s risk management department could leverage this data to identify at risk patients that need a little extra attention. Unfortunately, if that data is a day old, it might be too late for you to be able to act and prevent the issue from getting worse. That’s just one simple example of how the fresh data can be analyzed and improve the care a patient receives. I’m sure you can come up with many others.

No doubt there are a bunch of other companies working to solve this problem as well. Day-old healthcare data certainly still has value, but fresh data in your data warehouse is far more actionable. I'm excited to see what really smart people will be able to do with all this fresh data in their data warehouses.

Digital Health: How to Make Every Clinician the Smartest in the Room

Posted on August 21, 2014 | Written By

The following is a guest blog post by Dr. Mike Zalis, practicing MGH Radiologist and co-founder of QPID Health.
Remember the “World Wide Web” before search engines? Less than two decades ago, you had to know exactly what you were looking for and where it was located in order to access information. There was no Google—no search engine that would find the needle in the haystack for you. Curated directories of URLs were a start, but very quickly failed to keep up with the explosive growth of the Web. Now our expectation is that we will be led down the path of discovery simply by entering what’s on our mind into a search box. Ill-formed, half-baked questions quickly crystallize into a line of intelligent inquiry. Technology assists us by bringing the experience of others right to our screens.

Like the Internet, EHRs are a much-needed web of information whose time has come. For a long time, experts preached the need to migrate from paper-based documentation systems (aka old-school charts) to electronic records. Hats off to the innovators and the federal government who’ve made this migration a reality. We’ve officially arrived: the age of electronic records is here. A recent report in Health Affairs showed that 58.9% of hospitals have now adopted either a basic or comprehensive EHR, a four-fold increase since 2010, and the number of adoptions is still growing. So, EHRs are here to stay. Now we’re left to answer the question: what’s next? How can we make this data usable in a timely, efficient way?

My career as a radiologist spanned a similar infrastructure change, and it gave me perspective on what many practitioners need, what I need, to make the move to an all-electronic patient record most useful: the ability to quickly get my hands on the patient’s current status and relevant past history at the point of care and apply that intelligence to make the best decision possible. In addition to their transactional functions (e.g., order creation), EHRs are terrific repositories of information; they’ve created the means but not the end. Today’s EHRs are just that: repositories. They’re designed for storage, not discovery.

Twenty years ago, we radiologists went through a similar infrastructure transition in the move to the PACS systems that now form the core of all modern medical imaging. Initially, these highly engineered systems attempted to replicate the storage, display, and annotation functions that radiologists had until then performed on film, and they were clunky and in many ways inefficient to use. It wasn’t until several years after that initial digital transition that technological improvements yielded the value-adding capabilities that have since dramatically improved the capability, efficiency, and value of imaging services.

Something similar is happening to clinicians practicing in the age of EHRs. Publications from NEJM to InformationWeek have covered the lack of usability and the increased administrative burden. The next frontier in Digital Health is systems that find and deliver what you didn’t even know you were looking for: systems that let doctors merge clinical experience with technology that is tireless and leaves no stone unturned, and that let the less-experienced clinician benefit from the know-how of the more experienced.

To me, Digital Health means making every clinician the smartest in the room. It’s filtering the right information, organized fluidly according to the clinical concepts and complex guidelines that define best practice, to empower clinicians to best serve our patients. And when Digital Health matures, the technology won’t make us think less; it will let us think more, by thinking alongside us. For the foreseeable future, human experience, intuition, and judgment will remain pillars of excellent clinical practice. Digital tools that let us exercise those uniquely human capabilities more effectively and efficiently are key to delivering financially sustainable, high-quality care at scale.

At MGH, our team of clinical and software experts took it upon ourselves some seven years ago to make our EHR more useful in the clinical trenches. The first application we launched reduced utilization of radiology studies by making clinicians aware of prior exams, saving time and money for the system and avoiding unnecessary exposure for patients. Our solution also permitted a novel, powerful search across the entirety of a patient’s electronic health record, and this capability “went viral”: starting at MGH, the application moved across departments and divisions of the hospital. Basic EHR search is a commodity, and our system has evolved well beyond its early capabilities to become an intelligent concept service platform, empowering workflow improvements across a health care enterprise.

Now, when my colleagues move to other hospitals, they speak to how impossible it is to practice medicine without EHR intelligence—like suddenly being forced to navigate the Internet without Google again. Today at QPID Health, we are pushing the envelope to make it easy to find the Little Data about the patient that is essential to good care. Helping clinicians work smarter, not harder.

The reason I chose to become a physician was to help solve problems and deliver quality care—it’s immensely gratifying to contribute to a solution that allows physicians to do just that.

Dr. Mike Zalis is Co-founder and Chief Medical Officer of QPID Health, an associate professor at Harvard Medical School, and a board-certified radiologist serving part-time at Massachusetts General Hospital in Interventional Radiology. Mike’s deep knowledge of what clinicians need to practice most effectively and his ability to translate those needs into software solutions inform QPID’s development efforts. QPID software uses a scalable cloud-based architecture and leverages advanced concept-based natural language processing to extract patient insights from data stored in EHRs. QPID’s applications support decision making at the point of care as well as population health and revenue cycle needs.

Hospital M&A Cost Boosted Significantly By Health IT Integration

Posted on August 18, 2014 | Written By

Anne Zieger is a veteran healthcare consultant and analyst with 20 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. Contact her at @ziegerhealth on Twitter or visit her site at Zieger Healthcare.

Most of the time, hospital M&A is sold as an exercise in saving money by reducing overhead and leveraging shared strengths. But new data from PricewaterhouseCoopers suggests that IT integration costs can undercut that goal substantially. (It also makes one wonder how ACOs can afford to merge their health IT infrastructure well enough to share risk, but that’s a story for another day.)

In any event, the cost of integrating the IT systems of merging hospitals can add up to 2% to the facilities’ annual operating costs during the integration period, according to PricewaterhouseCoopers. That figure, which comes to $70,000 to $100,000 per bed over three to five years, is enough to reduce or even completely negate the benefits of doing some deals. And it clearly forces merging hospitals to think through their respective IT strategies far more thoroughly than they might have anticipated.

As if that stat weren’t bad enough, other experts feel that PwC is understating the case. According to Dwayne Gunter, president of Parallon Technology Solutions, who spoke to Hospitals & Health Networks magazine, IT integration costs can be much higher than PwC’s estimate predicts. “I think 2% is being very generous,” Gunter told the magazine. “For example, if the purchased hospital’s IT infrastructure is in bad shape, the expense of replacing it will raise costs significantly.”

Of course, hospitals have always struggled to integrate systems when they merge, but as PwC’s research notes, there’s a lot more to integrate these days, including not only core clinical and business operating systems but also EMRs, population health management tools and data analytics. (Given the extremely shaky state of cybersecurity in hospitals these days, merging partners had best feel out each other’s security systems very thoroughly as well, which obviously adds additional expense.) And what if the merging hospitals use different enterprise EMR systems? Do you rip and replace, integrate and pray, or do some mix of the above?

On top of all that, merging hospital systems have to make sure they have enough IT staffers available, or can contract for enough, to do a good job of the integration process. Given that in many hospitals IT leaders barely have enough staff members to get the minimum done, the merger partners will likely need costly consultants if they want to finish the process before the next millennium.

My best guess is that many mergers have failed to take this massive expense into account. The aftermath has got to be pretty ugly.

Big Data is Like Teenage Sex

Posted on November 5, 2013 | Written By John Lynn

Yes, that is a catchy headline, but if you’ve read me for any time you also know I love a good analogy. This one comes from Dan Ariely, as shared by Iwona during #cikm2013.

For those who can’t load the image it says:
Big data is like teenage sex:
everyone talks about it,
nobody really knows how to do it,
everyone thinks everyone else is doing it,
so everyone claims they are doing it…

As a big proponent of no sex before marriage, this is a little out there for me, but the analogy illustrates the point so well. In fact, I think this is why in healthcare we’re seeing a new line of smaller data projects with meaningful outcomes.

What I wish we could change is the final part. How about we all stop hiding behind what we are and aren’t doing? We’d all be better off being frank about our actual efforts. The fact is that many organizations aren’t doing anything with big data, and quite frankly they shouldn’t be. Kind of like how many teenagers shouldn’t be having sex.

Healthcare Big Data and Meaningful Use Challenges Video

Posted on October 2, 2013 | Written By John Lynn

This fall we decided to do a whole series of weekly video interviews with top healthcare IT thought leaders. Many of you may have come across our EHR video site and the Healthcare Scene YouTube channel where we host all of the videos. The next interview in that series is happening Thursday, October 3rd at 1:00 EST with Dr. Tom Giannulli, discussing the future of small physician practices. You can join us live or watch the recorded video after the event. Plus, you can see all the future interviews we have scheduled here.

Last week’s video interview was with Mandi Bishop, Principal at Adaptive Project Solutions and also a writer at EMR and HIPAA. Mandi does an amazing job sharing her insights into healthcare big data and the challenges of meaningful use. We also dig into EHR data sharing with insurance plans and ask Mandi whether meaningful use is completely devoid of value.

For those who missed the live interview, you can watch the recorded interview with Mandi Bishop embedded below.

Big Data Impacting Healthcare

Posted on July 19, 2013 | Written By John Lynn

The following is a guest post by Sarah E. Fletcher, BS, BSN, RN-BC, MedSys Group Consultant.
It is generally agreed that bigger is better.  When it comes to data, big data can be a challenge as well as a boon for healthcare.  As Meaningful Use drives electronic documentation and technologies grow to support it, big data is a reality that has to be managed to be meaningful.

Medical databases are growing to petabytes of data from any number of sources covering every aspect of a patient’s stay.  Hospitals can capture every medication, band-aid, or vital sign.  Imaging studies and reports are stored in imaging systems next to scanned documents and EKGs.

Each medication transaction includes drug, dose, and route details, which are sent to the dispensing cabinet.  The patient and medication can be scanned at the bedside and documentation added in real time.  Each step of the way is logged with a time stamp, including provider entry, pharmacist verification, and nurse administration.  One dose of medication generates dozens of individual data points.

All of this data is captured for each medication dose administered in a hospital, which can be tens of thousands of doses per month. Translate the extent of data captured to every patient transfer, surgery, or bandage, and the scope of the big data becomes clearer.
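The per-dose workflow described above can be sketched as a record type. This is a deliberately simplified assumption about the field set; a real medication administration record carries far more (lot numbers, cabinet transactions, co-signatures):

```python
from dataclasses import dataclass, fields
from datetime import datetime

@dataclass
class MedicationDose:
    """One administered dose; every field is a separately queryable data point."""
    patient_id: str
    drug: str
    dose: str
    route: str
    ordered_at: datetime       # provider entry
    verified_at: datetime      # pharmacist verification
    administered_at: datetime  # nurse scans patient and medication at the bedside

dose = MedicationDose(
    patient_id="p1", drug="heparin", dose="5000 units", route="subcutaneous",
    ordered_at=datetime(2013, 7, 19, 8, 0),
    verified_at=datetime(2013, 7, 19, 8, 12),
    administered_at=datetime(2013, 7, 19, 9, 1),
)

# Even this stripped-down record carries 7 data points per dose; multiply by
# tens of thousands of doses per month and the scale becomes clear.
print(len(fields(dose)))  # 7
```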

As Health Information Exchanges (HIEs) mature, hospitals will have access not just to their own patient data but to everyone else’s as well.  Personal health records (PHRs), maintained by patients themselves, may also feed big data, providing every mile run, every blood pressure or weight measured at home, and each medication taken.

One of the primary challenges with big data is that the clinicians who use the data do not speak the same language as the programmers who design the system and analyze the data.  Determining how much data should be displayed, and in what format, should be a partnership between the clinical and technical teams to ensure the clinical relevance of the data is maximized to improve patient outcomes.  Big data is a relatively new phenomenon, and data analysts able to manage these vast amounts of data are in short supply, especially those who understand clinical data needs.

Especially challenging is the mapping of data across disparate systems.  Much of the data are pooled into backend tables with little to no structure.  There are many different nomenclatures and databases used for diagnoses, terminology, and medications.  Ensuring that discrete data points pulled from multiple sources match in a meaningful way when the patient data are linked together is a programmatic challenge.
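The mapping problem can be shown in miniature: two hospitals code the same drug differently, and their records only line up once both are normalized to a shared vocabulary. The codes below are invented stand-ins, not real RxNorm or ICD identifiers:

```python
# Hypothetical crosswalk from site-specific codes to a shared standard.
LOCAL_TO_STANDARD = {
    ("site_a", "MED-00123"): "std:aspirin-81mg",
    ("site_b", "ASA81"):     "std:aspirin-81mg",
    ("site_a", "MED-00456"): "std:metformin-500mg",
}

def normalize(site, local_code):
    """Map a site-specific code to the shared standard, or None if unmapped."""
    return LOCAL_TO_STANDARD.get((site, local_code))

# Two differently coded records now match on the standard code:
print(normalize("site_a", "MED-00123") == normalize("site_b", "ASA81"))  # True

# Unmapped codes surface as gaps that need human curation:
print(normalize("site_b", "MED-00456"))  # None
```

The code is trivial; building and maintaining the crosswalk across dozens of nomenclatures is the programmatic challenge the paragraph above describes.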

Now that clinicians have the last thousand pulse measurements for a group of patients, what does one do with that?  Dashboards are useful for recent patient data, but how quickly they populate is critical for patient care.  Rendering this data requires stable wireless with significant bandwidth, processing power, and storage, all of which come at a cost, especially when privacy regulations must be met.

Likely the biggest challenge of all, and one often overlooked, is the human factor.  The average clinician does not know about technology; they know about patients.  The computer or barcode scanner is a tool to them just like an IV pump, glucometer, or chemistry analyzer.  If it does not work well for them consistently, in a timely and intuitive fashion, they will find ways around the system in order to care for their patients, not caring that it may compromise the data captured in the system.

Most people would point out that the last thousand measurements of anything is overkill for patient care, even if it were graphed to show a visual trend. There are some direct benefits of big data for the average clinician, such as being able to compare every recent vital sign, medication administration, and lab result on the fly.  That said, most of the benefit is indirect via health systems and health outcomes improvements.

The traditional paper method of auditing was to pull a certain number of random charts, often a small fraction of one percent of patient visits.  This gives an idea of whether certain data elements are being collected consistently, documentation completed, and quality goals met.  With big data and proper analytics, the ability exists to audit every single patient chart at any time.

The quality department may have reports and trending graphics to ensure their measures were met, not just for a percentage of a population, but each and every patient visit for as long as the data is stored.  This can be done by age, gender, level of care, and even by eye color, if that data is captured and the reports exist to pull it.
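The shift from sampled audits to whole-population audits is easy to illustrate. In this sketch the measure, the field names, and the grouping attribute are all invented, but the pattern (compute a rate over every chart, grouped by any captured attribute) is the point:

```python
from collections import defaultdict

# Toy chart data: once every visit is queryable, nothing is sampled.
charts = [
    {"visit_id": 1, "age_group": "65+", "flu_shot_documented": True},
    {"visit_id": 2, "age_group": "65+", "flu_shot_documented": False},
    {"visit_id": 3, "age_group": "<65", "flu_shot_documented": True},
    {"visit_id": 4, "age_group": "<65", "flu_shot_documented": True},
]

def compliance_by(charts, group_key, measure):
    """Rate at which `measure` is met, per value of `group_key`, over every chart."""
    met, total = defaultdict(int), defaultdict(int)
    for chart in charts:
        group = chart[group_key]
        total[group] += 1
        met[group] += bool(chart[measure])
    return {g: met[g] / total[g] for g in total}

print(compliance_by(charts, "age_group", "flu_shot_documented"))
# {'65+': 0.5, '<65': 1.0}
```

Swap in any other captured attribute (gender, level of care, even eye color) as `group_key` and the same report covers it.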

Researchers can use this data mining technique to develop new evidence to guide future care.  By reviewing the patients with the best outcomes in a particular group, correlations can be drawn, evaluated, and tested based on the data of a million patients.  Positive interventions discovered this way today can be turned into evidence-based practice tomorrow.

The sheer scope of big data is its own challenge, but the benefits have the potential to change healthcare in ways that have yet to be considered.  Big data comes from technology, but Meaningful Use is not about implementing technology.  It is about leveraging technology in a meaningful way to improve the care and outcomes of our patients.  This is why managing big data is so critical to the future of healthcare.

MedSys Group Consultant, Sarah E. Fletcher, BS, BSN, RN-BC has worked in technology for over fifteen years.  The last seven years have been within the nursing profession, beginning in critical care and transitioning quickly to Nursing Informatics.  She is a certified Nurse Informaticist and manages a regular Informatics Certification series for students seeking ANCC certification in Nursing Informatics.  Sarah currently works with MedSys Group Consulting supporting a multi-hospital system.

Is Skinny Data Harder Than Big Data?

Posted on May 24, 2013 | Written By John Lynn

On my post about Improving the Quality of EHR data for Healthcare Analytics, Glenn made a really great comment that I think is worth highlighting.

Power to change outcomes starts with liberating the data. Then transforming all that data into information and finally into knowledge. Ok – Sorry, that’s probably blindingly obvious. But skinny-data is a good metaphor because you don’t need to liberate ALL the data. And in fact the skinny metaphor covers what I refer to as the data becoming information part (filter out the noise). Selective liberation and combination into a skinny warehouse or skinny data platform is also manageable. And then build on top of that the analytics that release the knowledge to enable better outcomes. Now …if only all those behemoth mandated products would loosen up on their data controls…

His simple comment “filter out the noise” made me realize that skinny data might actually be much harder to do than big data. If you ask someone to just aggregate all the data, that’s a generally pretty easy task. Once you start selecting the data that really matters, it becomes much harder. This is likely why so many enterprise data warehouses sit there basically idle. Knowing which data is useful, making sure it’s collected in a useful way, and then putting that data to use is much harder than just aggregating everything.
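Glenn’s “selective liberation” idea fits in a few lines of code, which is exactly why the code isn’t the hard part. The record and field names below are invented for the example:

```python
# Sketch of "skinny" extraction: keep only the fields a specific question needs.
FULL_RECORD = {
    "patient_id": "p1", "a1c": 8.9, "measured_at": "2013-05-20",
    "insurance_plan": "plan-x", "admitting_clerk": "clerk-7", "room": "412B",
}

# Choosing this tuple well is the real work: which fields actually matter
# for tracking diabetes outcomes?
KEEP_FOR_DIABETES_TREND = ("patient_id", "a1c", "measured_at")

def skinny(record, keep):
    """Project a full record down to the fields that matter for one question."""
    return {k: record[k] for k in keep if k in record}

print(skinny(FULL_RECORD, KEEP_FOR_DIABETES_TREND))
# {'patient_id': 'p1', 'a1c': 8.9, 'measured_at': '2013-05-20'}
```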

Dana Sellers commented on this in the Hospital EHR and Healthcare Analytics video interview I did (the whole video has some great insights). She said that data governance is going to be an important challenge going forward, and she defined data governance as making sure you’re collecting data in a way that you know what it really means and how it can be used in the future. That’s a powerful concept and one that most people haven’t dug into very much. They’re going to have to if they want to start putting their data to good use.

Medical Apps, $21 Billion EMR Market, and Sick of EMR

Posted on April 21, 2013 | Written By John Lynn


This is a pretty interesting idea and another way to talk about subjects we’ve covered many times here. The idea of an app in this case is an app on top of EMR software. I call this making the Smart EMR, and it will likely come from these apps. The article is right that many data warehouses are clunky and don’t serve the doctors. In fact, there are very few data warehouses focused on doctors’ needs at all.


The last EMR incentive numbers I saw were at $10 billion. Does that mean the government has funded half of the market? These numbers are always a little fishy, but it’s interesting to consider how big the EMR market is.


I actually know a lot of doctors who love their EMR and wouldn’t practice medicine without one. What I think most doctors are tired of is all the government regulations. We shouldn’t confuse government regulations with EMR.

Healthcare Doesn’t Do Big Data Yet…It Does BI

Posted on April 15, 2013 | Written By John Lynn

It seems like healthcare big data is the topic du jour lately. Everyone seems interested in how they can tap into the big data in healthcare. I’m not sure what’s caused the flood of healthcare big data people. I expect that some of it comes from the rush of EHR implementations that have happened thanks in large part to the EHR incentive money. I imagine there’s a whole group of hospital CIOs wondering how they can leverage all of that EHR data to benefit their institution and patients.

I think it’s great that healthcare has finally seemed to realize that there’s a lot of value found in healthcare data. The problem is that in every other industry, what we call healthcare big data isn’t very big data at all. In fact, most other industries would describe most of the healthcare data efforts as pretty simple business intelligence. Yes, there are pockets of exceptions, but most of the data initiatives I’ve seen in healthcare don’t even approach the true meaning of the words big data.

I’m not saying that there’s anything wrong with this. In fact, I loved when I met with Encore Health Resources and they embraced the idea of “skinny” healthcare data. Maybe it was a way for them to market their product a little differently, but regardless of their intent they’re right that we’re still working on skinny data in healthcare. I’d much rather see a bunch of meaningful skinny data projects than a true healthcare big data project with no results.

Plus, I think this highlights the extraordinary opportunity that’s available to healthcare when it comes to data. If all we’re doing with healthcare data is BI, then that means there is still a wide open ocean of opportunity available for true big data efforts.

I think the biggest challenges we face are around data standards and data liquidity. Related to data standards is the quality of the data, but a standard can often help improve data quality. Plus, a standard can help make the data more liquid as well.

Yes, I’m sure the healthcare privacy experts are ready to raise the red flag of privacy when I talk about healthcare data liquidity. However, data liquidity and privacy can both be accomplished. Just give it time and wait for the healthcare data revolution to happen.

Big Data Analytics vs Focused Patient Analytics

Posted on December 18, 2012 | Written By John Lynn

One of the most common buzzwords in healthcare right now is “big data.” Everyone is talking about how to leverage big data in healthcare. There is little doubt that big data analytics opens up a whole list of opportunities for healthcare.

When it comes to big data analytics, most people see it as healthcare business intelligence. In other words, how do we take all the data from within the organization and leverage it to benefit the business. Or in the case of a health insurance company, how can we use all the healthcare data that’s available out there to benefit our business. This is really powerful stuff that can’t be ignored. A lot of money can be made/saved by a business that properly leverages the data it holds.

However, I think there’s another side of healthcare big data that doesn’t get nearly enough attention. Instead of calling it big data analytics, I like to call it focused patient analytics.

What is focused patient analytics? It’s where you take relatively small elements from big data that are focused on a specific patient. In aggregate, the data you get is relatively small, but when you consider that all of it is focused on one patient, it can be a significant amount of valuable data. Plus, it requires that all the healthcare big data silos be available to make this happen. Unfortunately, we’re not there yet, but we will get there.
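A small sketch makes the idea concrete: each silo holds only a thin slice about any one patient, and focused patient analytics gathers those slices into a single view. All the silo names and records below are invented, and the hard part in reality is the access to the silos, not this code:

```python
# Hypothetical silos that today rarely share data with the EHR.
ehr_notes = [{"patient_id": "p1", "note": "shortness of breath"}]
lab_results = [{"patient_id": "p1", "test": "BNP", "value": 910},
               {"patient_id": "p2", "test": "A1C", "value": 6.1}]
pharmacy_fills = [{"patient_id": "p1", "drug": "furosemide"}]

SILOS = {"ehr": ehr_notes, "labs": lab_results, "pharmacy": pharmacy_fills}

def focused_view(patient_id):
    """Pull one patient's records from every silo into a single small dataset."""
    return {name: [r for r in rows if r["patient_id"] == patient_id]
            for name, rows in SILOS.items()}

view = focused_view("p1")
print(sum(len(rows) for rows in view.values()))  # 3
```

Three records is a tiny dataset by big data standards, yet together they tell a clinical story about one patient that no single silo could.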

Imagine how much smarter you could make the EHR if it could tap into the various silos of healthcare data to create focused patient analytics. Unfortunately, we can’t even design this type of smart EHR software, because too much of the data is unavailable to EHR software. I love to think about the innovation that would be possible if there were a free flow of data to those who need it in healthcare.

Certainly there are plenty of security risks and privacy concerns to consider. However, we can’t let that challenge be an excuse for us not to create focused patient analytics that will benefit patients.