
Annual Evaluation of Health IT: Are We Stuck in a Holding Pattern? (Part 3 of 3)

Posted on April 15, 2015 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

The previous installments of this article covered major regulatory initiatives and standards projects. Some of the same questions have a direct impact on technological advances.

Medical Devices: Always With You, But Neither Here Nor There

One ad I saw compares a fitness device to a friend whispering in your ear wherever you go. Leaving aside control freak issues, what could be better for a modern patient with a condition that responds to behavior change than a personal device? Through such devices, we can implement a 24/7 cycle of medical care. We can also save enormous sums of money by treating the patient in his natural environment instead of a hospital or rehab facility.

The rapid spread of health devices was a foregone conclusion even before Apple thrust them into the mainstream with HealthKit. Last month’s launch of ResearchKit suggests that Apple will do the same for the big data revolution in health care championed by the Personal Genome Project, 23andMe (now back in business after being reined in by the FDA), PatientsLikeMe, and other pioneering organizations. The Apple Watch, an indulgence expected to grab the hearts of the affluent, might yet pull off the paradigm shift in digital interaction that Google Glass aimed for.

For these devices to make the leap from digital pets to real medical intervention, including a strengthening of the bond between clinicians and patients, they must satisfy stringent requirements for safety and accuracy. Current FDA regulations distinguish (in very rough terms–I am not a lawyer) between devices that make diagnoses or recommend treatments and devices that merely measure vital signs or deliver reminders. If you make a diagnosis or recommend a treatment, you must undergo a complex and expensive evaluation. People can also submit problems they find with your device to the FDA’s medical device database.
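That adverse-event data is public. As a minimal sketch of how a developer or patient advocate might look up complaints through the openFDA API (the brand name and printed fields below are illustrative assumptions, not a real product):

```python
# Minimal sketch: look up adverse-event reports for a device through
# the FDA's public openFDA API. The brand name is hypothetical, and
# the two fields printed are just a sample of what each report holds.
import requests

resp = requests.get(
    "https://api.fda.gov/device/event.json",
    params={
        "search": 'device.brand_name:"ExampleFitTracker"',  # hypothetical
        "limit": 3,
    },
    timeout=10,
)
resp.raise_for_status()
for report in resp.json().get("results", []):
    print(report.get("date_received"), report.get("event_type"))
```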

Safety, accuracy, and transparency are goals well worth pursuing. The problem is not the cost of certification techniques, but the vast gulf between the development model assumed by certification and the one followed by modern developers of both software and hardware.

Development methods nowadays are agile. Developers incrementally release versions of software or hardware and upgrade them every few months. But certification processes require retesting every time the smallest change is made. And that’s reasonable, because any tweak (even a configuration change out in the field) can cause a working device to fail. Such certifications work well for embedded systems in airplanes and nuclear facilities, and even for critical medical devices that may live in patients’ bodies for decades. But they slow innovation to a crawl and raise prices precipitously.

Oddly enough, the tension between agile development and certification affects medical devices and electronic health records (EHRs) equally, and EHRs are equally prone to errors or misleading interfaces. Yet medical devices are regulated while EHRs are not. This contradiction must be resolved–but perhaps not by dropping the anvil of safety certification on all software used in medicine. The FDA can search for a more supple regulatory process that blesses certain classes of hardware and software while allowing for variation within them, backed up by guidelines for robust development and testing.

The FDA understands that it is in an untenable situation but doesn’t know what to do. It has carved out certain devices and marked them for lower levels of scrutiny, such as devices that transfer or display data collected elsewhere. The FDA has also led a muddled discussion over a national “test bed” for medical devices. More regulatory clarity in the area of both devices and EHRs, along with a push by regulators and users for better development practices, could help the field take off and realize the promise of personal devices.

Conclusion

I’m excited about the possibilities of health IT, but concerned that the current environment is insufficiently friendly for its deployment. On top of all the other factors I’ve cited that hold back the field, consider the urgent shortage of health IT staff. Providers and development firms have been bidding up salaries to steal each other’s employees, and attempts to increase the pool have shown disappointing results.

What I hear is that IT experts would love to get into health care, knowing that it can help the public immensely as well as pay off financially. But they have balked at the technical and working conditions in the field: hidebound institutions, 50-year-old standards and tools, and of course the weight of standards and regulations to study.

How many of these topics will be covered at HIMSS? FHIR will be widely discussed, I know, and the buzz over Meaningful Use is always strong. The question is what will prod change in the system. Ultimately, it may come from a combination of consumer demand and regulatory pressure. Progress for the sake of progress has not been a prominent trait of health IT.

Health IT Security: What Can the Association for Computing Machinery (ACM) Contribute?

Posted on February 24, 2015 | Written By Andy Oram


A dazed awareness of security risks in health IT has bubbled up from the shop-floor administrators and conformance directors (who have always worried about them) to C-suite offices and the general public, thanks to a series of oversized data breaches that recently peaked in the Anthem Health Insurance break-in. Now the US Senate Health Committee is taking up security, explicitly referring to Anthem. The inquiry is extremely broad, though, promising to address “electronic health records, hospital networks, insurance records, and network-connected medical devices.”

FTC Gingerly Takes On Privacy in Health Devices (Part 2 of 2)

Posted on February 11, 2015 | Written By Andy Oram


The first part of this series laid out the difficulties of securing devices in the Internet of Things (particularly those used in the human body). Accepting that usability and security sometimes have to be traded off against one another, let’s look at how to make the decisions most widely acceptable to the public.

The recent FTC paper on the Internet of Things demonstrates that they have developed a firm understanding of the problems in security and privacy. For this paper, they engaged top experts who had seen what happens when technology gets integrated into daily life, and they covered all the issues I know of. As devices grow in sophistication and spread to a wider population, the kinds of discussion the FTC held should be extended to the general public.

For instance, suppose a manufacturer planning a new way of tracking people–or a new use for their data–convened some forums in advance, calling on potential users of the device to discuss the benefits and risks. Collectively, the people most affected by the policies chosen by the manufacturer would determine which trade-offs to adopt.

Can ordinary people off the street summon enough concern for their safety to put in the time necessary to grasp the trade-offs? We should try asking them–we may be pleasantly surprised. Here are some of the issues they would need to consider.

  • What can malicious viewers determine from data? We all may feel nervous about our employer learning that we went to a drug treatment program, but how much might the employer learn just by knowing we went to a psychotherapist? We now know that many innocuous bits of data can be combined to show a pattern that exposes something we wished to keep secret.

  • How guarded do people feel about their data? This depends largely on the answer to the previous question–it’s not so much the individual statistics reported, but the patterns that can emerge.

  • What data does the device need to collect to fulfill its function? If the manufacturer, clinician, or other data collector gathers up more than the minimal amount, how are they planning to use that data, and do we approve of that use? This is an ethical issue faced constantly by health care researchers, because most patients would like their data applied to finding a cure, but both the researchers and the patients have trouble articulating what’s kosher and what isn’t. Even collecting data for marketing purposes isn’t necessarily evil. Some patients may be willing to share data in exchange for special deals.

  • How often do people want to be notified about the use of their data, or asked for permission? Several researchers are working on ways to let patients express approval for particular types of uses in advance.

  • How long is data being kept? Most data users, after a certain amount of time, want only aggregate data, which is supposedly anonymized. Are they using well-established techniques for anonymizing the data? (Yes, trustworthy techniques exist. Check out a book I edited for my employer, Anonymizing Health Data.) A minimal sketch of one such safeguard follows this list.
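To make that last point concrete, here is a minimal sketch of one well-established safeguard: publish only aggregate counts, and suppress any group smaller than a threshold k so that a rare combination of traits cannot single anyone out. The records, field names, and threshold are assumptions for illustration; real de-identification (as described in Anonymizing Health Data) involves much more.

```python
# Minimal sketch: publish only aggregate counts, suppressing any
# group smaller than k so that a rare combination of traits cannot
# single out one patient. Records and field names are hypothetical.
from collections import Counter

K = 5  # smallest group size we are willing to publish

records = [
    {"age_band": "40-49", "zip3": "021", "diagnosis": "diabetes"},
    {"age_band": "40-49", "zip3": "021", "diagnosis": "diabetes"},
    {"age_band": "70-79", "zip3": "010", "diagnosis": "rare disorder"},
    # ... thousands more rows in practice ...
]

counts = Counter(
    (r["age_band"], r["zip3"], r["diagnosis"]) for r in records
)
published = {group: n for group, n in counts.items() if n >= K}

# The single 70-79/rare-disorder record is suppressed; with K = 5 the
# diabetes cell stays hidden too until enough matching rows accumulate.
print(published)
```

Suppressing small cells is a crude technique, and generalization or differential privacy go further, but the principle is the same: publish patterns, not people.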

I believe that manufacturers can find a cross-section of users to form discussion groups about the devices they use, and that these users can come to grips with the issues presented here. But even an engaged, educated public is not a perfect solution. For instance, a privacy-risking choice that’s OK for 95% of users may turn out harmful to the other 5%. Still, education for everyone–a goal expressed by the FTC as well–will undoubtedly help us all make safer choices.

FTC Gingerly Takes On Privacy in Health Devices (Part 1 of 2)

Posted on February 10, 2015 | Written By Andy Oram


Are you confused about risks to privacy when everything from keystrokes to footsteps is being monitored? The Federal Trade Commission is confused too. In January they released a 55-page paper summarizing the results of discussions with privacy experts about the Internet of Things, plus some recommendations. After a big build-up citing all sorts of technological and business threats, the report kind of fizzles out. It rejected legislation specific to the IoT, but offered several suggestions for “general privacy legislation,” such as requiring security on devices.

Sensors and controls are certainly popping up everywhere, so the FTC investigation comes at an appropriate time. My senator, Ed Markey, who has been a leader in telecom and technology for decades in Congress, recently released a report focused on automobiles. But the same concerns show up everywhere in various configurations. In this article I’ll focus on health care, and on the dilemma of security in that area.

No doubt about it, pacemakers and other critical devices can be hacked. It could be a movie: in Scene 1, a nondescript individual moves through a crowded city street, thumbing at an ordinary notepad device. In Scene 2, later, numerous people fall to the ground as their pacemakers fail. They just had the bad luck to be in the vicinity of the individual with the notepad, who had infected their implants with malicious code that took effect later.

But here are the problems with requiring more security. First, security in computers almost always rests on encryption, which leads to an increase in the size of the data being protected. The best-known FTC case regarding device security, where they forced changes for cameras used in baby monitors, was appropriate for these external devices that could absorb the extra overhead. But increased data size leads to an increase in memory use, which in turn requires more storage and computing power on a small embedded device, as well as more transmission time over the network. In the end, devices may have to be heavier and more costly, serious barriers to adoption.
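To see where the overhead comes from, here is a minimal sketch using Python’s cryptography package: authenticated encryption with AES-GCM appends an authentication tag to every message, and a unique nonce has to travel with it. The 16-byte payload standing in for a vital-sign sample is my assumption for illustration.

```python
# Minimal sketch: measure the size overhead that authenticated
# encryption adds to a small sensor reading. Uses the "cryptography"
# package; the 16-byte payload stands in for a vital-sign sample.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)
aesgcm = AESGCM(key)

reading = os.urandom(16)   # hypothetical 16-byte sensor payload
nonce = os.urandom(12)     # must be unique per message
ciphertext = aesgcm.encrypt(nonce, reading, None)

# The ciphertext carries a 16-byte authentication tag, and the nonce
# must travel with it: 16 bytes of data become 44 on the wire.
print(len(reading), len(ciphertext) + len(nonce))  # 16 -> 44
```

A few dozen extra bytes are trivial for a baby monitor’s camera feed, but on an implant that sends thousands of tiny samples over a low-power radio, the overhead adds up in exactly the ways described above.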

Furthermore, software always has bugs. Some lie dormant for years, like the notorious Heartbleed bug in the very software that web sites around the world depend on for encrypted communications. To provide security fixes, a manufacturer has to make it easy for embedded devices to download updated software–and any bug in that procedure leaves a channel for attack.

Perhaps there is a middle ground, where devices could be designed to accept updates only from particular computers in particular geographic locations. A patient would then be notified through email or a text message to hike it down to the doctor, where the fix could be installed. And the movie scene where malicious code gets downloaded from the street would be less likely to happen.
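Here is a minimal sketch of what that might look like: the implant stores only a public key provisioned at the clinic, and rejects any update image whose signature fails to verify. The key handling and workflow are my assumptions, not any manufacturer’s actual scheme.

```python
# Minimal sketch: a device accepts a firmware image only if it was
# signed by a key it already trusts (e.g., provisioned at the clinic).
# Key handling and update transport here are illustrative assumptions.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Manufacturer/clinic side: sign the firmware image.
signing_key = Ed25519PrivateKey.generate()
firmware = b"...new firmware image..."
signature = signing_key.sign(firmware)

# Device side: only the public key is stored on the implant.
trusted_key = signing_key.public_key()
try:
    trusted_key.verify(signature, firmware)
    print("update accepted")
except InvalidSignature:
    print("update rejected")
```

Of course, as the Heartbleed example shows, the verification code itself then becomes part of the attack surface, which is why the procedure has to stay as small and well-tested as possible.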

In the next part of this article I’ll suggest how the FTC and device manufacturers can engage the public to make appropriate privacy and security decisions.

101 Tips to Make Your EMR and EHR More Useful – EHR Tips 36-40

Posted on September 27, 2011 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Time for the next entry covering Shawn Riley’s list of 101 Tips to Make your EMR and EHR More Useful. I hope you’re enjoying the series.

40. Do NOT let the finance department drive the EMR choice or deployment
I’m far too much of a physician advocate to even imagine a finance department driving the EMR choice and deployment plan. Ok, I understand that it happens, but it’s a travesty when it does. Considering the finance department will almost never use the system, it should make sense to everyone to have the users of the system help drive the EMR choice and deployment. After all, they will have to use the system once deployed.

Let’s not confuse what I’m saying. I’m not saying that finance shouldn’t be involved in the EMR choice. I’m not saying that finance can’t provide some great insights and an outside perspective. I also am not saying that users of the EMR should hold the hospital hostage with crazy demands that could never be met. It’s definitely a balance, but focus on the users of the EMR will lead to happy results.

39. Ensure work flow can be hard coded when necessary, and not hard coded when necessary
Related to this EHR tip is understanding where the EHR company has chosen to hard-code certain fields or workflows. You’d be surprised how many EHRs have hard-coded workflows with no way to change them. In some cases that’s fine, and even beneficial. In many other cases, though, those hard-coded workflows can cause you real pain.

Realize which parts of the EHR can be changed or modified and which ones you’re stuck with (at least until the next release… or the next release… or the next release…).

38. You can move to population based medicine
You’re brave to attempt population-based medicine on paper. Computers are great at crunching and displaying the data for this.
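As a trivial sketch of the kind of crunching involved (in Python, with invented field names and an invented 180-day recall rule), flagging diabetic patients overdue for an A1c test takes a few lines against structured data, versus a slog through thousands of paper charts:

```python
# Minimal sketch: flag diabetic patients overdue for an A1c test.
# Field names and the 180-day recall rule are illustrative assumptions.
from datetime import date, timedelta

today = date.today()
patients = [
    {"name": "Pat A", "diabetic": True,  "last_a1c": today - timedelta(days=45)},
    {"name": "Pat B", "diabetic": True,  "last_a1c": today - timedelta(days=400)},
    {"name": "Pat C", "diabetic": True,  "last_a1c": None},  # never tested
    {"name": "Pat D", "diabetic": False, "last_a1c": None},
]

cutoff = today - timedelta(days=180)
overdue = [
    p["name"]
    for p in patients
    if p["diabetic"] and (p["last_a1c"] is None or p["last_a1c"] < cutoff)
]
print(overdue)  # ['Pat B', 'Pat C'] -- the recall list
```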

37. Safety is created by design
Just because you use an EHR doesn’t mean you don’t need great procedures that ensure safety. Sure, EHRs have some things built in to help with safety, but more often than not it’s a mixture of EHR functionality and design that results in safety. Don’t throw out all your principles of safety when you implement your EHR.

36. Medication Reconciliation should be a simple process
I’m not sure we’ve hit the holy grail of medication reconciliation in an EHR yet, but we’re getting closer. It’s worth the time to make this happen and will likely be required in the future.
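At its core, reconciliation is a list comparison, which computers handle easily; the hard parts are data quality and matching. A minimal sketch (the medication lists and RxNorm-style codes are hypothetical) of diffing a home medication list against a discharge list:

```python
# Minimal sketch: compare two medication lists by code to find what
# was added, stopped, or continued. Lists and codes are hypothetical.
home_meds = {"197361": "lisinopril 10mg", "860975": "metformin 500mg"}
discharge_meds = {"197361": "lisinopril 10mg", "311036": "warfarin 5mg"}

stopped = set(home_meds) - set(discharge_meds)
added = set(discharge_meds) - set(home_meds)
continued = set(home_meds) & set(discharge_meds)

print("stopped:", [home_meds[c] for c in stopped])
print("added:", [discharge_meds[c] for c in added])
print("continued:", [home_meds[c] for c in continued])
```

Keying on codes rather than free-text names is what makes the comparison reliable; the same drug can be written a dozen ways, which is much of why reconciliation is still hard in practice.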

If you want to see my analysis of the other 101 EMR and EHR tips, I’ll be updating this page with my 101 EMR and EHR tips analysis. So, click on that link to see the other EMR tips.