Selecting the Right AI Partner in Healthcare Requires a Human Network

Artificial Intelligence, or AI for short, does not always equate to high intelligence, and that gap can carry a high cost for healthcare systems. Navigating the intersection of AI and healthcare requires more than clinical operations expertise; it also demands advanced knowledge of business motivations, partnerships, legal considerations, and ethics.

Learning to Dance at HIMSS17

This year I had the pleasure of attending a meetup for people interested in and working with AI for healthcare at the Healthcare Information and Management Systems Society (HIMSS) annual meeting in Orlando, Florida. At the beginning of the meetup, Wen Dombrowski, MD, asked everyone to stand up and participate in a partner-led movement activity. Not your average trust fall, the exercise was designed to teach about AI and machine learning while pushing most of us out of our comfort zones and sparking participants to draw their own AI-related lessons. One partner led, and the other partner followed their movements.

Dedicated computer scientists, business professionals, and proud data geeks tested their dancing skills. My partner quit when it was my turn to lead the movement. About half of the participants avoided eye contact and reluctantly shuffled their feet while they half-nursed their coffee. However awkward, half the participants felt the activity was a creative way to get us thinking about what it takes for machines to ‘learn’. Notably, Daniel Rothman of MyMee had some great dance moves.

I found both the varying feedback and the equally varying willingness to participate interesting. One of the participants said the activity was a “waste of time.” They must have come from the half of the room that didn’t follow the mirroring instructions. I wonder if I could gather data about which programming languages were the specialty of those most resistant. Were the Python coders bad at dancing? I hope not. My professional training is actually as a licensed foreign language teacher, so I immediately recognized the instructional-design value of starting with a movement activity.

There is evidence that physical activity before learning makes learners more receptive and helps them retain the experience longer. “Physical activity breaks throughout the day can improve both student behavior and learning (Trost 2007)” (Reilly, Buskist, and Gross, 2012). I had assumed the link between movement and learning capacity was common knowledge. Many of the instructional design comments Dr. Dombrowski received, while helpful, revealed participants’ lack of familiarity with teaching and cognitive learning theory.

I could have used some help at the outset in choosing a dance partner who would have matched and anticipated my every move. The same goes for healthcare organizations and their AI solutions. A healthcare organization may be a highly respected institution employing some of the most brilliant medical minds, but it also needs to become, or find, a skilled matchmaker to bring the right AI partner (or mix of partners) to the dance floor.

AI’s Slow Rise from Publicity to Potential

Artificial Intelligence has experienced a difficult and flashy transition into the medical field. For example, AI computing has been used to support consensus reads in radiology imaging. While these tools have helped reduce false positives for breast cancer patients, errors remain, and not every company entering AI has equal computing abilities. The battle cry suggesting physicians be replaced with robots seems to have slowed. While AI is gaining steam, the potential is still catching up with the publicity.

Even if an AI company has stellar computing ability, buyers should question whether it shares their desired outcomes. Is the company dedicated to protecting your patients and providing better outcomes, or simply to making as much profit as possible? Human FTE budgets have been replaced by AI computing costs, in some instances at the expense of patient and data security. When I asked CIOs and smaller companies about their experiences, many were reluctant to criticize a company they had a non-disclosure agreement with.

Learning From the IBM Watson and MD Anderson Breakup

During HIMSS week, the announcement that the MD Anderson and IBM Watson dance party was put on hold was called a setback for AI in medicine by Forbes columnist Matthew Herper. In addition, a scathing audit report detailing the procurement process, published by the University of Texas System Administration, reads more like a contest for the highest consulting fees. This suggests to me that perhaps one of the biggest threats to patient data security when it comes to AI is a corporation’s need to profit from the data.

Reports of the MD Anderson breakup also mention mismanagement, including a failure to integrate data from the hospital’s Epic migration. Epic is interoperable with Watson, but in this case integration of the new data was included in PricewaterhouseCoopers’ scope of work. If poor implementation stopped the project, should the technology partner be punished? Here is an excerpt from the IBM statement on the failed partnership:

“The recent report regarding this relationship, published by the University of Texas System Administration (‘Special Review of Procurement Procedures Related to the M.D. Anderson Cancer Center Oncology Expert Advisor Project’), assessed procurement practices. The report did not assess the value or functionality of the OEA system. As stated in the report’s executive summary, ‘results stated herein are based on documented procurement activities and recollections by staff, and should not be interpreted as an opinion on the scientific basis or functional capabilities of the system in its current state.’”

With non-disclosure agreements and ongoing lawsuits in place, it’s unclear whether this recent example will, or should, influence future decisions about AI healthcare partners. With multiple companies and interests represented, no one wants to be the fall guy when a project fails or suffers ethical breaches of trust. The consulting firm PricewaterhouseCoopers owned many of the portions of the project that failed, as well as many of the questionable procurement pieces.

I spoke with Christine Douglas, part of IBM Watson’s communications team, and her comments about the early adoption of AI were interesting. She said, “you have to train the system. There’s a very big difference between the Watson that’s available commercially today and what was available with MD Anderson in 2012.” Of course, that goes for any machine learning solution, large or small: the longer the models have to ‘learn’, the more accurate the outcome should be.
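Her point about training applies to machine learning in general. As a minimal sketch (not IBM’s or any vendor’s actual system; the model, synthetic dataset, and sample sizes below are purely illustrative assumptions), a simple classifier’s accuracy on held-out data typically improves as it is trained on more examples:

# Illustrative only: a simple classifier usually gets more accurate
# as it is trained on more examples (synthetic data, not patient data).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic binary-outcome dataset with 20 features.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

for n in (100, 500, 2000, len(X_train)):
    model = LogisticRegression(max_iter=1000)
    model.fit(X_train[:n], y_train[:n])  # "train the system" on n examples
    accuracy = accuracy_score(y_test, model.predict(X_test))
    print(f"trained on {n:>4} examples -> held-out accuracy {accuracy:.3f}")

The same dynamic is behind her distinction between the 2012 system and today’s commercial offering: more training time and data generally mean a more capable model.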

Large project successes and potential project failures have shown that not all AI is created equal, and not every business aspect of a partnership is dedicated to the publicly shared goals. I’ve seen similar proposals from big data computing companies inviting research centers to pay for AI computing while also allowing the computing partner to lease the patient data involved to other parties for things like clinical trials. How’s that for patient privacy! For the same cost, that research center could put an entire team of developers through graduate school at Stanford or MIT. By the way, I’m completely available for that team! I would love to study coding more than I do now.

Finding a Trusted Partner

So what can healthcare organizations and AI partners learn from this experience? They should ask themselves what their data is being used for. Look at the complaint in the MD Anderson report stating that procurement was questionable. While competitive bidding or outside consulting can help, in this case it appears to have crippled the project. The layers of business fees, and how they were paid, kept the project from moving forward.

Profiting from patient data is the part of AI no one seems willing to discuss. Maybe an AI system is being used to determine how high fees need to be to obtain board approval for hospital networks.

Healthcare organizations need to ask the tough questions before selecting any AI solution. Building a human network of trusted experts with no financial stake, speaking to competitors about AI proposals, and continuing to learn personally are all important for CMIOs, CIOs, and healthcare security professionals. Competitive analysis of industry partners, and even coding classes, have become a necessary part of the healthcare professional’s work. Trust is imperative and will have a direct impact on patient outcomes and healthcare organization costs. Meetups like the networking event at HIMSS allow professionals to expand their community and add more data points, gathered through real human interaction, to their evaluation of AI solutions for healthcare. Nardo Manaloto discussed the meetup and how the group could move forward on LinkedIn; you can join the conversation there.

Not everyone in artificial intelligence and healthcare is able to evaluate the relative intelligence and effectiveness of machine learning. If your organization is struggling, find someone who can help, but weigh the value of the consulting fees they’ll charge along the way.

Back to the dancing. Artificial does not equal high intelligence. Not everyone involved in our movement activity realized it was actually increasing our cognitive ability. Even those who quit, like my partner did, may have learned to dance just a little bit better.

 

Resources

California Department of Education. (2002). Physical fitness testing and SAT9. Retrieved May 20, 2003, from www.cde.ca.gov/statetests/pe/pe.html

Carter, A. (1998). Mapping the mind. Berkeley: University of California Press.

Czerner, T. B. (2001). What makes you tick: The brain in plain English. New York: John Wiley.

Dennison, P. E., & Dennison, G. E. (1998). Brain gym. Ventura, CA: Edu-Kinesthetics.

Dienstbier, R. (1989). Periodic adrenalin arousal boosts health, coping. New Sense Bulletin, 14, 9A.

Dwyer, T., Sallis, J. F., Blizzard, L., Lazarus, R., & Dean, K. (2001). Relation of academic performance to physical activity and fitness in children. Pediatric Exercise Science, 13, 225–237.

Gavin, J. (1992). The exercise habit. Champaign, IL: Human Kinetics.

Hannaford, C. (1995). Smart moves: Why learning is not all in your head. Arlington, VA: Great Ocean.

Howard, P. J. (2000). The owner’s manual for the brain. Austin, TX: Bard.

Jarvik, E. (1998, July 27). Young and sleepless. Deseret News, C1.

Jensen, E. (1998). Teaching with the brain in mind. Alexandria, VA: Association for Supervision and Curriculum Development.

Jensen, E. (2000a). Brain-based learning. San Diego: The Brain Store.

Reilly, E., Buskist, C., & Gross, M. K. (2012). Movement in the classroom: Boosting brain power, fighting obesity. Kappa Delta Pi Record, 48(2), 62–66. doi:10.1080/00228958.2012.680365

About the author

Janae Sharp

Healthcare as a Human Right. Physician Suicide Loss Survivor.
Janae writes about Artificial Intelligence, Virtual Reality, Data Analytics, Engagement and Investing in Healthcare. Founder of the Sharp Index.
Twitter: @coherencemed

   
