Friday, June 22, 2018
Some Interesting Questions Worth Thinking About Around AI In Healthcare.
This appeared a few days ago.
As health care becomes more complex, technological and data-driven, there’s a risk physicians will lose some of their autonomy
By Pamela E. Hepp on June 14, 2018
It has been said that the practice of medicine is both an art and a science. For example, different patients react differently to the same medication for the treatment of anxiety, and what works for one patient will not work for another. It can be a trial-and-error process to find the correct medication and optimum dosage. Why do side effects to medications or complications to treatments arise in some patients and not others? Every patient is unique, and the art of medicine is the component of the practice that addresses such uniqueness with compassion and care. Will artificial intelligence (AI) change all of that?
Of course, medicine is also a science, and health care is no stranger to technology. Health care is and has been for many years highly dependent upon technology, from x-ray machines and laboratory equipment; to more advanced medical equipment such as MRIs, ultrasounds and CT scanners; to systems that dispense medications within the hospital; to electronic medical records.
A hospital’s electronic health record may connect with the electronic health records of other hospitals and physicians to create a “health information exchange” (HIE), so that providers have access to all of a patient’s medical information regardless of where he or she may have been treated. The goal of providing access to a patient’s comprehensive medical information is to decrease medical errors and to eliminate the duplicate tests or procedures that can occur when a specialist lacks access to another provider’s records.
Access to all of that information, however, is creating an information overload for many physicians—yet under the current reimbursement system, where providers are paid a set fee per patient encounter or procedure, providers are being pressured to see more patients in a given day, leaving them less time to spend on any given case. Similarly, hospitals are paid a set amount for the entire hospital stay for a specific patient diagnosis. Accordingly, hospitals are incentivized to discharge patients much more quickly than they once did.
Lots more here:
This is a fascinating post, written from a legal perspective, that raises all the issues of liability, responsibility, regulation and complexity associated with the use of AI in support of care delivery.
We have already had a taste of the problem with ‘alert fatigue’ from clinical decision support systems (CDS), but as the complexity and capabilities of AI improve, it is clear things are going to become that much harder and trickier.
How responsibility and blame will be apportioned when things go wrong using systems that the users don’t fully understand is an interesting and open question.
Well worth a browse and a think about.
Posted by Dr David G More MB PhD at Friday, June 22, 2018