Quote Of The Year

Timeless Quotes - Sadly The Late Paul Shetler - "It's not Your Health Record, it's a Government Record Of Your Health Information"

or

H. L. Mencken - "For every complex problem there is an answer that is clear, simple, and wrong."

Friday, September 09, 2022

It Is Important For Clinicians To Understand Just What Liability They Face When Relying On Clinical Decision Support (CDS) To Determine Treatment!

This appeared a little while ago.

Who is at fault when medical software gets it wrong?

Clinical decision support software is beneficial but if it malfunctions, a doctor’s duty of care likely makes them liable

By Dr Megan Prictor, University of Melbourne

August 4, 2022

Doctors are being increasingly encouraged to rely on digital technology to guide care, but who carries the blame if doctors rely on software that makes mistakes, leading to patient harm?

Imagine this. A patient has recovered enough from a heart attack to be discharged from hospital. The presiding doctor sorts out the discharge using a hospital computer that has clinical decision support software, which compares the patient’s data with inbuilt algorithms to make recommendations for their care.

Clinical decision support tools are increasingly used throughout our healthcare system to promote high-quality care aligning with evidence and guidelines.

In this case, the software generates a pop-up alert recommending that the doctor prescribe a specific medication on the basis that the patient isn’t already taking it. The doctor prescribes the medication, and the patient goes home. A few days later, they die. An investigation finds that the patient had twice the recommended amount of the medication in their system.

It turns out the patient was already taking a dose of this same medication in a tablet that was combined with another drug. Because of the new prescription, the patient had actually been taking a double dose of the medication, which proved to be fatal.

Information about the other medication the patient was already taking was in their medical record, but the clinical decision support tool was flawed – it didn’t recognise the existing medication the patient was on as being in the same category as the newly-prescribed medication.
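To make that failure mode concrete, here is a minimal, purely hypothetical sketch in Python (the drug names, the class mapping and the function are all invented for illustration and bear no relation to the actual product in the study) of how a drug-category check can miss a combination tablet whose ingredient mapping is incomplete:

# Hypothetical sketch only: the products, classes and record below are invented.

# Maps each product to the therapeutic classes of its active ingredients.
# The flaw: the combination tablet is missing its "beta-blocker" ingredient.
THERAPEUTIC_CLASSES = {
    "plainblocker": {"beta-blocker"},
    "combotab": {"ace-inhibitor"},  # should also contain "beta-blocker"
}

def already_on_class(current_meds, proposed_med):
    """Return True if any current medication shares a therapeutic class
    with the proposed drug - the check that should suppress the alert."""
    proposed = THERAPEUTIC_CLASSES.get(proposed_med, set())
    return any(THERAPEUTIC_CLASSES.get(med, set()) & proposed
               for med in current_meds)

# The record does list the combination tablet, but the incomplete mapping
# means the check returns False, so the tool recommends the new prescription.
patient_meds = ["combotab"]
print(already_on_class(patient_meds, "plainblocker"))  # False - duplicate missed

The point of the sketch is that the gap sits in the tool's internal drug-class mapping, not in the patient's record, which is why the clinician's own knowledge of the combination rule still matters.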

The doctor was well aware of the rule against combining both medications but had relied on the computer alert. Who is responsible under the law for the patient’s death?

A scenario like this isn’t far-fetched; in fact, it’s based on one story in a recent study of flawed clinical decision support software that led to patient harm.

There is a lot of research showing that clinical decision support software is generally beneficial. For instance, it reduces medication prescribing errors and enhances the chance that doctors will follow guidelines for delivering high-quality healthcare. Yet there is also increasing awareness that malfunctions in clinical decision support software are more common than we think.

The person responsible for the mistake should bear responsibility for the harm. But who, in a situation like this, was really responsible? Was it the software company that created the flawed product and didn’t test it properly? Or was it the doctor, who should have realised the alert was wrong and overridden it?

As a legal academic, I have been working with a University of Melbourne team developing a new clinical decision support tool. I was interested in where a patient would find a legal remedy if they were harmed in this type of situation, and who they could hold accountable.

The doctor could be harmed in some ways too; for instance, they could face disciplinary action and develop mental health problems. Their job may be at risk.

My newly published research into Australian law has found that most of the legal risk is faced by the doctor and not the software developer. This is because doctors have a fundamental duty of care to their patients, which they can’t delegate to a computer when the computer is only providing recommendations and not independently carrying out decisions.

Clinical decision support software is designed to have a human in the decision-making chain; it’s intended that a doctor will use their own judgment about whether to follow each software alert. As a result, it’s quite likely that the doctor in the story would be found to have acted negligently, breaching their duty of care.

The doctor might also be liable under Australian Consumer Law for not providing services with ‘due care and skill’ (section 60).

More here:

https://pursuit.unimelb.edu.au/articles/who-is-at-fault-when-medical-software-gets-it-wrong

To me the real takeaway is the legal researcher's finding that, when a piece of software's advice is followed, under Australian law "most of the legal risk is faced by the doctor and not the software developer". I find this a pretty sobering finding and one that should be better known by clinicians.

The research makes it clear the clinician needs to carefully assess any advice offered by a CDS and check the basis on which that advice is given!

Sobering stuff I reckon, but not out of line with the view that ultimate responsibility for the recommended treatment rests with the clinician!

David.


1 comment:

Sarah Conner said...

Thanks David, great article and a good reminder of the issues the Department of Health sweeps under the carpet but should be tackling, rather than adding to the minefield of medico-legal liabilities with things like the MyHR