Wednesday, September 01, 2021

This Is An Idea Whose Time Should Have Well And Truly Come But Somehow Never Has.

This appeared a few days ago.

Safety net for doctors gets the tick from investors

By Cara Waters

August 24, 2021 — 12.00am

When Doctor Thomas Kelly, then an intern at the Royal Melbourne Hospital, was sent on a placement to Horsham in country Victoria, it would often be just him and a second-year doctor running the entire emergency department overnight.

“We saw hundreds of different patients who had lots of different either near misses or entirely missed diagnoses and other issues that could have been easily avoided if we had better tools,” he said. “One of the main things that I wished for at that time was just something that could have listened to the conversation that I was having with the patient and suggested a couple of questions that I might have missed. Just like a safety net really.”

Dr Kelly set about creating Oscer, an online platform that enables medical students to practise their clinical reasoning skills with virtual patients powered by artificial intelligence.

The newly launched startup has raised $5 million in funding led by Blackbird Ventures with participation from January Capital, Inventures, Archangel Ventures and angel investors Brendan Hill and Jeff Bargmann.

Misdiagnosis is an ongoing problem for the medical system with research published in the Medical Journal of Australia estimating diagnostic errors occur in up to one in seven clinical encounters, with more than 80 per cent of diagnostic errors preventable.

Dr Kelly and his co-founders Waleed Mussa and Yu Liu plan to use Oscer’s launch product, which is being piloted by The University of Melbourne, The University of Sydney and a number of universities in the United States, to collect data and use it to create a diagnostic support tool for doctors.

“You need a huge database of questions and answers and all the different symptoms and conditions that people can have, and we thought that an interesting way to build that database would be to create a product for students,” Dr Kelly said.

Oscer’s clinical tool would enable clinicians to get real-time second opinions through transcription, automated coding and analysis of consults, and Dr Kelly said it could be particularly useful in settings where there is a lack of resources, such as rural hospitals.

……

“This is not incremental, if successful it is a step function change in broad-based standards of clinical care,” he said. “We all make mistakes in our jobs even doctors, who are the best and brightest and most talented humans. Oscer is about offering them the peace of mind to help us.”

More here:

https://www.smh.com.au/business/entrepreneurship/safety-net-for-doctors-gets-the-tick-from-investors-20210823-p58l7d.html

The idea of a diagnostic support computer system is as old as the hills, and systems that could take symptoms and signs and do a pretty good job of suggesting possible diagnoses have been around since the 1970s or even earlier.

There is a review of some of the various sorts of computer assisted diagnosis here.

https://en.wikipedia.org/wiki/Computer-aided_diagnosis

You can read about a current manifestation of the type of software here.

https://www.isabelhealthcare.com/

The catch is that very few seem to actually move into routine successful use, despite being technically pretty reliable and accurate. People seem to find it hard to make time for regular use of pretty much any app!

So to me the key to Oscer’s success is not only to have a genuinely useful product but to identify and remove the barriers to routine, regular use. Do that and they will have a winner!

For mine, the way use can be enhanced is to have the system operating in the background, watching what is going on, and making a gentle suggestion or two when the consult drifts away from the optimum. That is harder than what is proposed here, but certainly possible in 2021.
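To make the background "safety net" idea concrete, here is a minimal sketch in Python. Everything in it is invented for illustration — the checklist, the condition name and the matching logic are hypothetical, and a real system would need a large, clinically validated question database and far more sophisticated language understanding of the consult transcript:

```python
# Hypothetical sketch of a background "safety net": compare the topics
# actually covered during a consult against a per-condition checklist
# and gently surface any that appear to have been missed.

CHECKLISTS = {
    # Invented example checklist for illustration only.
    "chest pain": [
        "radiation to arm or jaw",
        "shortness of breath",
        "onset and duration",
        "family history of heart disease",
    ],
}

def missed_questions(suspected_condition, topics_covered):
    """Return checklist items not yet covered in the consult."""
    checklist = CHECKLISTS.get(suspected_condition, [])
    covered = {t.lower() for t in topics_covered}
    return [q for q in checklist if q.lower() not in covered]

# Example: two topics covered so far; the tool suggests the rest.
suggestions = missed_questions(
    "chest pain",
    ["onset and duration", "shortness of breath"],
)
for q in suggestions:
    # Phrase the output as a prompt, not a directive.
    print(f"Consider asking about: {q}")
```

The design point is the one made above: the tool stays silent while the consult tracks the checklist, and only offers a gentle prompt when something is missed — a nudge, never a replacement for clinical judgement.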

BTW it is important to realise that pretty much no support system can protect against a lack of care in assessment, or against not paying careful attention to what the patient is saying!

What do others think?

David.

1 comment:

  1. And then there's that behemoth EPIC with its AI/ML. Just be careful with vendors and their lack of transparency. I wouldn't trust them any more than I'd trust the government

    Amid a Pandemic, a Health Care Algorithm Shows Promise and Peril
    https://undark.org/2021/05/27/health-care-algorithm-promise-peril/

    A machine learning-based score designed to aid triage decisions is gaining in popularity — but lacking in transparency.

    In the midst of the uncertainty, Epic, a private electronic health record giant and a key purveyor of American health data, accelerated the deployment of a clinical prediction tool called the Deterioration Index. Built with a type of artificial intelligence called machine learning and in use at some hospitals prior to the pandemic, the index is designed to help physicians decide when to move a patient into or out of intensive care, and is influenced by factors like breathing rate and blood potassium level. Epic had been tinkering with the index for years but expanded its use during the pandemic. At hundreds of hospitals, including those in which we both work, a Deterioration Index score is prominently displayed on the chart of every patient admitted to the hospital.

    The Deterioration Index is poised to upend a key cultural practice in medicine: triage. Loosely speaking, triage is an act of determining how sick a patient is at any given moment to prioritize treatment and limited resources. In the past, physicians have performed this task by rapidly interpreting a patient’s vital signs, physical exam findings, test results, and other data points, using heuristics learned through years of on-the-job medical training.

    Ostensibly, the core assumption of the Deterioration Index is that traditional triage can be augmented, or perhaps replaced entirely, by machine learning and big data. Indeed, a study of 392 Covid-19 patients admitted to Michigan Medicine found that the index was moderately successful at discriminating between low-risk patients and those who were at high-risk of being transferred to an ICU, getting placed on a ventilator, or dying while admitted to the hospital. But last year’s hurried rollout of the Deterioration Index also sets a worrisome precedent, and it illustrates the potential for such decision-support tools to propagate biases in medicine and change the ways in which doctors think about their patients.

    The use of algorithms to support clinical decision making isn’t new. But historically, these tools have been put into use only after a rigorous peer review of the raw data and statistical analyses used to develop them. Epic’s Deterioration Index, on the other hand, remains proprietary despite its widespread deployment. Although physicians are provided with a list of the variables used to calculate the index and a rough estimate of each variable’s impact on the score, we aren’t allowed under the hood to evaluate the raw data and calculations.

    Furthermore, the Deterioration Index was not independently validated or peer-reviewed before the tool was rapidly deployed to America’s largest health care systems. Even now, there have been, to our knowledge, only two peer-reviewed published studies of the index. The deployment of a largely untested proprietary algorithm into clinical practice — with minimal understanding of the potential unintended consequences for patients or clinicians — raises a host of issues.

    etc etc
