Quote Of The Year

Timeless Quotes - Sadly The Late Paul Shetler - "It's not Your Health Record, it's a Government Record Of Your Health Information"

or

H. L. Mencken - "For every complex problem there is an answer that is clear, simple, and wrong."

Sunday, August 20, 2023

A Major Deadline Is Fast Approaching To Have E-Prescribing Fully Set Up!

This appeared last week:

Deadline looms for sign-up to centralised prescribing

The Department of Health and Aged Care is urging practices that have not registered for the national Prescription Delivery Service to do so.

Jolyon Attwooll

17 Aug 2023

More than 150 million e-prescriptions have been issued since May 2020.

Reimbursement for an e-prescription SMS will stop from 30 September unless prescribers have signed up to the new national Prescription Delivery Service (PDS), the Department of Health and Aged Care (DoH) has warned.
 
It said practice owners and managers should check whether they have registered and if their software providers advise any necessary updates.
 
The centralised PDS, run by eRx Script Exchange, is designed to streamline prescription delivery and dispensing, according to the DoH.
 
A $99.6 million deal was signed this May, with eRx Script Exchange contracted to provide the PDS from 1 July this year until 30 June 2027.
 
According to Services Australia, clinicians and pharmacies need to connect to the PDS by the end of next month to continue prescribing or dispensing eligible medications.
 
Reimbursements for e-prescription tokens are currently paid by the Australian Digital Health Agency (ADHA), an arrangement that had been extended several times while DoH officials considered a permanent solution.  
 
The Government-funded prescription exchange will continue to cover SMS fees but only for practices that have signed up to the new PDS.
 
The DoH says the move to the new system will simplify the prescribing process, as well as give ‘long-term funding certainty to enable innovation and efficiency … clearer governance … and enhanced capacity for patient-centred support and care’.
 
The DoH says other policy reforms, including the mandated use of e-prescribing for high-risk and high-cost medicines, ‘are on the horizon’.
 
The move to a model directly contracted by the Federal Government was announced as part of the 7th Community Pharmacy Agreement, with the tender going out in June last year.
 
The majority of practices are believed to have signed up to eRx Script Exchange already, and do not need to take further action.
 
Set up in April 2009, the prescription exchange service is a subsidiary of the Fred IT Group, which is part owned by the Pharmacy Guild.
 
The other software vendor that runs a prescription exchange service, MediSecure, will continue providing private prescriptions, which will remain free to send by SMS or email after the transition period according to a statement on the company’s website.
 
The ADHA states there have now been more than 150 million e-prescriptions issued since May 2020, with the pandemic proving a significant catalyst.

More here:

https://www1.racgp.org.au/newsgp/professional/deadline-looms-for-sign-up-to-centralised-prescrib

It seems we have just set up a small partial national monopoly here with the Pharmacy Guild at least somewhat involved. At least it is not the total monopoly I am sure they would have liked!

It is certainly a good thing that all this will now be stable and can be bedded in to provide what I am sure will become an essential service.

I guess we will just have to wait and see how well the arrangements work in the longer term, but it is good to have a national e-prescribing service in place!

David.

 

AusHealthIT Poll Number 710 – Results – 20 August, 2023.

 Here are the results of the poll.

Overall, Has The Australasian Institute Of Digital Health Made A Positive Difference To The Australian Health System Since Its Founding In February 2020?

Yes                                                                     10 (24%)

No                                                                      32 (76%)

I Have No Idea                                                     0 (0%)

Total No. Of Votes: 42

A clear outcome suggesting that a substantial majority of readers feel the AIDH has not made a positive difference!

Any insights on the poll are welcome, as a comment, as usual!

A good number of votes. But also a very clear outcome. 

0 of 42 who answered the poll admitted to not being sure about the answer to the question!

Again, many, many thanks to all those who voted! 

David.

 

Friday, August 18, 2023

This Is A Useful Introduction To AI In Healthcare.

This appeared a few months ago, but is well worth the read:

We need to chat about artificial intelligence

Enrico W Coiera, Karin Verspoor and David P Hansen

Med J Aust || doi: 10.5694/mja2.51992
Published online: 12 June 2023

With the arrival of large language models such as ChatGPT, AI is reshaping how we work and interact

Long foretold and often dismissed, artificial intelligence (AI) is now reshaping how we work and interact as a society.1 For every claim that AI is overhyped and underperforming, only weeks or months seem to pass before a new breakthrough asks us to re-evaluate what is possible. Most recently, it is the very public arrival of large language models (LLMs) such as the generative pre-trained transformers (GPTs) in ChatGPT. In this perspective article, we explore the implications of this technology for health care and ask how ready the Australian health care system is to respond to the opportunities and risks that AI brings.

GPTs are a recent class of machine learning technology. Guided by humans who provide it with sample responses and feedback, ChatGPT was initially trained on 570 gigabytes of text, or about 385 million pages of Microsoft Word, and, at first release, the language model had 175 billion parameters.2 This massive model of the relationships between words is generative in that it produces new text, guided by the model, in response to prompts. It can answer questions, write songs, poems, essays, and software code. Other generative AIs such as DALL‐E, which is trained on images, can create startlingly good pictures, including fictitious or “deep fake” images of real people.3

Today's LLMs are story tellers, not truth tellers. They model how language is used to talk about the world, but at present they do not have models of the world itself. The sheer size of ChatGPT means that it can perform tasks it was not explicitly trained to do, such as translate between languages. ChatGPT amassed 100 million users in the first two months that it was available.4 So compelling are the linguistic skills of LLMs that some have come to believe such AI is sentient,5 despite the prevailing view that as statistical pattern generators, they cannot have consciousness or agency. Australian singer Nick Cave called ChatGPT “a grotesque mockery of what it is to be human” after seeing it generate new songs in his style.6

The health care uses of generative models will soon become clearer.7 Epic has agreed with Microsoft to incorporate its GPT‐4 model into their electronic health records, which have been used for over 305 million patients worldwide.8 LLMs are likely to find application in digital scribes, assisting clinicians to create health records by listening to conversations and creating summaries of the clinical content.9,10 They can create conversational agents, which change the way we search medical records and the internet, synthesising answers to our questions rather than retrieving a list of documents.11

We should prepare for a deluge of articles evaluating LLMs on tasks once reserved for humans, either being surprised by how well the technology performs or showcasing obvious limits because of the lack of a deep model of the world.12 Especially when it comes to clinical applications, producing text or images that are convincing is not the same as producing material that is correct, safe, and grounded in scientific evidence. For example, conversational agents can produce incorrect or inappropriate information that could delay patients seeking care, trigger self‐harm, or recommend inappropriate management.13 Generative AI may answer patients’ questions even if not specifically designed to do so. Yet all such concerns about technology limitations are hostage to progress. It would be foolish indeed to see today's performance of AI as anything other than a marker on the way to ever more powerful AI.

The unintended consequences of AI

It is the unintended consequences of these technologies that we are truly unprepared for. It was hard to imagine in the early innocent days of social media, which brought us the Arab Spring,14 just how quickly it would be weaponised. Algorithmic manipulation has turned social media into a tool for propagating false information, enough to swing the results of elections, create a global antivaccination movement, and fashion echo chambers that increasingly polarise society and mute real discourse.

Within two months of the release of ChatGPT, scientific journals were forced to issue policies on “non‐human authors” and whether AI can be used to help write articles.15 Universities and schools have banned its use in classrooms and educators scramble for new ways to assess students, including returning to pen and paper in exams.16 ChatGPT is apparently performing surprisingly well on questions found in medical exams.17

The major unintended consequences of generative models are still to be revealed.18 LLMs can produce compelling misinformation and will no doubt be used by malicious actors to further their aims. Public health strategies already must deal with online misinformation; for example, countering antivaccination messaging. Maliciously created surges of online messages during floods, heat events, and pandemics could trigger panic, swamp health services, and encourage behaviours that disrupt the mechanics of society.19

The national imperative to respond to the challenges of AI

With AI's many opportunities and risks, one would think the national gaze would be firmly fixed on it. However, Australia lags most developed nations in its engagement with AI in health care and has done so for many years.20 The policy space is embryonic, with focus mostly on limited safety regulation of AI embedded in clinical devices and avoidance of general purpose technologies such as ChatGPT.

Much more here:

https://www.mja.com.au/journal/2023/219/3/we-need-chat-about-artificial-intelligence

This is a good start on the way to understanding the AI in healthcare field, and it is well worth following up the various leads in the article. We are much closer to the beginning of all this than the end!

David.

 

Wednesday, August 16, 2023

I Am Not Sure The AMA Really Has Its Head Around AI And Its Possible Impact

This appeared last week:

https://www.miragenews.com/robust-rules-enable-ai-to-enhance-australian-1064459/

Robust Rules Enable AI to Enhance Australian Healthcare

Media release

Medical care delivered by human beings should never be replaced with Artificial Intelligence (AI), but AI technology can potentially achieve improved healthcare, the AMA said today.

The AMA’s first Position Statement on the use of AI in healthcare outlines a set of ethical and regulatory principles based on safety and equity which should be applied to the application of AI technologies in healthcare.

The position statement covers the development and implementation of AI in healthcare and supports regulation which protects patients, consumers, healthcare professionals and their data.

AMA President Professor Steve Robson said with appropriate policies and protocols in place, AI can assist in the delivery of improved healthcare, advancing our healthcare system, and the health of all Australians.

“The AMA sees great potential for AI to assist in diagnosis, for example, or recommending treatments and at transitions of care, but a medical practitioner must always be ultimately responsible for decisions and communication with their patients.

“There’s no doubt we are on the cusp of big changes AI can bring to the sector and this will require robust governance and regulation which is appropriate to the healthcare setting and engenders trust in the system.

“We’d like to see a national governance structure established to advise on policy development around AI in healthcare.

“Such a structure must include all health-sector stakeholders like medical practitioners, patients, AI developers, health informaticians, healthcare administrators and medical defence organisations.

“This will underpin how we carefully introduce AI technology into healthcare. AI tools used in healthcare must be co-designed, developed and tested with patients and medical practitioners and this should be embedded as a standard approach to AI in healthcare.

“Decisions about healthcare are the bedrock of the doctor-patient relationship and these will never be replaced by AI. People worry when they hear that machine learning is perfecting decision-making, but this is not the role AI should play in healthcare. Diagnoses, treatments and plans will still be made by medical practitioners with the patient – AI will assist and supplement this work.  

“We need to get ahead of any unforeseen consequences for patient safety, quality of care and privacy across the profession. This will require future changes to how we teach, train, supervise, research and manage our workforce.

“One of the key concerns for any healthcare organisation using AI must be the privacy of patients and practitioners and their data. The AMA’s position is very clear about protecting the privacy and confidentiality of patient health information. This is where regulation and oversight is really important; the healthcare sector must establish robust and effective frameworks to manage risks, ensure patient safety and guarantee the privacy of all involved.

“The AMA’s position statement shows doctors are engaging with this rapidly evolving field and laying down some guiding principles. If we can get the settings right, so that AI serves the healthcare needs of patients and the wider community, we think it can enable healthcare that is safe, high quality and patient centred.”

Read the AMA position statement


This release has a bit of a feel of “well we are not sure about all this but we need to say something!” about it. There is a lot of ‘motherhood’ here I reckon!

If ever there was an area or domain where the technology is far ahead of the administrators, this has to be it!

It is good we have a specialist committee of experts in this domain to help the AMA when they get out of their depth! See https://aihealthalliance.org/

David.

 

Sunday, August 13, 2023

There Might Be A Lesson Here For Aspiring Digital GP Practices In Australia.

 This appeared a little while ago:

Babylon looks to sell UK business amid bankruptcy fears

Babylon Health is looking to sell its UK business, including its 100,000-patient NHS GP practice, and may fall into administration, the company has announced.

Jordan Sollof


Following the story back in May that shares in Babylon fell sharply on news that the company was being taken private as part of a new debt plan, the firm announced this week that a $34.5m attempt to restructure and return to private ownership fell through.

Babylon GP at Hand, an online-first GP practice with over 100,000 registered NHS patients around London, is the firm’s main remaining NHS service.

The company revealed that it is now “exploring strategic alternatives in order to find the best outcome for its UK business”, which includes the possibility of selling off the UK business. According to HSJ, senior figures in Babylon are confident that its UK business, including the entities that deliver NHS services, will not close.

Last month, the company was delisted from the New York Stock Exchange and has since announced that London-based investment firm AlbaCore Capital are taking over its assets without shareholders’ approval, and that it was calling administrators in the UK.

Babylon said in a statement that it “cannot provide assurance that it will be able to secure sufficient liquidity to fund the operations of the Group’s business”.

“To the extent that Babylon is unable to secure additional funding and complete a Third Party Sale of a particular business, the applicable entities of the Group will file for bankruptcy protection or implement other alternatives for an orderly wind down and liquidation or dissolution,” the firm added.

Babylon Health is a UK AI firm that was promoted as the future of the NHS by the then health secretary Matt Hancock. Founded in 2013 by British-Iranian former banker Ali Parsa, the company claimed its AI could revolutionise healthcare through virtual appointments and diagnostic chatbots such as its GP at Hand.

Despite Babylon winning a number of NHS contracts, thanks in part to Hancock’s promotion of GP at Hand in 2018, experts continuously warned the technology was unproven and overhyped.

The company now faces collapse after losing almost the entirety of its $4.2 billion valuation.

Here is the link:

https://www.digitalhealth.net/2023/08/babylon-looks-to-sell-uk-business-amid-bankruptcy-fears/

It seems to me that radical change is really hard, and Babylon, pursuing a pretty radical agenda, has demonstrated just how hard that level of change can be!

A salutary tale!

David.

 

AusHealthIT Poll Number 709 – Results – 13 August, 2023.

 Here are the results of the poll.

Have We Reached The Stage Where It Is Impossible To Tell Between Human-Generated Article Content And AI-Generated Articles?

Yes                                                                     10 (28%)

No                                                                      25 (69%)

I Have No Idea                                                     1 (3%)

Total No. Of Votes: 36

A clear outcome suggesting that a majority of readers feel we are not there quite yet!

Any insights on the poll are welcome, as a comment, as usual!

A good number of votes. But also a very clear outcome. 

1 of 36 who answered the poll admitted to not being sure about the answer to the question!

Again, many, many thanks to all those who voted! 

David.

Sunday, August 06, 2023

I Think This Would Be A Good Time To Rethink The Place And Role Of The AIDH.

The departure a week or so ago of Dr Louise Schaper, the previous substantive CEO, has provided an opportunity for some fundamentals of the Australasian Institute of Digital Health (AIDH) to be re-examined and questioned.

Among these are:

Is change needed to the structures and functions of the AIDH?

Should the Academic functions and the Conference Organising functions be separated?

Does it make any sense to have two different sets of credentials (CHIA and FAIDH etc)?

How could a purely academic Digital Health group be sustained and supported, if desired?

What should be the functions of the different parts of the AIDH – or should they be totally separate?

Is the AIDH an appropriate organization to be offering credentials like CHIA, FAIDH, etc. and, if so, how should they be managed and supported?

How should AI be managed within a Digital Health initiative?

I am sure there are zillions of other questions that should also be considered, as should a mechanism to reshape the AIDH as members desire!

I hope the AIDH Board can take these issues up and really design a worthwhile way forward for all stakeholders!

What change do you think is needed?

David.

 

AusHealthIT Poll Number 708 – Results – 6 August, 2023.

 Here are the results of the poll.

Are You Satisfied That Patient Safety Can Be Properly Protected In AI Assisted Medical Diagnosis And Consultation?

Yes                                                                     0 (0%)

No                                                                    37 (100%)

I Have No Idea                                                  0 (0%)

Total No. Of Votes: 37

A very clear outcome suggesting that the vast majority of readers felt we need continued human involvement in diagnosis and treatment!

Any insights on the poll are welcome, as a comment, as usual!

A good number of votes. But also a very clear outcome. 

0 of 37 who answered the poll admitted to not being sure about the answer to the question!

Again, many, many thanks to all those who voted! 

David.