This sponsored commentary popped up last week:
6 June 2022
You seek the Holy Grail?
The new federal government is going to have its hands full attending to the health of the nation and ensuring, as Prime Minister Anthony Albanese pledged, that no person is left behind.
Aged care, the NDIS and public hospitals all require urgent attention, but let’s add another item to the list: the medical software industry is crying out for federal leadership and reform.
The inaugural Australasian CXO Cloud Healthcare Summit heard last week that while Australia talks the talk when it comes to data interoperability, when it comes to walking the walk, we have a long way to go.
Let’s take primary care as a case study. An estimated 6000 GP clinics and 15,000 specialist clinics are likely to operate more than 21,000 computers. Typically, these computers require secure, air-conditioned rooms, off-site back-up systems, on-call IT support teams and software installations including, but not limited to:
- An IHI adaptor to support the My Health Record and e-prescribing
- eRx medication dispensing software
- Secure messaging legacy systems (7+ vendors)
- Booking engines such as HotDoc or HealthEngine
- Data extractors such as Polar or Pen
- Research tools – NPS or other
That’s an estimated 252,000 software implementations to manage, in addition to ubiquitous operating system and antivirus updates.
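As a rough back-of-envelope, here is how the article's total can be reproduced; the per-machine install count is an assumption chosen to match the stated figure, not something the article spells out.

```python
# Back-of-envelope reproduction of the article's 252,000 figure.
# installs_per_machine = 12 is an assumption (the six packages listed
# above plus OS, antivirus and other supporting software); the article
# does not state the multiplier explicitly.
gp_clinics = 6_000
specialist_clinics = 15_000
machines = gp_clinics + specialist_clinics   # "more than 21,000 computers"
installs_per_machine = 12
print(machines * installs_per_machine)       # 252000
```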
These legacy systems are expensive and prone to integration failures. They present multiple barriers to data sharing with researchers, innovators, hospitals – and patients.
Industry, via its associations and incumbent providers, talks about an ecosystem that is interoperable and working well. Indeed, that was the theme of a panel discussion held at the Cloud Healthcare Summit. But maintaining the status quo comes at a cost.
Get your head in the clouds
There’s widespread acknowledgment that cloud-based solutions offer significant advantages in flexibility, scalability, accessibility, environmental sustainability, security, speed to implementation, and cost efficiency, when compared to legacy systems.
However, the reality for cloud vendors is that the health care system is far from user-friendly and genuine interoperability remains a Holy Grail.
Of the plethora of products and platforms cloud vendors need to connect with, only a few do not require the installation of third-party software to facilitate a connection. One of those is Medicare Web Services, which recently moved to a cloud-friendly technology. That leaves most of the nation’s “digital health infrastructure” incompatible with cloud technologies.
Installation and configuration of these third-party systems may mean that an additional vendor requires access to secure servers, increasing points of entry and therefore avenues for cyber-attack. An array of bolt-on application executables, adaptors, patches and other additions that add cost, time and potentially further infrastructure is also inevitable.
Imposing third-party workarounds, because switching to interoperable industry data standards remains a bridge too far for too many, compromises the capacity of cloud solutions to deliver efficiency, scalability, and improved affordability.
True interoperability cuts out the cumbersome middle layer. APIs (application programming interfaces) define how data is exchanged and the end users manage availability and security. Multiple parties (or tenants) can access and use the same software and data; simplicity and standards-driven consistency of design enable scalability and efficiency. Any software updates or security measures are applied centrally, for all tenants.
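To make the contrast concrete, here is a minimal sketch of what standards-based, API-driven exchange can look like, assuming a hypothetical FHIR endpoint; the base URL and token are placeholders, not a real service.

```python
# Minimal sketch of a direct, standards-based API call to a cloud service.
# No third-party adaptor is installed on site; the endpoint and token
# below are hypothetical placeholders.
import requests

BASE = "https://fhir.example-clinic.com.au"   # hypothetical FHIR server
headers = {
    "Authorization": "Bearer <token>",        # access managed by the end user
    "Accept": "application/fhir+json",        # standards-driven content type
}

# Any authorised tenant makes the same call against the same central
# software; updates and security fixes are applied once, for everyone.
resp = requests.get(f"{BASE}/Patient/example-id", headers=headers)
resp.raise_for_status()
patient = resp.json()
print(patient.get("name"))
```

Contrast that with the adaptor model, where the same exchange first requires vendor software to be installed and maintained on a server inside the clinic.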
In the United States, legislating to enforce patients’ rights to access their health data in a secure and timely manner, wherever they are in the healthcare system, had the effect of accelerating interoperability between software systems.
The 21st Century Cures Act was a bipartisan bill, signed by President Barack Obama in 2016. It allowed time for health care bodies to reshape their systems for interoperability, with its key provisions only taking full effect in April 2021.
Opportunity beckons for our new federal government. If Australia is ever going to achieve a truly interoperable health system, someone is going to have to cattledog a cowed software industry down the track.
Matthew Galetto is the founder and CEO of MediRecords
Here is the link:
https://medicalrepublic.com.au/you-seek-the-holy-grail/70458
Given that the business of clinicians is to be care providers and not technology support providers, this all makes a great deal of sense. Frankly, were I still providing individual care, I would run, not walk, to a cloud-based system that met my functional requirements and kept my data on-shore!
Does anyone think it makes sense to wait any longer to move to ‘the cloud’?
David.
3 comments:
"There’s widespread acknowledgment that cloud-based solutions offer significant advantages in flexibility, scalability, accessibility, environmental sustainability, security, speed to implementation, and cost efficiency, when compared to legacy systems."
That may be true, but cloud-based solutions are not very resilient, especially in times of emergency.
Hospitals tend to have local no-break electricity generators because they cannot afford to be without power if the grid goes down.
Any health-care provider that provides critical care must always have their IT systems available. Cloud-based systems are dependent on both local power and communications. Multiple, redundant communication systems are very expensive, and even then are susceptible to systemic failures.
When it comes to resilience, the better solution is local, independent, autonomous capability.
By all means implement in the cloud non-critical, non-time-sensitive systems that can recover from communications outages. Designing for that recovery will make them more expensive to build and implement, but the priority with healthcare systems is availability in time of need.
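The "local, autonomous, sync when you can" design this comment argues for can be sketched simply; the file name, table and upload hook below are illustrative only, not any particular product's architecture.

```python
# Sketch of a local-first design: clinical writes land in a local store
# that survives an internet outage, and a background step forwards them
# to the cloud when connectivity returns. All names are illustrative.
import json
import sqlite3

db = sqlite3.connect("local_clinic.db")
db.execute(
    "CREATE TABLE IF NOT EXISTS outbox "
    "(id INTEGER PRIMARY KEY, payload TEXT, synced INTEGER DEFAULT 0)"
)

def record_locally(event: dict) -> None:
    # Always succeeds, online or not: the local store is the system of record.
    db.execute("INSERT INTO outbox (payload) VALUES (?)", (json.dumps(event),))
    db.commit()

def sync_to_cloud(upload) -> None:
    # 'upload' is whatever cloud call is available; a failure leaves the
    # remaining rows queued for the next attempt.
    rows = db.execute("SELECT id, payload FROM outbox WHERE synced = 0").fetchall()
    for row_id, payload in rows:
        try:
            upload(json.loads(payload))
        except OSError:
            break  # still offline; try again later
        db.execute("UPDATE outbox SET synced = 1 WHERE id = ?", (row_id,))
    db.commit()
```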
Running your software in the "cloud" needs to be split into the software and hardware components.
On the hardware side there are advantages in security, UPS quality and the sharing of high-end computers, along with the centralised management of backups, firewalls and software/OS upgrades. Well managed, this should be superior to most practice setups, but it is probably not cheaper, just better quality.
On the software side there are some cloud-based application architectures that can only be run using specific cloud providers, but they still require a lot of software development investment to function, and you do become dependent on internet access. I have run the clinical side of our day surgery in the cloud for over 20 years with redundant internet connections and have had little downtime, and the downtime was generally related to lightning strikes that would also have caused issues with local systems.
There is nothing magical about cloud software, however: you still have to move data around, and standards and compliance remain the limiting factor for interoperability. It does not matter where the software runs if the messages are non-compliant and can't possibly work unless the data is massaged into shape. The advantage, possibly, is that getting access to the data doesn't involve talking your way past the receptionist and the IT person to fix issues.
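To see why "massaging data into shape" is needed at all, here is a toy structural check on an HL7 v2 message; real conformance testing goes much deeper, and the message below is a made-up example.

```python
# Toy illustration of message (non-)compliance: a minimal structural
# check on an HL7 v2 message. Real conformance testing covers data
# types, required fields and vocabularies; this only checks the shape.
def looks_structurally_valid(message: str) -> bool:
    segments = message.strip().split("\r")
    # Every v2 message must open with an MSH segment using '|' separators.
    if not segments or not segments[0].startswith("MSH|"):
        return False
    # Each segment starts with a three-character alphanumeric identifier.
    return all(len(seg) > 3 and seg[:3].isalnum() for seg in segments)

good = "MSH|^~\\&|SENDER|CLINIC|RECEIVER|LAB|202206061200||ORU^R01|123|P|2.4"
print(looks_structurally_valid(good))            # True
print(looks_structurally_valid("garbled text"))  # False
```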
I think the My Health Record is good proof that a central monolithic system containing all the data is not a solution, and that is the ultimate end point of a giant cloud-based system that “solves” the interoperability issues. It does not solve anything.
In the end, as always, it's the quality of the hardware, data and software that matters, and there are issues in all three areas. Poor hardware management is an issue in practices; non-standard-compliant data has not been addressed at even a basic level; and the software is often poorly tested and actually fails with compliant data, so it is fed hacked, non-compliant data because that's the only thing that works, and there is little interest from vendors in fixing those issues, with fixes for major problems outstanding for years. It used to be that if it worked with Medical Director it was right, and we are a bit past that, but overall compliance has probably gone down as the expertise has withered away from disuse. No one cares about glaring compliance issues any more, because they have learnt that no one will ever call them out on it, that they can just bluff their way past it, and that these terrible middleware vendors will fix it so that it works.
As one of the terrible middleware vendors, I would love to stop fixing bad data, as it's often impossible to fix things reliably 100% of the time because the errors are quite random and ever-changing with new releases. Talking about the cloud pushes the raw wounds around and covers some of them up, but they are festering under the bandages, and it will be causing clinical problems; it's just that no one is doing any actual accident investigation. It's like doing aircraft crash investigation by looking at a photo of the plane and saying the paint job looked fine before take-off.
* Backup notebook computer, backup generator, UPS & satellite internet needed to survive local emergencies (governments need to supply emergency space to HCPs).
* Larger facilities aim for a local on-site cloud with a mirror to an off-site cloud & remote backup. They still need backup power & protection/mitigation from fire/flood/storm/earthquake damage (so that, as much as practical, the computers, equipment & consumables can be saved).
* Government/public buildings become hubs to establish temporary internet for essential services (including health care, welfare, emergency response).