Wednesday, October 26, 2011
Another Expert Points Out How NEHTA is Off The Rails and Not Getting The Basics Right - Worth A Close Read!
Dr Andrew McIntyre posted this a few days ago. (Reposted with Permission)
Extreme failure in e-Health programs is in the news, and Australia, as it usually does, appears destined to repeat the mistakes of others.
There is clearly a fundamental error in the approach to the problem and, like the global financial crisis, I would postulate that the error is ignoring the lessons of history and the folly of “generic management”: managers who do not have a deep understanding of what they are managing. Large IT projects fail frequently, and this is well established in the IT world. Top-down, centralised management by people without a deep understanding of IT or medicine virtually assures failure, and Australia has large doses of both.
You cannot build a complex system on poor foundations, and yet that is what is attempted time after time. Just as getting a building out of the ground is an important milestone when building a physical structure, having reliable, well-tested base-level functionality is the foundation of a working e-Health system. Instead we describe castles in the air, like the PCEHR (Personally Controlled Electronic Health Record). I am yet to be convinced that it is the right castle, but to build it, inter-provider messaging needs to be in place first, with the lessons learned and the infrastructure reused. Instead we have an array of unproven, and in most cases non-existent, “proto-standards” proposed as the foundations. In software engineering circles the term “code smell” describes something in the code that is clearly wrong, even if it appears to be working at the moment. To most in the medical software industry, the code smell of the PCEHR is overpowering.
We have no solid standards-based messaging: the SMD specification was created with dependencies on a non-existent NASH (National Authentication Service for Health), a non-existent ELS (Endpoint Location Service), and the recently hacked and over-complex WS-Security. Despite a working, and costly, certificate authority, with most GPs holding Medicare location certificates, the wheel has to be reinvented to satisfy someone’s love of XML-based web services.
The AMT (Australian Medicines Terminology) is brain dead, with no ability to do proper allergy checking or drug-disease interaction checking. We have a licence for SNOMED CT but minimal market uptake of any quality usage of it, and scant localisation.
Our unique patient identifiers have no published quality measures or risk assessment, and yet all the risk has been foisted on the users. Our provider identifiers have seen no real use and have no freely published API, and they are fundamentally flawed because they are not location specific, which means they cannot easily be used for pathology messaging. We need location identifiers badly, but these are optional in the plans!
We will continue to use HL7 V2 for pathology (and clinical) messaging, yet no attempt has been made to ensure basic patient safety is protected: many implementations are non-compliant, and there is no way to be confident that data will be read reliably at the endpoint. Instead we are to introduce new standards without fixing what is in use now and will remain in use for a long time.
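To make the compliance point concrete, here is a minimal, illustrative sketch of the kind of basic structural checking an HL7 V2 endpoint could run on an incoming message. The specific checks and the sample message are my own hypothetical examples, not anything NEHTA has specified; a real conformance suite would go far deeper into segment grammar, data types and the relevant Australian profiles.

```python
# Illustrative sketch only: basic structural sanity checks on an HL7 V2
# message header. A real conformance test would validate full segment
# grammar, data types and required fields against the standard.

def check_hl7_v2(message: str) -> list[str]:
    """Return a list of basic compliance problems found in an HL7 V2 message."""
    problems = []
    segments = [s for s in message.split("\r") if s]
    if not segments or not segments[0].startswith("MSH"):
        return ["Message must begin with an MSH segment"]
    msh = segments[0]
    if len(msh) < 9:
        return ["MSH segment is too short to contain the separators"]
    field_sep = msh[3]                       # MSH-1: field separator, usually '|'
    fields = msh.split(field_sep)
    if fields[1] != "^~\\&":                 # MSH-2: encoding characters
        problems.append("MSH-2 encoding characters should be '^~\\&'")
    if len(fields) < 12 or not fields[8]:    # MSH-9: message type (e.g. ORU^R01)
        problems.append("MSH-9 message type is missing")
    if len(fields) < 12 or not fields[11]:   # MSH-12: version ID (e.g. 2.4)
        problems.append("MSH-12 version ID is missing")
    for seg in segments:                     # every segment needs a 3-char name
        if len(seg) < 3 or not seg[:3].isalnum():
            problems.append(f"Malformed segment name: {seg[:3]!r}")
    return problems

# Hypothetical pathology result message (sender/receiver names are invented).
msg = ("MSH|^~\\&|LAB|RIVERSIDE|GP|CLINIC|202311011200||ORU^R01|0001|P|2.4\r"
       "PID|1||123456^^^RIVERSIDE||SMITH^JOHN\r")
print(check_hl7_v2(msg))  # [] -> no basic problems found
```

The point is not the particular checks but that such verification is cheap and automatable on both the sending and receiving sides, which is exactly the compliance expectation the author argues has never been enforced.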
To build a complex system you need all these building blocks functioning reliably, with compliance expectations on both the sending and receiving sides. This is obviously not sexy enough for the politicians, but we have spent several billion dollars on e-Health in Australia with little return, so hopefully at some point someone will try a different approach and spend a couple of million on a program to mandate compliance with existing, proven standards.
We appear to be able to insist that new drugs undergo trials, yet we continue to foist unproven standards and systems on users without any proper trial. The potential effects of bad e-Health are just as harmful as any other unproven treatment, and it’s time to take patient safety seriously: use proven standards with an expectation of compliance by all players, including the government sector. It would be costly for many non-compliant systems to become compliant, but this would be money well spent, money that has to be spent, and it would have long-lasting benefits. The returns on our current castles in the air will be non-existent.
So what would a good strategy look like? Simply mandate compliance with existing standards and, as a result, create vendor interest in participating in the standards process. Users would need to bear the costs of compliance, and funding could be directed to that end, but the focus of the industry needs to be on quality, compliance and the creation of standards. If that were mandated, end users would have no choice but to pay the increased costs initially, but over time the free flow of reliable clinical data would increase efficiency and ensure patient safety. The privacy issues of provider-to-provider messaging are already known and solved. A base of high-quality implementations would also allow gradual enhancement of the semantics of the content. Without basic compliance and quality in place, the grand plans are a pipe dream.
This entry was posted on Sunday, October 23rd, 2011 at 1:41 pm
The blog is found here:
Andrew’s message on patient safety is an important one - and ought not be ignored!
Posted by Dr David G More MB PhD at Wednesday, October 26, 2011