Quote Of The Year

Timeless Quotes - Sadly The Late Paul Shetler - "It's not Your Health Record, it's a Government Record Of Your Health Information"


H. L. Mencken - "For every complex problem there is an answer that is clear, simple, and wrong."

Wednesday, December 26, 2018

Stop Me If You Have Already Heard This One Before!

This popped up last week – again:

Media release - Secure messaging to replace outdated fax machines

20 December 2018: The future of secure messaging and interoperability across the Australian health sector was discussed at a recent workshop held by the Australian Digital Health Agency (ADHA) in Sydney, attended by more than 50 state and federal government officials, industry stakeholders and international experts.
One of the ADHA’s key priorities is to help healthcare providers across Australia communicate quickly, easily and securely, and to reduce the sector’s current reliance on outdated technologies like the fax machine.
For consumers, secure messaging means that when they are being treated by different doctors in different locations and transferred from one doctor or hospital to another, their doctors and healthcare providers can securely communicate and provide safer and better care.
At the second Secure Messaging Industry Collaboration Workshop held in Sydney on 27 November 2018, Medical Software Industry Association President Emma Hossack, ADHA Chief Operating Officer Bettina McMahon, and Dr Nathan Pinskier, Clinical Advisor to the ADHA’s Secure Messaging Program, signed a communique committing to further collaboration on the adoption and implementation of secure messaging.
Several clinical software vendors are already providing secure messaging solutions to specific markets within Australia’s healthcare system — but the aim going forward is to implement a nationwide solution that embraces existing solutions and unifies them seamlessly.
“Secure messaging is a foundational capability enabling interoperability and safe, seamless, and secure information sharing between healthcare providers,” Ms McMahon said.
“Nationwide adoption of secure messaging will enhance the security, safety and efficiency of clinical information sharing across all sectors — ultimately aiding the provision of better healthcare for the community.
“To realise this goal, through events like the Secure Messaging Industry Collaboration Workshop, the ADHA is working collaboratively with industry, suppliers of secure messaging solutions, and clinical software vendors to reduce existing barriers to adoption and to provide pragmatic and implementable solutions.”
A key priority for the ADHA is the creation of a transparent, national directory of service providers that can be used for secure messaging.
At the workshop, participants agreed to continue working towards this goal, with the target of delivering a minimum viable product by June 2019.
The end result will be the equivalent of a national ‘yellow pages’ for all registered healthcare providers, enabling them to easily contact each other.
Working groups were also established to help identify and troubleshoot barriers to the adoption of secure messaging, improve the clinical experience, and to develop an industry alliance participation agreement and trust framework.
Ms McMahon said discussion at the Secure Messaging Industry Collaboration Workshop was highly constructive — and the future of secure messaging is in good hands.
“The Secure Messaging Industry Collaboration Workshop brought together some of Australia’s and the world’s best and brightest minds in healthcare IT,” she said.
“We will continue to work together collaboratively to strengthen and develop secure messaging and interoperability within the healthcare landscape, for the benefit of all Australians.”
Here is the link:
All I can say is that it must have been a pretty slow news day to have this put out!
Just in passing I had an appointment with a neurologist during the week and was really impressed technologically with what I saw.
He was able to access test results, reports and the actual images from both the local private and public hospitals.
He conducted a lot of complex tests with results captured and then placed in his letter to the referring doctor.
He used electronic speech recognition to dictate his letter back to the referring doctor, with keyboard corrections needed only very infrequently.
For his purposes the myHR is an irrelevant waste of time as he already has the tech support he needs to do all he needs to serve his specialist practice.
Of course he also has a fax machine for those not quite at his level.

PS - Remember the rest of the world is also still working on the fax 'problem'.

Most hospitals still use mail or fax to exchange data

Dec 19, 2018 12:46pm

More than 7 in 10 hospitals still use fax to transmit records.
Health systems use numerous methods to exchange patient medical records, but providers continue to rely heavily on the old-fashioned approach of mail or fax, according to new federal data on interoperability.

Nearly three-quarters of non-federal acute care hospitals routinely use fax or mail to receive summary of care records from providers outside their system, according to new data released by the Office of the National Coordinator for Health IT. Two-thirds of health systems use fax or mail to send records.

Paper-based methods outpaced electronic transmission, particularly in receiving records. More than 4 in 10 systems said they “often” use mail or fax while 27% said they often use a standalone health information service provider like DirectTrust.
Centers for Medicare & Medicaid Services (CMS) Administrator Seema Verma has pushed the industry to adopt electronic methods for data exchange, calling on developers to help "make every doctor's office in America a fax free zone by 2020."



Anonymous said...

The language exposes the depth of appreciation of the outcomes. I do like the use of "outdated technologies". I wonder if they mean the GovHR system.

Have a great New Year David and thanks for operating an open and transparent blog, it helps balance the debate and reminds there is a future.

Anonymous said...

Do you mean that we are still where we were? A scan of the ADHA SMD specs indicates not much happened after 2012 (knowledge gone, I am sure). Then there are these parallels - http://www.aph.gov.au/DocumentStore.ashx?id=aa98750a-bafd-4706-8a9a-39d54c172f59

Same shit, different ass as they say.

Anonymous said...

December 26, 2018 5:58 PM. "The language exposes the depth of appreciation of the outcomes". That would be somewhere between a puddle and a dry riverbed, then. The ADHA should only be seen for what they are: a Federal chequebook. They can buy a seat at the table, but other than picking up the bill they have little to contribute, and this (insert) the fax is just silly.

tygrus said...

As they always say "Don't let the fax get in the way of a good story".

Have a Happy New Year and best wishes to all for 2019.

Andrew McIntyre said...

I think this is a great example of: "The Vision of the Anointed: Self-Congratulation as a Basis for Social Policy"


The SMD standard makes all the easy bits complex and uses crazy inefficient technology like xml encryption while being heavily biased towards outdated store and forward technology. They have virtually ignored payload compliance and that makes any messaging very unreliable. ADHA are clueless as to what needs to be done to progress the field....

Dr Ian Colclough said...

That is a very powerful statement Andrew. You are either right or wrong I don't think there is a valid halfway position. Over the years you have been consistent in stating your views; a consistency which I have not seen evident coming from ADHA, the Department, the Government or the Peak bodies. This is most troubling which I can only account for as being due to a combination of confusion, politics, mis-information, various degrees of ignorance and a failure to understand and precisely define the whole problem to be solved and parts thereof. Even more disconcerting is the lack of support being publicly forthcoming from some of the more high profile bodies which purport to speak on behalf of the technology vendors. I have yet to hear any good reason not to support your well reasoned views; perhaps this now offers some the opportunity to speak up and be heard.

Andrew McIntyre said...

It's just common sense, Ian. What do you think happens when a message fails to import properly or does not display correctly? Sometimes it vanishes silently, which is a worry; sometimes it crashes the endpoint system, which generates irate users and blame for the messenger even if the message was 100% compliant. Sometimes it goes into the wrong section of an endpoint system, or is incompletely displayed, potentially hiding information that on occasions would be critical. Many of the errors are less critical, but the lack of reliability in moving data is a clinical concern that we deal with through a constant cycle of testing and release via continuous integration, releasing new versions at times within days of problems being identified or appearing in new versions of endpoint systems. My experience is that the quality of even lab data is getting worse rather than better; the ADHA and NEHTA have been oxygen thieves, and the focus on quality in the pathology space has been neglected.

Currently safe delivery only works when we have control of delivery from start to finish, with knowledge of endpoint weaknesses and a relationship with the recipient that allows us to investigate and fix problems. Solving the actual delivery (messaging) problem is a very small part of the picture, and until systems can actually reliably exchange messages via a system of comprehensive compliance checking (mandatory to operate in the space), adding interoperable messaging will act as an "n squared" multiplier for the risks of poor compliance in both receiver (most important, in fact) and message creator systems. In the end it's patients who suffer because of a lack of reliable data related to their care. Delivering unreliable data can be worse than no data.
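The "n squared" point Andrew makes here is straightforward combinatorics: with n independently implemented systems, every directed sender-to-receiver pair is a distinct integration that can fail in its own way, so the number of combinations a compliance regime must cover grows quadratically, not linearly, with the number of vendors. A trivial sketch of the arithmetic (the numbers are illustrative only):

```python
def directed_pairs(n: int) -> int:
    """Number of distinct sender -> receiver combinations among n systems."""
    return n * (n - 1)

# Each pair is a separate integration that can fail independently, so the
# untested surface grows quadratically as vendors are added to the network.
for n in (5, 10, 30):
    print(f"{n} systems -> {directed_pairs(n)} sender/receiver pairs")
```

With the 30 vendors mentioned later in this thread, that is 870 distinct sender/receiver pairs, which is why per-pair manual testing is hopeless and a shared, mandatory compliance suite is the only thing that scales.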

Oliver Frank said...

Andrew McIntyre's report that sometimes messages that are not imported properly vanish silently touches on a concern that I have expressed in the last couple of weeks to my GP colleagues and to the MSIA.

My concern is that GPs are not being actively notified immediately if a message can’t be delivered.

My 20th article in Medical Observer published on 4th September 2017 about how to improve GPs’ software outlined this need and proposed a solution that I now believe is not enough by itself: https://www.dropbox.com/s/qsrhub6ja584ewu/Number%2020%20Show%20delivery%20of%20secure%20messages%20Medical%20Observer%20published%204%20Sep%202017.docx?dl=0.

The MSIA circulated this article to its members at that time, and, I believe, has had it available on its Website for members since then. Despite this, GPs still don’t receive any notification from their clinical software packages if a message that they send via a secure messaging provider can’t be delivered.

This creates potentially serious clinical and legal risks. GPs might find out that a letter or other message wasn't delivered only if and when:
- they learn from the patient, the intended recipient of the message or the patient’s lawyer that the patient wasn’t seen as intended
- they happen to view the relevant log files in their clinical or secure messaging system

For GPs, being able to search in various logs for failed messages is not good enough. Nobody in any general practice is going to do this regularly or frequently (enough).

Argus sends an email message to an email address specified by the practice (usually this will be the practice manager’s email address) if a message hasn’t been acknowledged within a specified number of days by the recipient’s system. This is a halfway solution and is better than nothing.

I understand that ReferralNet can be configured to show pop up notifications about the success or failure of delivery. This is better because it notifies the user immediately.

At least one secure messaging provider feels that it is better for secure messaging providers to send failure messages to the user’s clinical software system, which will then notify the sender of the failure.

This issue really needs to be addressed if we want GPs to adopt secure messaging as the preferred and default method of clinical communication.
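The active notification Oliver is asking for could, in outline, be as simple as recording every outbound message and surfacing anything not positively acknowledged within a deadline, rather than burying failures in a log nobody reads. A minimal sketch, with the caveat that the class and method names here are hypothetical and do not describe any vendor's actual API:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class OutboundMessage:
    message_id: str
    recipient: str
    sent_at: datetime
    acknowledged: bool = False

class DeliveryTracker:
    """Track sent messages and flag any not acknowledged within a deadline."""

    def __init__(self, ack_deadline: timedelta):
        self.ack_deadline = ack_deadline
        self._messages: dict[str, OutboundMessage] = {}

    def record_sent(self, message_id: str, recipient: str, sent_at: datetime) -> None:
        self._messages[message_id] = OutboundMessage(message_id, recipient, sent_at)

    def record_ack(self, message_id: str) -> None:
        # Called when the receiving system positively acknowledges the message.
        if message_id in self._messages:
            self._messages[message_id].acknowledged = True

    def overdue(self, now: datetime) -> list[OutboundMessage]:
        """Messages that should trigger an immediate alert to the sender."""
        return [m for m in self._messages.values()
                if not m.acknowledged and now - m.sent_at > self.ack_deadline]
```

The design point is that `overdue` is pushed at the clinician (a pop-up or inbox item in the clinical software), not left for someone to discover by searching log files.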

Bernard Robertson-Dunn said...

Why not utilise the secure messaging functionality that has been developed for, and which was required by, the PCEHR?

As the ConOPs said in November 2011, Section 2.1:

"In order to fully realise the benefits of this investment, the states and territories will need to continue their planned or expected investments in core health information systems. States and territories will also need to provide the complementary investments to build their capacity in readiness for connection to the PCEHR System.

The Department of Health and Ageing and NEHTA are currently working with each of the state and territory health departments to implement a range of foundations, including Healthcare Identifiers, Discharge Summaries and Secure Messaging, all of which will be required for the PCEHR System."

Maybe this is one reason why the Conops is not available from the ADHA website - it's a vision that has never been properly implemented and is full of vacuous promises. It is a damning indictment of the government's ability to deliver anything useful or of value to medical practitioners.

But still they promise and still they waste money and opportunities.

Anonymous said...

Bernard, the secure messaging work was stopped mid 2011 by order of the PCEHR executive. The team was slowly provided opportunities to spend time with family and then the last remaining ones were dumped in the July 2012 “restructure”. This lines up with all major releases of specifications, tools and technical standards.

Just another example of how the PCEHR/MyEHR has sucked the life out of the agenda.

Andrew McIntyre said...

December 28, 2018 10:48 AM

This has been my take on the various national eHealth authorities. They have been an active force to stop advancement of point to point eHealth and existing standards. There seems to be a grand plan to replace that with a central database with mostly non-atomic data (e.g. all pathology is PDF in CDA).

Why anyone would do this is beyond me; I doubt they can build (or afford) a system to cope with the load that would generate. It would also mean decision support was impossible. I don't think there is any sensible direction from the top, just a folly of the anointed? There is no appetite to improve what is working at a basic level; it really just needs an expectation of, and testing of, standards compliance, but unless people are told to do that it's clear they won't bother, as it would require significant effort to fix the problems we know exist in the real world.

Bernard Robertson-Dunn said...

Fax is a point to point solution. It is used by the health industry because it is a solution, but not necessarily the best or even a good solution - as is now becoming more and more obvious.

I would have thought that the health care industry needs, at a minimum, a one to many solution, if not a mesh solution on which to build capability that finds, acquires (pulls) and transfers (pushes) information based upon consent and need-to-know models.

A bit like the original vision for the PCEHR, except there was no consent or need to know models.

As the country yokel said to the lost stranger: if you want to get to the city, I wouldn't start here. myhr is not a good thing to start from.

Anonymous said...

So Bernard @7:11 PM the logical extension of what you are saying is that replatforming the myhr is also not a good place to start from.

That being the case the only logical reasoning which explains why the Federal and State Governments and Territories continue pouring hundreds of millions of dollars a year into the myhr is purely an act of face saving incompetence.

Bernard Robertson-Dunn said...

...or the data in myhr has value to the government that has nothing to do with clinical medicine or even some sort of benefit from putting "your health record in your hands" whatever that bit of nonsense actually means.

Anonymous said...

So if messaging is not necessarily "standards" compliant, and GPs and/or specialists across the board are not informed if a transmission has been received and actioned, then surely the MyHR must be rejecting them as well? I mean, the myhr requires clinical payloads to be conformant to a specific set of criteria prior to being accepted and presumably filed. So therefore, in the name of patient safety, the ADHA must be aware of, and thus informing the senders and the patient, that a record was presented but failed to meet the acceptance criteria, and explain what that criteria is?

Seems to me the myhr could be losing anywhere between 1% and 70% of all records. I base this on the huge gap between the number of accounts created in the myhr and the number of active records, and the volume of PBS and TGA data. All rough, back-of-the-envelope estimates.

In the ADHA's defence, I am sure they track this and have done so from the outset; no one is stupid enough not to monitor such events in a system of record, and especially in a national database being communicated as a game-changing, patient-empowering, 360, faxination to all our health woes. Surely patient safety is a top priority and the privacy of patient and caregiver is paramount.

Anonymous said...

Surely patient safety is a top priority and the privacy of patient and caregiver is paramount.

silly boy (or girl)

Andrew McIntyre said...

MyHR does not use SMD and has different formats to what is used for clinical communication, so it has no impact on general clinical communication, which has been ignored for years. In general the pathology that will go into MyHR is CDA-wrapped PDF with no atomic data, unlike what is sent to providers, which has some atomic data. This was all developed before the federal government decided to "help" eHealth by creating the anointed authorities to "regulate" eHealth. Point to point messaging has been the poor cousin, with no significant funding for years.

Bernard Robertson-Dunn said...

"MyHR does not use SMD and has different formats to what is used for clinical communication ..."

i.e. it's not a good place to start from, not when it needs fixing first.

Heather Leslie said...

It is disappointing that so little progress has been made. After all, we are talking about a largely technical solution which, in principle and with clear leadership, should be 'relatively easy' to put in place. (Clearly it hasn't been.)
Still, the 'elephant in the room' for me is the 'payload compliance' that Andrew mentioned in one of the early posts in this thread. Even when we finally have a messaging solution that works, if we haven't considered standardisation of the payload then we will find ourselves little further advanced in terms of interoperability of health information.
I recently asked Dr Alan Finkel, our Chief Scientist, about what Australia was doing to standardise clinical data. He proceeded to inform me that it was the holy grail that people have been trying to crack for over 40 years but that there has been no success to date. I beg to differ. One approach has been available for more than 10 years and is a home grown Australian invention - openEHR.
One of the little known and briefest 'achievements' during the time of NEHTA occurred in February/March 2016, when all of the State Health CIOs signed off on using the NEHTA Clinical Knowledge Manager tool as a means to collaborate on and create a national clinical data dictionary - a foundation for digital health interoperability, independent of any single vendor, application, project or implementation technology. They requested GeHCo to provide some training to underpin this - https://www.digitalhealth.gov.au/news-and-events/news/nehta-s-clinical-knowledge-manager-ckm-featured-in-health-it-to-lead-or-be-led-seminars.
Three months later, NEHTA morphed into ADHA, the leaders of this work left NEHTA, and effectively all corporate knowledge of the basis of this decision was lost! Twelve months ago the license for CKM lapsed and the tool is no longer available, with no one in ADHA seeming to understand why they had it or what value it might provide.
In fact, the CKM contained the results of clinician and vendor collaboration on structured data that formed the basis for the CDA messages out of the primary care systems and into the first iteration of the PCEHR. Evidence of that collaboration has effectively been lost, although the outputs have been leveraged and further refined in the international openEHR CKM. In turn it has also contributed significantly to the FHIR Allergy resource following a cross SDO collaboration effort.
In Europe and South America there is increasing momentum in multilingual interoperability of health data using openEHR, with 'messaging payload' being considered only one component of the depth and breadth of clinical data standardisation that will be required for serious interoperability, beyond the obvious limitations of v2 and FHIR messaging.
In comparison to our problems with messaging, standardising clinical content is quite different - a largely sociotechnical issue, about coordinating the humans and gathering and collating their collective knowledge. A summary of the international effort to date has been posted as part of our Christmas message to the openEHR community - you may be interested to read... In Australia we would benefit enormously from participating in this international effort - let's stop reinventing the semantic wheel! https://openehr.atlassian.net/wiki/spaces/healthmod/blog/2018/12/21/373358615/Christmas+greetings+from+the+CKM+team+-+2018

Kind regards


Anonymous said...

Oliver, how often does your Internet bank system lose messages which vanish silently?

Anonymous said...

"Oliver, how often does your Internet bank system lose messages which vanish silently?"

What does a bank have to do with it? Other than that both the banking sector and the eHealth industry need a royal commission, I don't get the reference.

If a message is silently lost how would you ever know?

Andrew McIntyre said...

While openEHR has some good ideas, some of which, like Archetypes, I fully support, I don't think wholesale change to something new is appropriate, as the path to that is very rocky, especially when people will need to continue to support things like PIT, HL7V2 and CDA. What we need is actual tested compliance with what they currently support, and current standards, rather than running to something new as a solution to poor implementations. There is a lot of value in HL7V2 that we are not even close to exploiting. The problem is poor implementations in both senders and receivers rather than a limitation of the format we are using. New formats like FHIR or openEHR will do nothing; we need a culture of safe, high quality, tested implementations of what's in use, and then if we decide to use something else as well, implementations should also be tested and compliant with anything new before an endpoint is allowed to send or receive it. The issue at the moment is limitations in implementations and not limitations in standards. The shortest path to good eHealth is compliant implementations of current formats without actually adding anything new, which would just dilute the ability of vendors to improve their abilities. MyHR, PCEHR and HealthConnect have all diluted the focus of vendors.
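The "tested compliance" argued for here starts with structural checks that, by the accounts in this thread, many endpoints skip. Purely as an illustration (a real conformance suite would go far deeper into segments, datatypes, display rules and version-specific behaviour), a minimal sanity check of an HL7 v2 message header might look like this:

```python
def check_msh(message: str) -> list[str]:
    """Very small structural sanity check of an HL7 v2 message header.

    Illustrative only: a real conformance suite would validate segments,
    datatypes, required fields and version-specific rules in depth.
    """
    problems = []
    segments = message.split("\r")  # HL7 v2 segments are CR-delimited
    msh = segments[0]
    if not msh.startswith("MSH|"):
        problems.append("message must begin with an MSH segment")
        return problems
    fields = msh.split("|")
    # fields[1] is MSH-2, the encoding characters; MSH-1 is the '|' itself.
    if fields[1] != "^~\\&":
        problems.append("unexpected encoding characters in MSH-2")
    if len(fields) < 12 or not fields[11]:
        problems.append("missing version ID in MSH-12")
    elif fields[11] not in {"2.3", "2.3.1", "2.4", "2.5", "2.5.1"}:
        problems.append(f"unrecognised HL7 version {fields[11]!r}")
    return problems
```

Even a check this shallow would catch a class of malformed messages before they reach an endpoint; the point of a mandatory compliance regime is that every sender and receiver passes a much richer battery of tests than this before being allowed to operate.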

Andrew McIntyre said...

On the issue of disappearing messages, I often speak to secretaries who say "No, we don't have any results from you". Medical-Objects is fully traceable and I can say, no, it was delivered at 10:21am and imported by your PMS system at 10:23am, so please have another look or do an import. The issue is systems that silently fail an import but just ignore it, or place content in a failed folder without notifying the user. A bigger issue is incorrect display, and any compliance testing needs to start with a set of test messages that stress systems' ability to handle different datatypes, follow the display rules, unescape reserved characters, and read different versions of HL7 in the correct way. The problems are always related to implementation issues, and not the standard itself. Without formal testing vendors just insist that their interpretation is correct and it's not their problem. Once you have 30 vendors with this attitude and no one doing any formal testing, inter-operability is a pipe dream, so you need an interface engine in between them so they can all be "right" and things still "work".
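The "unescape reserved characters" step mentioned above is a common source of the display errors in question: HL7 v2 reserves |, ^, &, ~ and \ as delimiters, so free text carries them as the escape sequences \F\, \S\, \T\, \R\ and \E\, which the receiving system must convert back before display. A minimal sketch of that step, assuming the default delimiters (the standard defines further sequences, such as hex escapes and formatting commands, which are out of scope here):

```python
# HL7 v2 escape sequences for the default delimiters.
_ESCAPES = {
    "F": "|",   # field separator
    "S": "^",   # component separator
    "T": "&",   # subcomponent separator
    "R": "~",   # repetition separator
    "E": "\\",  # the escape character itself
}

def unescape_hl7(text: str) -> str:
    """Convert HL7 v2 escape sequences back to literal characters for display."""
    out = []
    i = 0
    while i < len(text):
        # An escape sequence is backslash, a known code, backslash.
        if (text[i] == "\\" and i + 2 < len(text)
                and text[i + 2] == "\\" and text[i + 1] in _ESCAPES):
            out.append(_ESCAPES[text[i + 1]])
            i += 3
        else:
            out.append(text[i])
            i += 1
    return "".join(out)
```

A receiver that skips this step renders "Smith \T\ Jones" instead of "Smith & Jones" - exactly the kind of incorrect display that only a shared set of stress-test messages reliably catches.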

Heather Leslie said...

The value of good standards for messaging is not in dispute. I have no preference for message formats. It is a purely technical issue from my POV.

For the past 15 years my work has been in standardising the clinical content in a vendor-neutral and technology-neutral way, such that it can be used to support interoperability in any message, EHR/EMR/CDR, data registry, CDS, AI etc etc. I'm talking about standardising the 'little data' as a foundation for any/all health information exchange or persistence, but defined by the clinicians and outside of the vendor, project, organisation or application silo. It is a vision for the future that has been tested and refined in over 20 years of implementations.

It is currently transforming implementations in Europe, especially in the NHS trusts across the 5 UK nations and throughout the Scandinavian countries. In these areas there is a rapidly growing groundswell of clinicians, informaticians, vendors, CIOs and organisations who are refusing to engage in healthIT religious wars, instead using a variety of standards and tools, each according to its strengths - openEHR and SNOMED CT alongside FHIR, V2, and IHE. Common sense is prevailing for once. Semantic interoperability (not limited by messages alone), and high quality health data that is vendor-agnostic, is the intent.

We should all be watching and learning, not perpetuating the old ways of doing things that have not delivered as promised.

Here’s a quick video by Andy Blofield, CIO at Plymouth NHS trust about what they are calling the ‘bimodal’ approach - https://www.youtube.com/watch?v=EQ3CStL8oNo

The Bimodal approach is explained further here:
- This video on the 'Post Modern EHR', quoting Gartner re openEHR, is what is underpinning this growing European view. Up to 5:15 is setting the context; thereafter compelling viewing... The guy presenting is Tomaž Gornik, who is co-chair of the openEHR Foundation and CEO of the Slovenian company, Marand.
- Apperta Foundation, supported by NHS Digital and NHS England – white paper on open platforms in health - https://apperta.org/openplatforms/.



Anonymous said...

One of the little known and briefest 'achievements' during the time of NEHTA occurred in February/March 2016,

Hate to break it to you but that was just part of a broader and brutal game where many good people were treated terribly by some who were simply nasty sods. If the person you refer to is who I think you mean, then it is no loss at all. A long trail of betrayal and damage.

I would suggest that the right standards be used for their intended purpose, and where there is a need, consensus on how standards coexist and work together for a specific implementation within a broader system-of-systems understanding. Some are good for internal use, some good for exchange and some good for the EHR. As Andrew rightfully points out, compliance with what is available and in use would be a remarkable forward step.

Heather Leslie said...

It is a truly bizarre experience trying to respond to an anonymous person making vague comments about unknown people doing unnamed things.

I identified one event that not many people are aware of but may be of interest. If you're not interested, that's fine.

We're all advocating for using the right standards for the right purpose. I've provided some examples of where it is working well elsewhere in the world. Again, if you're not interested, I'm also fine with that.

Unfortunately, we keep trying to do the same things the same old way and expecting different results... when will we learn from the success of others?

There was a link that went missing in my last message - the one about the 'Post Modern EHR'. Here'tis - bit.ly/2xCATKs



Dr Ian Colclough said...

I think Heather's work, and that of those who have gone before, focusing on standardising clinical content and the development of archetypes has great potential and in time may well provide a widely acceptable formalised structure for the way clinical content is captured and recorded. I started on that path a long time ago, in 1972-1974 at King's College Hospital. Approaches and techniques have advanced hugely since then, thanks to the maturation of tools, technology and understandings. Keep up the good work Heather - patience is a virtue.

Andrew, I believe, has a somewhat more pragmatic view. One which I strongly support, given the health sector environment and the hugely complex and conflicting vested interests, cultures and politics operating across the entire health sector.

At the risk of being repetitive, Andrew is correct when he says:
(i) wholesale change to something new is not appropriate, as the path to that is very rocky, especially when people will need to continue to support things like PIT, HL7V2 and CDA.

(ii) what we need is actual tested compliance with what they currently support, and current standards, rather than running to something new as a solution to poor implementations.

(iii) the problem is poor implementations in both senders and receivers rather than a limitation of the format we are using.

(iv) we need a culture of safe high quality, tested implementations of what’s in use, and then if we decide to use something else as well, implementations should also be tested and compliant with anything new before an endpoint is allowed to send or receive it.

(v) the shortest path to good eHealth is compliant implementations of current formats without actually adding anything new, which would just dilute the ability of vendors to improve their abilities. MyHR, PCEHR and HealthConnect have all diluted the focus of vendors.

(vi) the problems are always related to implementation issues, and not the standard itself. Without formal testing vendors just insist that their interpretation is correct and it's not their problem. Once you have 30 vendors, with this attitude and no one doing any formal testing, inter-operability is a pipe dream, so you need an interface engine in between them so they can all be "right" and things still "work".

Best summed up by saying "It's just common sense".

Grahame Grieve said...

If Andrew is correct, and it's so obvious, then why hasn't anything happened? If it's a culture problem, why do we bleat about the government as if it can magically fix the culture, and blame it for being part of the culture?

I think that there are multiple underlying factors here. One of them is the fact that the messaging format is old with custom - and tricky and poorly documented - syntax rules. Another is that the messaging format offers no leverage to make investment worthwhile. For those reasons (mainly the second) it isn't wrong to migrate to new and better formats. If they really are.

I do agree with Andrew, though. The implementations are crap - poorly tested and with woeful error handling. What can change that? I think there's a clue here: "the problems are always related to implementation issues, and not the standard itself" - why not? What good is the standard if it doesn't drive quality implementations? But whatever, that's only a partial solution, because until the purchasers of the software care, they won't pay, either in $$ or attention.

Having said all that, there is a new group coming out of the ADHA/MSIA work on secure messaging that is chartered to do something about this. I'll do what I can to help, but it remains to be seen whether there's real drive to do anything.

Dr David G More MB PhD said...

Grahame and others,

What everyone is saying is that there needs to be determination and commitment to have any real outcome.

The evidence that this exists, from Kelsey et al., is bugger all.

This has dragged on for over a decade, so there is blame to be shared around ++++

Fucking killing faxes is OK if you have a viable, usable replacement...

The end users are really wondering: will anything ever happen that is safe and useful?

The incompetence and pathetic game-playing is a joke. What about the experts just getting in a room and fixing it????

Sad and deeply annoyed observer of a level of bullshit that even ScoMo and Dutton could not arrange.


Grahame Grieve said...

Actually, many years ago, we (the experts) did just get in a room and fix it, at the behest of the MSIA. We listed all the problems we knew about, agreed what the correct solution to them all was, wrote that up as a specification, and created and published a set of test messages. Andrew's team wrote and published additional testing software (kudos!).

And then.... nothing. It didn't go anywhere. But I put it to you: it's not the experts we are waiting on right now.

Though I think one thing this does demonstrate is that the attempt to create a single national clinical repository has so far only served to distract us from real working solutions, as Ian said (and as I said in my Senate submission).

Dr Ian Colclough said...

Grahame @ 9:00 PM asks "If Andrew is correct, and it's so obvious, then why hasn't anything happened?"

One thing is clear: it's not through lack of resources; there have been ample funds available.

Have there been too many cooks? Or too many committees? Or too many stakeholders? Or a lack of discipline? Or too much hype and flying by the seat of the pants? Or a lack of intelligent, informed, competent leadership? Or any and/or all of the above, to mention just a few possibilities?

Andrew McIntyre said...

It's a governance problem; well, actually, a lack of governance is the problem. The government have attempted to lead/innovate as only government people can do, i.e. badly. The simple truth is that if critical medical information is to travel electronically it needs to be safe, and the government's role is to simply say: "if you have published a standard and people's lives depend on it, then we insist that everyone that uses it meets a high standard in creation and consumption". They don't even have to be the testing body, just insist that those tests be developed and done.

Adding some $ to GPs to compensate for the increased cost of software would make the transition easier. It should not go direct to software companies, as it has in the past, but into the hands of the users of software, as long as their software is proven to be compliant (and will be more expensive as a result).

As Grahame said, a lot of work was done to detail the problems that need fixing, and I have personally spoken, in an increasingly hostile manner, to every version of the national eHealth authority, and pleaded and of late demanded that they act on this issue, but nothing happens. Someone with more interest in or knowledge of the politics of this can perhaps tell me why not, as it's not exactly expensive compared to the overall budget that has been poured down the drain. I suspect some of those drains are connected to troughs, but I don't have direct evidence of that.

The politicians/AMA/RACGP etc. don't really have enough technical knowledge of the issues to grasp the situation, and have been sold enough lemons over the years; but the people keep changing, each one replaced by a new technically naive person who gets sold a newly polished lemon. To his credit, Tony Abbott smelt a rat, but failed to fix it. I suspect the only political fix is the mother of all recessions, when health $$ can't be thrown around by flashing the credit card. That might happen?

Anonymous said...

@5:56 PM ... You ask how you would ever know if a message was silently lost in your banking system. I would certainly know if I made a $3,000 transfer from one bank account to another and the transaction was silently lost, leaving me $3,000 out of pocket.

Bernard Robertson-Dunn said...

Andrew said: "if critical medical information is to travel electronically"

Define critical.

Spend a week in hospital and look at the amount of data they generate about you: measurements taken every few hours, test results, details of drugs prescribed, progress reports, etc. There are pages and pages of the stuff.

How much of that is critical? Important? Who says so? What's the cost of analysing the data and producing a valuable discharge summary?

Once you leave hospital, most of that data is probably useless. That's the big problem with health data - it ages and is very dependent on context. Too much data, especially if it is badly curated, is worse than too little.

Interoperability and secure messaging are technical problems but, on their own, deliver limited benefits and have significant costs and risks.

Real value comes from interpreting the data and making good-quality, targeted predictions. I don't see anything in MyHR that delivers either. Do clinical systems? When I see my GP, he seems to spend a lot of time looking at his computer screen and entering data. I don't get the impression that the system is telling him anything useful.

Dr Ian Colclough said...

Indeed, Bernard. You raise another part (and there are many) of the multi-segmented, complex (and wicked) problem. I have never seen a crystal-clear 'definition' of a medical record, including all its components and how they are 'used', structured, updated and accessed. Multiple players seem content focussing on the part of the problem which interests them most and leaving other parts for others to deal with.

Andrew McIntyre said...

It's concerns about too much data that led to the concept of a referral, which is designed to include a summary of the relevant information without a 3-year bowel chart to wade through. The real advantage of atomic pathology data is that it can be evaluated in software, and should be, in a cumulative fashion. 50 ELFTs over 10 years can be shown on one screen, and you can see any trends, or lack of trends, at a glance. Transferring large amounts of atomic data is much less of a concern than transferring renderings of results that require a human to view them. Currently most referrals just copy the text version of pathology data, so I always request the data direct from the lab, and I myself have the ability to transfer any pathology data to anyone in its original atomic format without going via the lab.

If we fast forward to MyHR, with 50 PDF versions of ELFTs over 10 years, each with 20 different analytes, it's a different story: you have to view and download 50 PDFs and, just like paper, flick back and forward to try and look for any significant changes. It's time-consuming, it's difficult to see trends, and you can't write decision-support algorithms to evaluate it for you. This is the issue I have had since the 80s, and it was only in the late 90s, with the arrival of HL7 V2 pathology, that this could be solved. Over a long period of care many patients have two-monthly blood tests, with probably 50 individual atomic data points each time blood is collected, often from different labs with different reference ranges, units and even different analytes. This is where software can contribute to actually supporting decisions, and none of this is possible with the data in MyHR. I guess you would have to do what I used to do in the 90s: get out the paper and transcribe your own cumulative results!
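Andrew's "one screen" cumulative view is only possible because HL7 v2 carries each analyte as an atomic OBX segment. As a rough, hypothetical sketch of why (field positions follow HL7 v2.x ORU conventions: OBX-3 identifier, OBX-5 value, OBX-6 units, OBX-7 reference range; real messages also need escape-sequence, repeat-separator and per-lab handling that this omits), grouping years of results per analyte becomes a few lines of code:

```python
# Hedged sketch: extract atomic analyte values from HL7 v2 OBX segments
# so results from many reports can be cumulated in one view. Simplified
# field handling; real-world parsing is considerably messier.
from collections import defaultdict

def parse_obx_values(message: str) -> list[tuple[str, float, str, str]]:
    """Return (analyte, value, units, reference_range) for numeric OBX segments."""
    results = []
    for segment in message.split("\r"):  # CR separates HL7 v2 segments
        fields = segment.split("|")
        if fields[0] != "OBX" or len(fields) < 8:
            continue
        # OBX-3 is usually code^name; keep the human-readable name if present
        analyte = fields[3].split("^")[1] if "^" in fields[3] else fields[3]
        try:
            value = float(fields[5])  # OBX-5 observation value
        except ValueError:
            continue  # skip non-numeric observations in this sketch
        results.append((analyte, value, fields[6], fields[7]))
    return results

def cumulate(messages: list[str]) -> dict[str, list[float]]:
    """Group values by analyte across many reports: the 'one screen' view."""
    series = defaultdict(list)
    for msg in messages:
        for analyte, value, _units, _range in parse_obx_values(msg):
            series[analyte].append(value)
    return dict(series)
```

Nothing comparable is possible when the same values arrive as 50 rendered PDFs, which is the core of the MyHR criticism above.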

Even from a pure patient perspective, atomic data would allow people to write algorithms that scan their data looking for patterns that indicate early disease. Doing that with PDF versions is near impossible. As our understanding of things like metabolic disease advances, we could go back over old results with new algorithms to alert users to issues that we didn't recognize when they were done. Ideally this would happen continuously, scanning all results with new algorithms to look for past missed patterns. This is something I am currently very interested in. MyHR data is useless for this. Patient-held data should not leave its original format, so patients have access to this type of technology as well.
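The retrospective-screening idea in the last paragraph can be illustrated with a deliberately simple rule. This sketch is hypothetical and not clinical advice: the "three consecutive rises ending above a threshold" rule is invented purely to show that such algorithms operate on atomic values, and can be re-run over old results whenever a new pattern becomes worth looking for.

```python
# Hypothetical screening rule run over a patient's historical atomic
# results: flag an analyte whose last three values rise consecutively
# and end above a threshold. Illustrative only, not a clinical rule.

def flag_rising_trend(values: list[float], threshold: float) -> bool:
    """True if the last three results rise consecutively and finish above threshold."""
    if len(values) < 3:
        return False
    a, b, c = values[-3:]
    return a < b < c and c > threshold

# Example: fasting glucose creeping upward across years of results
# (mmol/L, oldest first) - invisible in 5 separate PDFs, obvious here.
glucose_history = [5.1, 5.0, 5.4, 5.8, 6.3]
```

As new patterns are recognised, the same history can simply be re-scanned with a new rule, which is exactly the continuous retrospective screening described above.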