This popped up last week and struck a chord.
A Wide Range Of Problems Still Holding Back Health Data Interoperability
April 17, 2020
A new health data interoperability survey by Healthcare IT Today with 82 responses suggests that when it comes to interoperability, healthcare organizations are struggling with many of the same challenges they faced five or even 10 years ago.
Without a doubt, respondents are still interested in fostering health data sharing. Their objectives for doing so include improving patient care, streamlining revenue cycle management functions, increasing the accuracy of patient records, making sure patients have access to those records, supporting population health and value-based contracting, and automating more processes.
However, few organizations are meeting these goals, according to the respondents. “Most systems are only using the interoperability piece such as direct messaging to throw information back and forth to satisfy the requirements for MIPS/MACRA,” one health IT leader wrote. “Clinics are still relying on faxing and phone calls to obtain the data they need. Staff go through the motions to send and receive the data into their EMRs but rarely use the data for caring for the patient.”
The truth is that there are still many hurdles providers face when meeting this goal, including the limitations of their EHR, foot-dragging by vendors, a lack of adequate standards, difficulty adopting what standards do exist and the cost of making interoperability work, respondents said.
For example, Health IT pros complained that their EHRs were ill-equipped to handle robust data-sharing. “Outside vendors aren’t able to share data and resources, and there’s no consistent reporting of the same information between organizations,” one respondent wrote. “This makes interoperability difficult.”
When asked who’s most responsible for the lack of interoperability in healthcare, EHR vendors (42%) were by far the most cited entity, followed by hospitals, health systems and medical practices (30%), government (23%) and doctors (6%). (Respondents were offered “patients” as a choice, but none selected it.)
Survey respondents were particularly unhappy with the problems EHR vendors are bringing to the table, including a lack of co-operation with their peers and continued resistance to embracing data sharing generally.
In addition, many argued that despite years of intense effort, data interoperability standards aren’t where they need to be. “Programs that allow for cross-communication should be able to talk to multiple providers no matter what [providers] are using,” one respondent noted.
Not only that, some seemed resigned to data sharing remaining awkward for the foreseeable future. “The horse is already out of the barn,” another respondent wrote. “This should have been done years ago when the installations were taking place.”
More here:
It has always been my view that anything that was obviously useful and could be easily achieved would typically be done in a trice.
The fact that pretty much everyone has struggled with similar issues for what seems an eternity suggests reaching the goal is much harder and more complex than it seems at first glance!
David.
18 comments:
I'm not sure if 82 responses to a survey is sufficient to shed any new light on this issue, although it does highlight the need for key foundational 'building blocks', notably common patient identifiers.
However, as an active participant in the HL7 FHIR Project, I do believe that this represents a significant step forward in facilitating interoperability, notably its recognition that a modern, fit-for-purpose, standard needs to be more than just a set of published artifacts. It also requires a highly-engaged community and, in the case of an international standard, a large worldwide group committed to working together and producing common, best-of-breed solutions.
The fact that FHIR is based on Web Standards and the FHIR Community can engage freely on that platform is a genuine 'game-changer'.
I agree to some extent that FHIR offers a potential solution to the problem of technical interoperability - getting any data in any format from system A to system B. Without doubt that is a step in the right direction for the industry.
This is possibly because FHIR uses much more modern ways to express data than prior standards, which still make up the majority of systems in use.
There is enormous variation in the way information is collected, coded, and processed. Not an easy fix even with adopting mappings. However progress is (or was) being made.
I would be careful with using terms like ‘game-changing’. What is the game you are playing? What is the change? It is a bit like using ‘transformation’: it means many different things to different people; you quickly form mental models and expectations that are difficult to satisfy, and everyone walks away dissatisfied and less likely to champion the next ‘game-changer’.
I am in agreement that FHIR is more than just a collection of interesting artefacts and the community that has formed globally is impressive and demonstrates standards can be developed in many ways without losing what makes standards so valuable.
"There is enormous variation in the way information is collected, coded, and processed. Not an easy fix even with adopting mappings. However progress is (or was) being made."
This is the big issue and at the heart of the complexity of interop.
You can take it from me that the Father of FHIR (Grahame Grieve) is well aware of the issue. FHIR was born as a response (in part) to the complexity of all this in HL7 V3.
To me it is clear that without proper data standardisation you will get nowhere fast with interop - and don't get me started on semantic interop! This is all really hard.....
David.
I think most, if not everyone, involved in this domain is well aware of the data quality issues and it's true that this won't be solved by a (predominantly) health information exchange standard alone. Certainly, in my day job which involves developing and supporting common components in national systems (e.g. GP2GP), I see this as the major barrier to interoperability: particularly with regard to 'legacy' data.
Here in New Zealand, we are now seeing some promising 'Data Governance' initiatives - another step in the right direction which I'm hoping will be linked to international initiatives on the usability of healthcare systems such as the 'Reducing Clinical Burden' project undertaken by the HL7 International EHR Work Group.
As for the expression 'game-changer', sure that is a rather overloaded term. In this case, I was referring to the standards development and implementation processes which have been transformed/disrupted/revolutionised (choose your term of choice) by the HL7 FHIR Project.
Peter there are two distinct issues I believe - both very hard to fix.
1. Data quality.
2. Data standardisation, representation, structuring and coding - including semantics.
Real interop needs both.
David.
If we use the analogy of the international telephone system there are two domains
1. Getting information from one place to another. That is an engineering problem full of engineering standards.
2. Enabling people to use the system to communicate with each other. At the moment, people have to be able to speak and understand the same language. As long as the participants in a conversation agree to a common language they can communicate. That's dot point 2 in David's comment.
What FHIR seems to be trying to do is to provide a common language that any two people can use even if they do not speak the same language. They need to translate their language into the common language and the reverse when they receive a reply.
If this is true then I suggest it's a very, very difficult problem to a) solve and b) maintain as the requirements change, which they most certainly will.
If the problem could be solved it would bring great benefits, but (and there is always a but) has anyone looked at the problem that includes all the associated issues and requirements outside the more narrow scope of the common language?
Is interoperability today's Esperanto? A good idea in theory but it failed in practice.
Of course, we might say that English has become the common language rather than Esperanto, but English is less than an optimal solution, although it might be the best we can do.
In practice, English is used by native English speakers and some who have learned English as a second language, although there is probably scope for misunderstandings in the second group.
There are many parts of the world where other languages are used within countries and groups of countries with a common language.
Excuse my ignorance, but is FHIR an analogue of English, of any other single language, or an interlanguage, i.e. one that can translate between different languages?
To put it another way, is the aim to enable a doctor who is using a Chinese EHR to interoperate with an Australian doctor using an English EHR?
Attempts to standardise may well prove futile. There is so much in clinical medical terminology and the practice thereof in terms of workflows, procedures, processes and techniques that will elude attempts to standardise forever.
Perhaps new technology which provides greater flexibility on capturing, manipulating and presenting information in multiple ways to address information flows in complex clinical situations might overcome the standardisation barriers and illuminate the pathway to a new way of moving forward.
Radical? I know. Possible? Probably. Where to start? At the beginning.
To David's point, I think that those two issues are strongly related. I see data quality as the outcome and standard representations of that data as a means of achieving that outcome.
In response to Bernard, I don't believe it's accurate to view FHIR as the equivalent of Google Translate for healthcare information. However, as it is built upon existing web standards, FHIR permits users to specify their language of choice when exchanging data using the http protocol.
The key to the shared understanding and usage of that data is FHIR's common information (resource) model. This applies no matter what language, or format, that model is rendered into when exchanged. Shared understanding is further enhanced when the value of individual data elements is expressed using standard terminologies, such as SNOMED CT.
Re: In response to Bernard, I don't believe it's accurate to view FHIR as the equivalent of Google Translate for healthcare information.
I wasn't suggesting FHIR was a translation process; I was likening it to "FHIR's common information (resource) model".
Your next sentence is "This applies no matter what language, or format, that model is rendered into when exchanged." There would appear to be a translation process into and out of the FHIR model.
Is this correct? If so, it would appear to be analogous to (e.g.) translating from English into Esperanto and then Esperanto into (say) Chinese.
The point I was attempting to make, if not very clearly, is that language translation is not a specific concern of FHIR any more than it is of any request-response exchange over the internet, whether it's coming from any web application that provides an API. If you're interested in attempts at creating common spoken and written languages, such as Esperanto, I'd suggest reading a book that Grahame Grieve recommends to all at HL7 - "In the Land of Invented Languages" by Arika Okrent.
The kind of translation that occurs in systems using FHIR is more commonly a mapping between an application's native information model and the FHIR Resource Model. For example, an application may contain a class of information called a 'Problem List' and this would be mapped to the Condition Resource in FHIR.
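To make that mapping idea concrete, here is a minimal sketch of what mapping a native 'Problem List' entry to a FHIR Condition resource might look like. The internal field names (`problem_name`, `snomed_code`, `onset`) are invented for illustration; the FHIR element names (`resourceType`, `code`, `subject`, `onsetDateTime`) and the SNOMED CT system URI come from the FHIR Condition resource definition.

```python
# Hypothetical sketch: mapping a native problem-list record to a FHIR
# Condition resource, represented here as a plain Python dict (FHIR
# resources serialise naturally to JSON).

def problem_to_condition(problem: dict, patient_id: str) -> dict:
    """Map one internal problem-list entry onto the FHIR Condition model."""
    return {
        "resourceType": "Condition",
        "clinicalStatus": {
            "coding": [{
                "system": "http://terminology.hl7.org/CodeSystem/condition-clinical",
                "code": "active",
            }]
        },
        "code": {
            "coding": [{
                "system": "http://snomed.info/sct",   # SNOMED CT terminology
                "code": problem["snomed_code"],
                "display": problem["problem_name"],
            }],
            "text": problem["problem_name"],
        },
        "subject": {"reference": f"Patient/{patient_id}"},
        "onsetDateTime": problem["onset"],
    }

# Example: one native record (field names are assumptions, not a real schema)
problem = {"problem_name": "Asthma", "snomed_code": "195967001",
           "onset": "2018-03-01"}
condition = problem_to_condition(problem, "example-123")
```

The point of the sketch is that the translation is structural: the application keeps its own internal model and maps it onto the common resource model at the boundary.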
If we're really going to pursue a linguistic metaphor, then the right metaphor is Chinese, where two different languages (Mandarin and Cantonese) share the same written form, which works incompletely well (as compared to pinyin). FHIR would be the written language in that metaphor.
And "In the land of invented languages" is a great read ;-)
From my wide-ranging, high-level perspective, and notwithstanding Bernard and Grahame's comments in particular, my inclination is to embrace the views proposed @ 11:57 AM.
It seems to me that by doing so it allows new thinking and a new approach to get underway, all the while accepting that as Grahame's FHIR matures and takes hold it can in due course be brought into the 11:57 AM project and embedded therein.
What I have been wrestling with in this discussion, and for many years prior, is the 'apparent' resistance of technology advocates (like FHIR and others) to embrace 'this new approach and new thinking' in parallel, without letting emerging, promising, yet unproven, technology impede the new approach to unravelling the information solution.
Perhaps I am naive. Perhaps the technology advocates should be the ones to lead the way. Yet, somehow, after a decade or more of following the technology-pipers we remain knotted-up in circular arguments resulting in no real progress.
"Perhaps the technology advocates should be the ones to lead the way. Yet, somehow, after a decade or more of following the technology-pipers we remain knotted-up in circular arguments resulting in no real progress."
I wonder if it is more that the agenda keeps getting hijacked by various parties, and as the results start to hit the fan they lob the ball back along with the blame. Might also be useful to define what is meant by technology?
Thanks to Peter for starting a useful discussion
@12:14 PM It makes sense to me with one important point of clarification required before I could give it my full endorsement.
Does FHIR act like an API, whereby software already developed and widely deployed can become interoperable with other software already developed and widely deployed by having data pass back and forth between both software applications through FHIR acting as the intermediary?
In reply to @5:01 PM - FHIR is a platform specification that, among other things, can be used to build APIs that enable different healthcare software applications to exchange information.
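As a rough sketch of what "building APIs" on FHIR means in practice: a FHIR server exposes RESTful endpoints, and clients search over plain HTTP, getting back a Bundle resource containing the matches. The base URL below is hypothetical; the search-parameter names (`patient`, `clinical-status`) are standard FHIR Condition search parameters, and the Bundle/entry shape follows the FHIR specification.

```python
# Sketch (assumed server URL): searching a FHIR server for a patient's
# active Conditions over its RESTful API.
from urllib.parse import urlencode

BASE = "https://fhir.example.org/r4"  # hypothetical FHIR endpoint


def condition_search_url(patient_id: str, status: str = "active") -> str:
    """Build the HTTP search request for a patient's Conditions."""
    query = urlencode({"patient": patient_id, "clinical-status": status})
    return f"{BASE}/Condition?{query}"


def extract_conditions(bundle: dict) -> list:
    """Pull the Condition resources out of a FHIR search-result Bundle."""
    return [entry["resource"]
            for entry in bundle.get("entry", [])
            if entry["resource"]["resourceType"] == "Condition"]
```

Any application that can issue an HTTP GET and parse JSON can participate, which is the sense in which FHIR's use of web standards lowers the barrier to entry.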
With regard to some of the previous comments that question what's perceived to be a technology-first approach, I believe that many in the FHIR Community are well aware that engagement with all stakeholders is required for us to succeed in achieving interoperability at all levels. This is the only way in which clearly-defined requirements - the key to the success of any project - and 'coalitions of the willing', crucial to any interoperability project, can be created. As HL7 doyen Wes Rishel once observed...“Interoperability is not a boat race. One team can’t win by rowing better than another. We are all rowing the same boat.”
From 5:01 PM Thank you Peter HL7NZ your opening comment is reassuring. Your clarification enables me to give FHIR my full endorsement.
Apropos your subsequent comment ...."that engagement with all stakeholders is required for us to succeed in achieving interoperability at all levels" ..... I do not think lack of engagement with all stakeholders will block the achievement of interoperability.
Rather, the central objective should be to ensure (all) participants in (a) project adopt and embrace the rules-of-engagement. These participants form your "coalition of the willing".
Subsequent participants seeking to join (a/the project) and remain an active participant must adopt and embrace the rules-of-engagement. Failure to do so will result in their immediate removal from the project.
Do you have any problems / concerns with this stance?
The problem is that health data is complex, changing and 2 equivalent clinicians will not even agree on what the right data actually is! We need a combination of formats, templates and terminology to make this work to any degree and that is a stack of functionality that results in huge complexity once real world problems are tackled.
The lower layers in a stack must work flawlessly or the whole stack quickly falls over. FHIR has addressed compliance in a more robust way than V2 ever did. FHIR uses formats that are more up to date, and there are off-the-shelf tools, but there are costs in terms of file size etc. HL7 V2 requires a low-level parser, which does require some computer science ability to construct, and V2 reuses data structures in different contexts, with different fields conditionally valued, whereas FHIR has more specific, single-purpose data structures, but ends up with a lot more of them. Complexity rarely evaporates.
In the end complex problems require well-engineered solutions, and much of the problem with eHealth is that the solutions are not well engineered enough, rather than a "bad format". There are trade-offs no matter which way you go, but the level of complexity is much the same in the end, and unless a lot of effort is spent understanding the layers, and testing, testing, testing to ensure all levels are robust, you will end up in a mess, which is where we are currently. The lack of requirements for compliance with standards, many of which are lower-level layers, dooms big-picture solutions to failure as the stack of functionality sinks into the swamp of errors in the lower layers. Imagine transmitting and evaluating a digital signature if tcp/ip garbled a few bytes of data at seemingly random times. The message format standards sit above these layers (at Health Level 7, in fact) and need to be just as reliable as transport if you want safe, meaningful health care at the other end of the wire. We need robust engineering of existing solutions in use, not quests for the holy grail.
Andrew makes a critical point on this and other threads on this blog - the need for granular implementation testing. This applies to all standards and, in fact, software application development and operations as a whole. In my long career, both in and outside of the healthcare domain, a lack of independent, professional-level testing ranks alongside inadequate requirements as the most common source of failure.