Friday, February 25, 2011

This Issue Is A Real Sleeper as We Move More Into E-Health. How To Preserve Important Health Information over the Long Term?

The following note appeared a day or so ago.

How to Preserve EHRs for the Long-Term?

HDM Breaking News, February 18, 2011

Federal agencies will sponsor a two-day workshop, April 5-6 in Bethesda, Md., on long-term preservation and management of electronic health records.

Sponsors include the National Institute of Standards and Technology, National Institutes of Health/National Library of Medicine, Department of Veterans Affairs, and the National Archives and Records Administration. Presenters include renowned informaticists and CIOs in addition to federal policymakers.


Registration will soon be available at

--Joseph Goedert

Here is the description of the workshop from the web-site above.

Workshop on Long-term Preservation & Management of Electronic Health Record

Background: Electronic health-related patient information is vital for clinical care and medical research. However, systems interoperability for preservation, storage, and accessibility of such health data has not yet been defined. Clinical data in digital form represents a digital library, and inherits the same administrative and technical issues faced by digital libraries in other fields: what to retain and for how long; how to handle obsolescence of hardware and software; interchange of information; costs; assignment of responsibility; standards. In addition, clinical data involves issues of privacy, legal constraints, economics, and data ownership that complicate preservation even further. If preservation of clinical information is not addressed, valuable and irreplaceable information will become inaccessible, or disappear over time, with disastrous consequences for patient care and research value. Replacing lost data, even if possible, will entail huge costs for patients, clinicians, administrators, pharmacists, and potentially the entire country’s economy.

Challenges: How to preserve and provide access to electronic clinical data as electronic health records (EHRs) for a sufficiently long period of time to maximize value to patients, caretakers, and scientists.

Actions: To ascertain current practices for long-term preservation and lifecycle management of EHRs, including an interoperability framework that supports a wide variety of data types, data formats/records, and data delivery mechanisms, while providing a technology-independent infrastructure to acquire, store, search, retrieve, migrate, replicate, and distribute EHRs over time.

The expected outcomes will be the following:

  • Understand the current landscape of EHRs
  • Survey current practices and identify best strategies to be used as models
  • Begin to develop requirements, technologies, standards, and best practices for long-term preservation and life-cycle management of EHRs
  • Differentiate between requirements for patient care and those for secondary use
  • Identify cultural and technological challenges
  • Catalog current legal requirements for retention of EHRs
  • Identify interested collaborators to form a WG in this area
  • Discuss possible test scenarios and datasets for collaboration and a testbed

Participants: Policy makers, EHR experts, hospitals, laboratories, pharmacies, consumers, attorneys, representatives of CMS and ONC

----- End Extract

There are a huge number of issues raised here and there is no doubt that exactly the same issues apply in Australia.

When you consider that there are already General Practice systems in Australia that have well over a decade’s worth of information stored, every year that goes by makes these records more valuable and potentially more useful.

At present we have no agreed Standards for the storage and formatting of Health Information that permit portability of the information and useful access to that information into the future.
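To make the portability point concrete, here is a minimal sketch of what a vendor-neutral export might look like. All field names and the format label are hypothetical illustrations (loosely inspired by standard clinical document formats, not any agreed Australian standard); the point is that a self-describing, versioned format can outlive the software that wrote it.

```python
import json

def export_record(patient: dict, observations: list) -> str:
    """Serialise a patient record to a self-describing, vendor-neutral
    JSON document. All field names here are illustrative only."""
    document = {
        "format": "example-clinical-record",  # identifies the schema
        "version": "1.0",                     # allows future migration
        "patient": {
            "id": patient["id"],
            "name": patient["name"],
            "date_of_birth": patient["dob"],
        },
        "observations": [
            {
                "code": obs["code"],   # ideally a standard clinical code
                "value": obs["value"],
                "unit": obs["unit"],
                "recorded": obs["date"],
            }
            for obs in observations
        ],
    }
    return json.dumps(document, indent=2)

# A record exported this way carries its own structure and version,
# so a future system can still parse it without the original software.
doc = export_record(
    {"id": "p-001", "name": "Jane Citizen", "dob": "1950-04-12"},
    [{"code": "271649006", "value": 132, "unit": "mmHg", "date": "2011-02-20"}],
)
```

The design choice that matters is not JSON itself but the embedded format identifier and version, which give a future migration tool something to dispatch on.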

This looks like a job that the giant intellects in NEHTA should really address, and sooner rather than later. Maybe a similar workshop in Australia, after attending the US one, might not be a bad idea?



Anonymous said...

This was attempted via the MSIA just a few years ago, but DoHA completely screwed up the funding arrangements and it didn't get off the ground.

Reason: When HCN pulled the pin, the project became an impossibility as the removal of their market share from the pie meant that DoHA's target percentage for the project could not be achieved, even if every other vendor except HCN played ball.

This said, technically, there isn't much risk of data being locked up in GP or Specialist systems, as mature data conversions exist for all products that I'm aware of, including many defunct products that have gone by the wayside.

However the ad hoc transfer of individual patient records from one practice to another would greatly benefit from a resuscitation of the aforementioned MSIA project, and DoHA should fund this as a priority, ideally without NEHTA's involvement.

Anonymous said...

How naive: MSIA - just a whole lot of feral cats chasing their tails.

Anonymous said...

The MSIA project was heavily flawed and did not require vendors to produce standard messages, but was instead to use an "API". A silly idea: they need to produce and consume high-quality messages, and then things will start working!

Anonymous said...

Friday, February 25, 2011 8:57:00 PM said "The MSIA project was heavily flawed" - yes and yet the MSIA and the members it chose to do the job put themselves up as 'the e-health experts'.

It is fair and reasonable to ponder the question that if the so-called e-health experts set out on a heavily flawed pathway where would we be if they had been able to continue. In a mess?

What pisses me off most is that these 'experts' put themselves up as knowing how to do it and expect the rest of us to listen to them. Then the moment they get a chance to demonstrate their prowess and competence they trip over their bloody shoelaces and fall flat on their faces. And then, because of intermittent memory loss, they stick their heads up a year or two later and try to convince those who will listen with the same old rhetoric.

If they were so good they wouldn't have screwed up in the first place. Who were these people? Were they a faceless committee of vested interest vendors trying to feather their own nests? These are important questions which need to be answered transparently so that lessons can be learnt so that MSIA can perhaps recover some credibility before it tries to step up to the plate again.

If we can't rely on the expertise buried deep within its cohort of members - what can we rely on?

Anonymous said...

"If we can't rely on the expertise buried deep within its cohort of members - what can we rely on?"

Well, the MSIA membership list shows 100 member companies, which includes most of the Australian health and medical software developers. In other words, MSIA, through its members, would seem to have access to a broad, deep range of ehealth expertise, and there is no reason to believe otherwise. Even so, we do have to be very concerned that this illustrious group of ehealth problem solvers couldn't solve the problem of how to work together when they were given the chance to do so, as commented on above.

It's all well and good to blame HCN, as Friday, February 25, 2011 3:52:00 PM did, but that's a pretty lame way of finding someone to blame, irrespective of who "pulled the pin". Surely MSIA's whole approach was not predicated upon HCN's co-operation. If it was, then whoever came up with that strategy should be shot at dawn.

Anonymous said...

The MSIA project did not *require* the use of an API. The MSIA project specified a data format to be used in the transfer of data. An API was recommended as part of the solution because it was thought that some vendors would have difficulty creating or processing the data format. A vendor could have used the API or the wire format.

A model for this was Medicare Online Claims which has an API and a data format. Medicare prefers that vendors use the API.
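The API-versus-wire-format distinction above can be sketched in a few lines. Everything here is a made-up illustration (the message layout and names are not the actual Medicare Online specification): the wire format is what actually travels, and the API is a convenience layer that emits it, so either integration path must yield the same message.

```python
def build_claim_wire(provider: str, item: str, amount_cents: int) -> str:
    """Construct the (hypothetical) pipe-delimited wire format directly.
    A vendor taking the wire-format path writes this message itself."""
    return f"CLM|{provider}|{item}|{amount_cents}"

class ClaimAPI:
    """A thin helper API that shields vendors from the wire format.
    Internally it still emits exactly the same message."""

    def __init__(self, provider: str):
        self.provider = provider

    def submit(self, item: str, amount_cents: int) -> str:
        # The API path and the wire-format path converge here.
        return build_claim_wire(self.provider, item, amount_cents)

# Both integration paths produce an identical message on the wire:
direct = build_claim_wire("123456A", "23", 3500)
via_api = ClaimAPI("123456A").submit("23", 3500)
```

This is why an API recommendation need not conflict with a specified data format: the API is just one producer of the format, offered for vendors who would struggle to build the messages themselves.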