Friday, June 18, 2010
If Ever There Was Some Research To Learn From This is It!
The following was published a day or so ago in the British Medical Journal.
Trisha Greenhalgh, director1, Katja Stramer, senior research fellow2, Tanja Bratan, research fellow2, Emma Byrne, research fellow3, Jill Russell, senior lecturer2, Henry W W Potts, lecturer3
1 Healthcare Innovation and Policy Unit, Centre for Health Sciences, Barts and The London School of Medicine and Dentistry, London E1 2AD, 2 Division of Medical Education, University College London, 3 Centre for Health Informatics and Multiprofessional Education, University College London
Objective To evaluate a national programme to develop and implement centrally stored electronic summaries of patients’ medical records.
Design Mixed-method, multilevel case study.
Setting English National Health Service 2007-10. The summary care record (SCR) was introduced as part of the National Programme for Information Technology. This evaluation of the SCR considered it in the context of national policy and its frontline implementation and use in three districts.
Participants and methods Quantitative data (cumulative records created nationally plus a dataset of 416 325 encounters in participating primary care out-of-hours and walk-in centres) were analysed statistically. Qualitative data (140 interviews including policy makers, managers, clinicians, and software suppliers; 2000 pages of ethnographic field notes including observation of 214 clinical consultations; and 3000 pages of documents) were analysed thematically and interpretively.
Results Creating individual SCRs and supporting their adoption and use was a complex, technically challenging, and labour intensive process that occurred more slowly than planned. By early 2010, 1.5 million such records had been created. In participating primary care out-of-hours and walk-in centres, an SCR was accessed in 4% of all encounters and in 21% of encounters where one was available; these figures were rising in some but not all sites. The main determinant of SCR access was the identity of the clinician: individual clinicians accessed available SCRs between 0 and 84% of the time. When accessed, an SCR seemed to support better quality care and increase clinician confidence in some encounters. There was no direct evidence of improved safety, but findings were consistent with a rare but important positive impact on preventing medication errors. SCRs sometimes contained incomplete or inaccurate data, but clinicians drew judiciously on these data along with other sources. SCR use was not associated with shorter consultations or reduction in onward referral. Successful introduction of SCRs depended on interaction between multiple stakeholders from different worlds (clinical, political, technical, commercial) with different values, priorities, and ways of working. The programme’s fortunes seemed to turn on the ability of change agents to bridge these different institutional worlds, align their conflicting logics, and mobilise implementation effort.
Conclusions Benefits of centrally stored electronic summary records seem more subtle and contingent than many stakeholders anticipated, and clinicians may not access them. Complex interdependencies, inherent tensions, and high implementation workload should be expected when they are introduced on a national scale.
The full paper and extras can be accessed from this link.
Some early commentary is available here:
16 Jun 2010
UCL has spent three years evaluating the Summary Care Record and has now issued a 234 page report on the subject. Fiona Barr reads how ‘The Devil’s in the Detail’ of this huge, but apparently disappointing, undertaking.
Long awaited and much anticipated, the final report of UCL’s independent evaluation of the Summary Care Record has just been published.
Just a look at the title – ‘The Devil’s in the Detail’ – tells you a lot of what the researchers want you to know. This is a complex issue with no simple outcomes or pat answers.
Nevertheless, the report’s finding that there have been only modest benefits from the SCR and really no benefits from HealthSpace might prompt questions about whether the two schemes should be scrapped.
A key caveat – tucked away in a line at the beginning of the report – is that the evaluation was carried out at a time when few SCRs existed and the functionality of HealthSpace was much less than its creators hoped it would be.
This may enable those in favour of the projects to argue that the long term benefits have yet to appear - although, conversely, the argument that benefits will only come once a scheme becomes universally adopted has its own flaws.
Those in favour could also argue that emergency summary record systems have already been delivered elsewhere in the UK; so why should England not follow suit?
Although the Department of Health is promising to consider the evaluation’s findings, it sounds as if it has already made up its mind that the SCR programme will continue. A future for a functionally rich HealthSpace is harder to envisage.
However, if the DH decides its £1m investment in the evaluation is worth examining in detail, the final report contains a wealth of information on what progress has been made so far and what it implies for the steps that will need to be taken in future.
Lots more here:
The message for Australia, NEHTA and DoHA is crystal clear. Shared care summary records are a very difficult undertaking in a range of dimensions that far exceed the technical.
These conclusions say it all:
“Conclusions Benefits of centrally stored electronic summary records seem more subtle and contingent than many stakeholders anticipated, and clinicians may not access them. Complex interdependencies, inherent tensions, and high implementation workload should be expected when they are introduced on a national scale.”
If there is even the slightest commitment to truth and honesty within NEHTA and DoHA, they need to bring the full report to the Government’s attention, together with their plans for how they will overcome the issues identified in the UK.
To do less would just be dishonest. Australia must not replicate the mistakes made in England and the way to do that is to learn very carefully from their experience. No centralised system should be contemplated without good answers to all the issues raised in this evaluation.
The full evaluation report is available here:
Mandatory weekend reading!
(And before anyone feels the need to tell me about them, yes, I am aware of some simpler models that are apparently working better – but still with significant issues – in Scotland and Wales).
Posted by Dr David G More MB PhD at Friday, June 18, 2010