Thursday, June 28, 2012

A Really Worthwhile Look Back At The UK NHS Program for Health IT. It Started A Decade-Plus Story That Needs To Be Told!

The following pair of very useful articles appeared a little while ago.

Horrible history part one: here comes the 21st century

Ten years ago, the document that led to the creation of the National Programme for IT in the NHS was launched. Lyn Whitfield re-visits ‘Delivering 21st century IT’.
11 June 2012
It is June and the government has set out a ten-year “vision” for information in the NHS. As the result of a new strategy, the patients of the future will “see that their health records are always available to staff” and be able to “help to maintain the quality of those records” by getting access to them.
The time that healthcare staff spend with patients will be “spent more effectively” because of the information at their fingertips. Data will also be opened up to healthcare managers and researchers and to new services such as telemedicine, which will become “commonplace.”
Of course, it is not June 2012 and the strategy is not ‘The Power of Information: putting us all in control of the health and care information we need.’ Instead, it is June 2002, and the strategy is ‘Delivering 21st century IT support for the NHS: national strategic programme.’
Ruthless standardisation
‘Delivering 21st century IT’ is the document that paved the way for the National Programme for IT in the NHS. Its big innovation was not its vision – which it shared with earlier NHS IT strategies, as well as later ones – but the mechanisms it put in place for delivering that vision.
As it said upfront in its opening paragraphs: “The core of our strategy is to take greater control over the specification, procurement, resource management, performance management and delivery of the information and IT agenda.
“We will improve the leadership and direction given to IT and combine it with national and local implementation based on ruthless standardisation.”
Specifically, a ministerial taskforce was to be established under the chairmanship of Lord Hunt, a former head of the NHS Confederation, who had been made a Labour peer after the 1997 general election and was health minister in the Lords.
A new NHS IT programme director was to be appointed to lead on what Lord Hunt himself described as “the IT challenge of the decade.”
Standards for data and data interchange and system specifications for a new, National Health Record Service were to be set at a national level. And there was to be a big shake-up of procurement arrangements, with “consortia of suppliers” bidding for the work.
Finally, strategic health authorities were to appoint chief information officers to make sure that primary care trusts and providers “implement and use the core IT solutions determined at a national level.”
A product of its time
‘Delivering 21st century IT’ did not come out of nowhere. In 1998, the NHS had published ‘Information for Health’, a well-received strategy written by NHS IT pioneer Frank Burns, that proposed a rather different set of delivery mechanisms.
An NHS Information Authority was set up to create a national IT infrastructure, to run electronic patient record ‘beacon’ projects, to set standards for increasingly sophisticated ‘levels’ of EPR functionality, and to measure progress against targets for deploying that functionality to hospitals.
However, it left trusts to procure their own systems to meet these targets. And by the start of 2002 it was obvious that they were going to be missed.
IfH’s failure was blamed on technical issues, on trusts spending money that was supposedly ring-fenced for IT on other pressures, including a fledgling reform programme, and on the sheer difficulty of procuring systems from a “cottage industry” of suppliers.
But while the strategy had faltered, the pressure on the NHS to make better use of IT had grown. IfH was launched against a background of Tory “cuts” in the health service and Labour promises to restrain growth during its first term in office.
The strategy itself was to be funded from a £5 billion modernisation fund that had other calls upon it.
Yet in January 2000, Prime Minister Tony Blair was bounced into promising a massive increase in NHS funding in response to media stories about the NHS failing to cope with winter pressures.
The Department of Health quickly insisted that more money would have to be accompanied by ‘reform’ and launched The NHS Plan.
This included some ideas for getting the NHS to adopt the kind of consumer-facing technology that had been adopted by other industries – such as ‘airline-style booking.’
Meanwhile, a furious Treasury had asked a former banker, Derek Wanless, to investigate the demands that the health service would need to make on it in the future.
At the start of 2002, Wanless (who died recently) reported that if spending was going to be kept under control, the population would need to become healthier and the NHS would need to become more efficient.
He saw a big uptick in IT adoption as part of the second half of the equation, and proposed that NHS spending on IT should rise to £2.7 billion a year over a three-year period to deliver big gains in productivity.
The final part of the jigsaw was that Downing Street was keen on NHS IT, thanks to a seminar at Downing Street at which Microsoft chief executive Bill Gates, in the UK to promote Windows XP, persuaded Blair and his advisors to replace a local approach to NHS IT with a national one.
The rest of the beginning of the story is found here:

Horrible history part two: things fall apart

The National Programme for IT in the NHS got off to a flying start; but soon started to go off-track. Lyn Whitfield looks back.
13 June 2012
Lots omitted as the rush to disaster seemed to accelerate.
So good they abolished it twice
As the programme struggled, the Department of Health’s commitment to its approach declined. In 2006, in a neatly Orwellian touch, it announced a ‘national local ownership programme’ to give SHAs much more responsibility for shaping and delivering NHS IT requirements.
Then, after Richard Granger completed an extended “transition” out of his post in 2008, Matthew Swindells, the chief executive of Royal County Hospital NHS Trust and a ministerial advisor, was brought in to carry out a review.
This urged trusts to focus on creating a “tipping point” in demand for strategic IT systems, by focusing on what became known as the Clinical 5 - a patient administration system, order comms, discharge letters, scheduling and e-prescribing.
Christine Connelly, who succeeded Swindells as the NHS chief information officer, went further. She talked about recasting the national elements of the programme and creating an “app store” for the NHS (which became the interoperability toolkit).
She promised a new “connect all” approach, in which there would be more choice for trusts and “multiple systems in different places” as a result of that choice. And at the end of its term in government, Labour lopped £500m off the nominal cost of the programme.
Despite this, it remained an irresistible target for media pundits and politicians. In opposition, future Prime Minister David Cameron dubbed the project the “NHS supercomputer”; in government, his health ministers abolished it not once but twice.
In September 2010, Simon Burns announced that £700m would be cut from NPfIT, that the oversight of national projects would move from CfH to “new arrangements” by 2012, and that trusts would be allowed to choose from “a more plural system of IT and other suppliers.”
A year later, following a scathing report from the National Audit Office, and an even more scathing investigation by the Commons’ health select committee into what the programme had delivered by way of health records and into what the NHS had paid for them, Burns did the same again.
The zombie NPfIT
Yet, as EHI editor Jon Hoeksma noted at the time, NPfIT continues to have a kind of zombie existence. After more than 18 months of drafting, ‘The Power of Information’ failed to explain what will happen to CfH, or who will be responsible for infrastructure, standards and national projects in the future.
CSC has been locked in negotiations over a new LSP deal for the NME for 18 months. A deal that would have delivered what the government called “savings” of £1 billion on its £3 billion contract looked close this spring. But a ‘standstill agreement’ between CSC and the DH was recently extended to 31 August.
Trusts in the South that were not covered by the BT deal were promised a systems procurement using the Additional Supply Capability and Capacity framework. But this collapsed at the end of December 2011, after almost two years of effort.
These trusts have yet to hear whether an alternative way to deliver national support and funding will be found. And, amazingly, the £700m of legal action triggered by Fujitsu’s departure has yet to be resolved.
And, of course, the vision of ‘Delivering 21st century IT’ has not been delivered. The patient experience of the NHS has not been transformed by technology; health staff continue to lack universal access to sophisticated health records; and data for commissioning and research remains hard to gather and analyse.
‘The Power of Information’ did not make the mistake that ‘Delivering 21st century IT’ made of drawing up a national plan to try and impose its vision on local NHS organisations.
On the other hand, it said virtually nothing about how its remarkably similar ten-year vision for NHS IT would be achieved. So the question may be: will no plan succeed where the ‘national strategic programme’ failed to get results?
Full article here:
This really makes riveting reading as you watch ten years pass in just a few minutes.
The strategic instability, the lack of clinician engagement and so on are all there. Most worrying is the length of time the programme has persisted through so many attempts to kill it off and perhaps start again.
The parallels to what is presently happening in Australia are all too obvious.
All in all - compulsory reading.


Cris Kerr said...

An honourable overall mission for Australia's ehealth program should have been to achieve a continuous cycle of measurable improvement in public health, medical, and research outcomes, by prioritizing and directing national public health and medical research funding to where it could contribute the greatest value to:

 improving the long-term quality and sustainability of life;

 minimizing unnecessary suffering; and

 fulfilling unmet public health and medical research needs;

thereby enhancing national productivity and the economic sustainability of government-subsidized public health and medical research, treatment and care.


Ehealth is not a public health system. It is simply a tool for collecting, transferring, and accessing pieces of information.

You build or employ tools to make or achieve something (a primary mission supported by defined primary and secondary goals).

You then plan to achieve those goals through detailed strategic and operational unit planning, guided by what you value: e.g. privacy, security, integrity, transparency, quality, natural justice, customer satisfaction, productivity, sustainability, etc.

If you don't succinctly define exactly what you want to achieve through developing and using any tool, then you will go around in circles, which is what has confounded most ehealth implementations throughout the world.

Australia had a unique opportunity and we could have learned from the experience of others.

But no plan or strategy was developed to fully utilize this ehealth tool to maximum benefit for all Australians.

No population/public health data set was ever planned. When a request for submissions on draft PCEHR concept of operations went out, it asked submitters to list potential benefits???

No secondary use for de-identified data for health and medical research was included (thankfully, this is now on the table).

In consideration of everything I've read to date, I still believe consumers should be cautioned by those representing consumer interests, who have a voice or who are in a position to do so.

Anonymous said...

Great poll today - asking how well informed the discussion on eHealth, and particularly the PCEHR, is.

Spot the spin if you can... The biggest information security issue has not been addressed - although there is plenty of spin to mis-direct the uninformed. Government translation 101 - 'there is a focus on strong security in central systems' - read as - no focus on security in health sector systems.
Ask your doctor or allied health professional if they have been given Government help to get onto high-speed broadband connections - most have. Then ask them what they 'actually' know about the adequacy of the security on their computer systems - as this IS where the shared PCEHR data will be stored.
Most presume the systems are built to some sort of standards or requirements - they are not!!
And the scariest thing - given many in the sector 'need it to just work' - the software vendors will probably have PCEHR capability at a system level on by default. And let's not really believe the privacy requirements are being monitored (e.g. many pharmacy systems have automatically assumed all customers consent to electronic prescriptions - i.e. information passing to a third party that is NOT government).
Small business cannot be expected - make that SHOULD NOT be expected - to protect all of this information without more support from the Government that is making them a bigger and easier target for cyber attack.

Anonymous said...

And the pot called the kettle black: "there is plenty of spin to mis-direct uninformed". hmm, perhaps so. But this:

"the software vendors will probably have PCEHR capability at a system level on by default"

so? What use is the capability without the tokens? *They* don't come by default.

Anyway, suppose some bad guy penetrates a real clinical system. Are we really concerned about some non-functional pcEHR functionality?

On the more general question: yes, healthcare security is a terrible problem (this is true in the non-IT sense too, though). Is your solution for the government to shut the clinician's desktop down, to control its content? Really?

Anonymous said...

I think you are missing the point about security, anonymous @1800... where the security resides, or who controls it, is not the issue. The problem is that NeHTA and the Feds have trumpeted this brave new world without thinking it through, or apparently without taking enough advice - and we the taxpayers are hundreds of millions in the hole. Surely, with all this money available to be pissed up against the wall, the vendor community could have come up with some interoperability to provide the information to make healthcare safer and an actual continuum. Greed and ineptitude appear to have been the order of the day. Pink Batts, do I hear someone say?

Anonymous from @1800 said...

well, that's much better than the previous comment - no irrelevant spin or misinformation, just the heart of the matter.

Or maybe not:

"Surely with all this money available to be pissed up against the wall, the vendor community could have come up with some interoperability to provide the information to make healthcare safer and an actual continuum."

I do think we the vendors could have done it as well for that much money. Much better use of money - waste it on me! Now, how would that deal with the security problems?

Anonymous said...

Anonymous@1200 here
I would like to confirm that I do think eHealth initiatives have a great opportunity to assist and improve healthcare - and maybe increase efficiency.
Anon@1800 - there are a number of issues with having the PCEHR enabled by default, even if no tokens exist or no patients or doctors are using the functionality. I am an information security specialist - I could use such functionality to fingerprint a network/system, thereby knowing it was a rich source of identity information and worth a bit of extra effort. Also, functionality that is on by default (and particularly unused) is often where hacks/injects are performed (tokens are only needed if you want to use the functionality as designed).
It is understood that there are currently plenty of systems also in use; this, however, is a much more public launch (read as - an announcement to malicious groups), and it represents a consistent attack vector across different software platforms.
Lastly - there are a number of solutions/approaches that could enhance security significantly with little or no impact on the current business uses of computers by the SMB healthcare sector. And if they (the health sector) understood the impact on their patients of their current security approaches (the distress that occurs when identity theft happens), then they would probably be prepared to take some security steps that may require change - like not using admin accounts on doctors' desktop machines, not sharing passwords between staff, not installing software from the internet, ... AND importantly ...
ASK software vendors to provide assurance that they use secure development approaches and run development/systems that comply with security standards. I have found PLENTY that refer to security, but NONE that make any claims that seem to be more than a marketing brochure reference.

Agreed - the PCEHR is not creating the health sector security problem; it is just taking it further down the path. What has to be recognised is that the online risks, compared with about 2005 when the Government's Broadband for Health initiative handed out connection money, are now MUCH worse - by an order of magnitude. BUT the IT aspect of security in healthcare is more of a concern than the non-IT aspect, as the threat vector is GLOBAL and not someone with physical proximity to a healthcare facility.

What upsets me is that I believe eHealth has much to offer - I just don't want it to fail and create community mistrust that will take much longer to overcome.

Anonymous said...

Hello Anon@1200, this is Anon@1800

"there are a number of issues about having PCEHR enabled by default, even if no tokens exist or if no patients or doctors are using the functionality. I am an information security specialist - i could use such functionality to: fingerprint a network/system - thereby knowing it was a rich source of identity information, and worth a bit of extra effort; "

hmm. I'm not particularly an information security specialist, but I am a pcEHR specialist. So, some software has implementations of the (nearly) public specifications, and it's "on", whatever that means - public algorithms, but missing the keys to identify and activate the functionality. And this is useful to a hacker because...?

There's a lot of other much more useful stuff in a medical software package. It's not like security isn't a massive concern. And could the vendors charge their customers twice as much as they do now for "more secure" software? Where's the marketing advantage there? (None, right now, because the purchasers don't care.)

"ASK software vendors to provide assurance that they use secure development approaches and run development/systems that comply with security standards" - hell, how would they know? They try, but really, that's hard. If we were really concerned, we'd get together a project where software on the market got tested by skilled hackers, and the results published. But that won't happen until the government agrees to pay the purchasers twice as much for software that such a project passes.

An acceptable proxy would be for some doctor to get taken to the cleaners for running an insecure system.

But until either of those things happen, people talking about security are just flapping their lips :-(