Ms Roxon’s intervention in the discussions on the PCEHR made me wonder if there was actually a risk analysis of the PCEHR Program.
See here for her comments, which I reported a day or so ago:
Reading the comments really made me wonder if the Minister - or her advisors - grasp the scale of risk associated with the proposed PCEHR project.
Any large public project carries substantial risk, and large IT projects - especially health IT projects - seem prone to nasty surprises and career-ending outcomes.
Among the larger ones that have come seriously unglued we can think of Kaiser Permanente's first attempt at a major update of its IT infrastructure - which led to a write-off in the many hundreds of millions of dollars - and the UK National Program for Health IT's experience, which is still playing out.
A very useful reference regarding all this is found here:
J Am Med Inform Assoc. 2009 May-Jun; 16(3): 291–299.
Health IT Success and Failure: Recommendations from Literature and an AMIA Workshop
Bonnie Kaplan, PhD and Kimberly D. Harris-Salamone, PhD
The full text is here:
Here is the Abstract:
Abstract
With the United States joining other countries in national efforts to reap the many benefits that use of health information technology can bring for health care quality and savings, sobering reports recall the complexity and difficulties of implementing even smaller-scale systems. Despite best practice research that identified success factors for health information technology projects, a majority, in some sense, still fail. Similar problems plague a variety of different kinds of applications, and have done so for many years. Ten AMIA working groups sponsored a workshop at the AMIA Fall 2006 Symposium. It was entitled “Avoiding The F-Word: IT Project Morbidity, Mortality, and Immortality” and focused on this under-addressed problem. Participants discussed communication, workflow, and quality; the complexity of information technology undertakings; the need to integrate all aspects of projects, work environments, and regulatory and policy requirements; and the difficulty of getting all the parts and participants in harmony. While recognizing that there still are technical issues related to functionality and interoperability, discussion affirmed the emerging consensus that problems are due to sociological, cultural, and financial issues, and hence are more managerial than technical. Participants drew on lessons from experience and research in identifying important issues, action items, and recommendations to address the following: what “success” and “failure” mean, what contributes to making successful or unsuccessful systems, how to use failure as an enhanced learning opportunity for continued improvement, how system successes or failures should be studied, and what AMIA should do to enhance opportunities for successes. The workshop laid out a research agenda and recommended action items, reflecting the conviction that AMIA members and AMIA as an organization can take a leadership role to make projects more practical and likely to succeed in health care settings.
----- End Quote
A key section is found here:
What We Know—Lessons from Experience
Participants drew lessons from their research and experiences on how management might improve project success. These included:
- provide incentives, remove disincentives
Users may perceive that they have no time, or that what they are being asked to do moves work to them and away from others. Physicians, for example, would be more engaged if they experienced applications that helped them directly rather than providing disincentives to adopt the system. As an incentive, for example, physicians could get rounds done more easily if patient lists were ready when shifts begin.
- identify and mitigate risks
Determine the social risks, the IT risks, the leadership risks, the user risks, etc, and consider them early and often during the project. These risks and possible ways to mitigate them should become part of new or existing policies and procedures pertaining to the new system and incorporated into training.
- allow resources and time for training, exposure, and learning to input data
Participants described systems where clinicians had never used a keyboard or had exposure to computers, yet training was very limited. Sufficient training and time to learn need to be part of the implementation, and need to be on-going afterward.
- learn from the past and from others
Participants spoke of the need for studies of successes, failures, and how failing situations were turned around. They suggested longitudinal studies, qualitative studies, more focus on health care teams as a whole, and incorporating insights from change management, diffusion of innovation and technology, social science and sociotechnical theory, and multilevel frameworks. Although participants suggested drawing on existing theories and knowledge and also incorporating project management and methodology issues, they advised caution when doing so because of differences between health care and the business settings where models were developed. There also were calls for measurable evidence, including evidence of publication bias concerning project failure, and for various databases to be created
----- End Quote
As I see it we have a system in the PCEHR which pretty much goes against all of this:
First, as noted in the last few days, the incentives to use the proposed PCEHR are distinctly lacking (see the News in yesterday's blog and various Ministerial comments).
Second, we note there is no 'risk assessment' or 'risk analysis' in the PCEHR Concept of Operations.
Third, we find the Program / Project being driven by a politically driven, rather than pragmatic, time frame for delivery.
Last, we note the PCEHR Concept of Operations spends only about 2-3 of its 160 pages on analysis of previous national and international projects - hardly an in-depth analysis.
As well as careful analysis of relevant past experience, any IT Project Risk Analysis needs to assess:
1. Strategic Risks - what is out there that can or may impact on the shape and delivery of the project?
2. Budgetary, Cost and Resourcing Risks - is the budget allocation secure and adequate - with a contingency - to deliver the program?
3. Management Risks - are those charged with delivery of the program experienced and capable enough to deliver it? In this case it might also be termed execution risk.
4. Timing Risks - is the project plan and proposed timing structured in such a way as to make delivery reasonably achievable? Have the sequence of activities and their duration been properly planned to make delivery possible?
5. Technical Risks - are the technologies, Standards etc. proven and known to work, and will the desired outcome be possible from a technical perspective?
6. Cultural Risks - is what is planned suitable for use in the work environment for which it is intended?
7. Security and Privacy Risks - risks that any IT project needs to address, but most especially Health IT projects.
8. Sustainability Risks - what are the plans for continuing support of the outcomes of the project?
It seems to me the ConOps should offer an analysis of all these, as well as a few pages on risk mitigation, starting with at least this list. Of course, all this should have been done before we started off on this journey.
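To make the point concrete, the eight risk categories above can be captured in the kind of simple risk register a project of this scale would normally maintain. This is a minimal sketch only - the entries, scores and scoring scheme below are illustrative and do not come from any actual PCEHR risk assessment.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    category: str       # e.g. "Strategic", "Technical", "Timing"
    description: str
    likelihood: int     # 1 (rare) .. 5 (almost certain)
    impact: int         # 1 (minor) .. 5 (severe)
    mitigation: str

    def score(self) -> int:
        # Simple likelihood x impact scoring, as used in many risk matrices
        return self.likelihood * self.impact

# Illustrative entries only - not an actual PCEHR risk assessment
register = [
    Risk("Timing", "Politically driven delivery date", 4, 4,
         "Re-baseline the plan against realistic estimates"),
    Risk("Cultural", "Clinicians reject workflow changes", 3, 5,
         "Early clinician engagement and ongoing training"),
    Risk("Technical", "Unproven standards fail to interoperate", 2, 4,
         "Pilot key standards before national rollout"),
]

# Review the highest-scoring risks first, and revisit the list regularly
for risk in sorted(register, key=lambda r: r.score(), reverse=True):
    print(f"{risk.category}: {risk.description} (score {risk.score()})")
```

The value of a register like this is less the scoring arithmetic than the discipline: each risk is named, owned by a mitigation, and revisited "early and often", as the AMIA workshop participants recommend.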
David.
3 comments:
Following up on David's risk issues, I suggest that one of the most fundamental risks is that there is an incomplete and/or incorrect understanding of what "Personally Controlled" actually means.
This is going to cause all sorts of problems in the community and will negatively impact the acceptance of the system.
In the ConOps it states that:
Start Quote
1.2 The PCEHR System
The national PCEHR System places the individual at the centre of their own healthcare by enabling access to important health information, when and where it is needed, by individuals and their healthcare providers.
Individuals can choose whether or not to have a PCEHR. If they choose to participate, they will be able to set their own access controls. With the individual’s permission, key pieces of health information may be viewed by participating healthcare providers across different locations and healthcare settings.
End Quote
What the ConOps and the Legislation Issues Paper do not define is the nature and extent of the "control" that an individual has over their record.
It seems to be limited to
a) whether to have an eHR or not
b) a small set of access controls, some of which are quite complex to set up and operate (Limited Access)
The "Celebrity" Kludge is an indicator that a band-aid/fire-fighting mentality is in operation in NEHTA.
IMHO the granularity of the access controls is nowhere near sufficient to meet the needs of a complex society such as we have in Australia.
I haven't done any serious analysis of the range of access controls that Australian citizens, as a group, may want over their health information, but here are a few situations:
- A female may want to specify that only females can see her eHealth Record.
- A person of a particular religious faith may require that only people of that faith can see their eHealth Record.
- A person of a particular religious faith may require that people of a different, but specific, faith cannot see their eHealth Record.
The proposed PCEHR system seems to be treating the Australian population as a relatively homogeneous group. This simplistic assumption may well come back to bite NEHTA.
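The scenarios above might be expressed as composable per-patient predicates, which makes the scale of the problem visible. This is a purely hypothetical sketch - none of these attribute names or rules come from the PCEHR specification.

```python
from typing import Callable

# A provider's attributes, e.g. {"gender": "F", "faith": "X", "role": "GP"}
# All attribute names here are hypothetical, not from the ConOps.
Provider = dict
Rule = Callable[[Provider], bool]

def can_view(provider: Provider, rules: list[Rule]) -> bool:
    # Every patient-set rule must pass before the record is shown
    return all(rule(provider) for rule in rules)

# Rules mirroring the scenarios above
only_females: Rule = lambda p: p.get("gender") == "F"
exclude_faith_y: Rule = lambda p: p.get("faith") != "Y"

patient_rules = [only_females, exclude_faith_y]

print(can_view({"gender": "F", "faith": "X"}, patient_rules))  # True
print(can_view({"gender": "M", "faith": "X"}, patient_rules))  # False
```

Even these two rules require the system to hold verified gender and faith attributes for every provider in the country - a hint of how quickly the supporting data and governance requirements grow once fine-grained control is promised.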
If we pander down to that level (only someone wearing a red hat on a Tuesday can view my notes) we are doomed.
This is why I would rather the Govt made frameworks, supported standards and made it easier for people to access their information from the Govt Hospital systems.
You want a copy of your d/summ? Here it is electronically - we will forward it to your GP or to you, or will upload it to a 3rd party company you nominate.
Let the private sector provide the solution.
Napolean
@Napolean and re:
"If we pander down to that level (only someone wearing a red hat on a Tuesday can view my notes) we are doomed"
Pander is a strange word to use in this context.
IMHO, a social system like eHealth should reflect community values and needs. Taking a "we know what is best for people and they can lump it or leave it" attitude does not resonate with the Australia with which I am familiar.
However, doomed is a good word to use, but I suspect we have different reasons for saying so.