Quote Of The Year

Timeless Quotes - Sadly The Late Paul Shetler - "It's not Your Health Record, it's a Government Record Of Your Health Information"

or

H. L. Mencken - "For every complex problem there is an answer that is clear, simple, and wrong."

Wednesday, March 22, 2017

It Is Very Important To Make Sure Data Mining Of Patient Records Is Properly Managed.

Here is a saga that has only just started to unfold and be revealed.

DeepMind's first deal with the NHS has been torn apart in a new academic study

  • Mar. 16, 2017, 8:07 AM
A data-sharing deal between Google DeepMind and the Royal Free London NHS Foundation Trust was riddled with "inexcusable" mistakes, according to an academic paper published on Thursday.
The "Google DeepMind and healthcare in an age of algorithms" paper — coauthored by Cambridge University's Julia Powles and The Economist's Hal Hodson — questions why DeepMind was given permission to process millions of NHS patient records so easily and without patient approval.
"There remain many ongoing issues and it was important to document how the deal was set up, how it played out in public, and to try to caution against another deal from happening in this way in the future," Powles told Business Insider in Berlin the day before the paper was published.
DeepMind and Royal Free say that the study "completely misrepresents the reality of how the NHS uses technology to process data" and that it contains "significant mistakes."
Powles and Hodson said the accusations of misrepresentation and factual inaccuracy were unsubstantiated, and invited the parties to respond on the record in an open forum.
DeepMind has used the data access it was given to create a mobile app called Streams, which was initially designed to help clinicians monitor patients with acute kidney injury (AKI).
Powles, a research associate in law and computer science at St John's College, Cambridge, and Hodson, a technology correspondent for The Economist, argue in the paper published in the Health and Technology journal that the terms of the initial deal (a subsequent one has been made) were "highly questionable" and that they lacked transparency.
DeepMind has tried to defend the deal by saying that it's providing something known in the healthcare industry as "direct care," which assumes that an identifiable individual has given implied consent for their information to be shared for uses that involve the prevention, investigation, or treatment of illness.
"The specific problems are they had access to the data of every patient in the hospital on the legal basis that they were providing direct care to every patient," said Powles. "We think that's problematic given that they only ever asserted that they were interested in helping patients with acute kidney injury. They've since pivoted to look at a bunch of conditions but they haven't addressed the gap in the initial deal between what the purpose was and why they had the access they did."
She added: "If they'd had a small sample set with appropriate consents that proved the results, that showed that this app was working, and then they engaged patients and said we're going to roll it out on these terms, for this period and with this amount of money passing hands, then it would be a totally different game."
Powles believes that the case should be considered a "cautionary tale" for the NHS and other public institutions that are looking to work with innovative tech firms.
Vastly more is found here:
This is a hard one for me. Clearly the proponents of the program think they are doing valuable and useful work, while the authors of the report have a range of concerns and issues.
It seems to me the friction is likely due to a failure of governance and planning before the work began, and in that there is certainly a lesson. We are clearly going to have more discussions like this as AI and data mining come together, so we all need to be alert to the risk of problems and issues.
The article is well worth a read for the various nuances.
David.

5 comments:

Anonymous said...

https://www.lightbluetouchpaper.org/2014/02/04/untrue-claims-by-nhs-it-chief/

Anonymous said...

I am getting more concerned as every day goes by about this and a few other appointments. I sense we are close to an irreversible stuff-up. I hope Mr Hunt and Mr Bowles know what they are doing; there are some good people there and I hope they don't get scarred or sacrificed.

Anonymous said...

5:46 AM In agreement: we can change PMs without blinking, so maybe we should adopt the same with CEOs. A good innings, seen the country, time for someone with real depth.

Anonymous said...

And from Digital Health News UK, today:

Care.data, the now defunct NHS patient data sharing scheme, casts a long shadow.

Even when not acknowledged, the fear that the scheme fostered, that the NHS was using, perhaps even selling, our medical data without our say-so, persists.

That fear erupted again this month, as reports emerged that the Information Commissioner's Office has raised "data protection compliance concerns" about TPP's SystmOne.

SystmOne is the second most popular GP electronic record, used by 2,700 practices and holding the records of millions of patients. It's important that the information held is secure. The ICO hasn't expanded much on the nature of these concerns. They relate to the system's "data sharing function" and whether it holds patient data securely and processes it in a "fair and lawful" way.

Everything else is hotly contested.

Media reports have claimed that the sharing function allows “thousands of strangers” to look at your medical records, providing they have a log-in to TPP.

The "breach" was described as "truly devastating" by MedConfidential, and as raising "serious issues with potentially huge implications" by GP IT leader Dr Paul Cundy.

But TPP is adamant that it is all much ado about nothing.

The company points out that sharing has been turned on since 2012, and was rolled out with the full blessing of Connecting for Health and input from the BMA and RCP. Except in emergencies, patients must give consent for their records to be viewed and, if records are viewed inappropriately, there is a full audit trail to catch the culprit. So far there has not been one patient complaint about the system, the company says. Oh, and lives have been saved.

The truth is probably somewhere in between.

NHS leaders, to neutralise concerns about a care.data-like scheme, are moving to a more regional patient data sharing model. TPP's sharing function, by contrast, is nationwide and proudly so. There is also a move towards building an architecture for data sharing where it is not technically possible to see or change a patient's data without the right permissions (often supported by excitable slides about blockchains and distributed ledgers).

Like many other data sharing schemes, TPP relies instead on information governance and monitoring to make sure the right people look at the right file.
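The two models being contrasted here can be made concrete with a small, hypothetical Python sketch (none of the names or structures below come from TPP, SystmOne, or the NHS; it simply illustrates enforcing permissions at the point of access versus allowing access and relying on an audit trail for later review):

```python
# Hypothetical sketch only -- not TPP/NHS code. Contrasts the two sharing
# models described above: hard permission enforcement at the point of access
# versus open access backed by an audit trail reviewed after the fact.

from datetime import datetime, timezone


class PermissionDenied(Exception):
    """Raised when a clinician lacks consent or an emergency justification."""


# Model 1: access is technically impossible without the right permissions.
def fetch_record_enforced(record_store, consents, patient_id, clinician_id,
                          emergency=False):
    """Return a record only if consent exists or an emergency is declared."""
    if not emergency and clinician_id not in consents.get(patient_id, set()):
        raise PermissionDenied(f"{clinician_id} has no consent for {patient_id}")
    return record_store[patient_id]


# Model 2: access is permitted, but every view is written to an audit log
# so that inappropriate viewing can be detected and investigated later.
def fetch_record_audited(record_store, audit_log, patient_id, clinician_id,
                         reason=""):
    """Return a record and append an audit entry describing the access."""
    audit_log.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "patient": patient_id,
        "clinician": clinician_id,
        "reason": reason,
    })
    return record_store[patient_id]


if __name__ == "__main__":
    records = {"patient-001": {"name": "Example Patient", "aki_risk": "low"}}
    consents = {"patient-001": {"dr-smith"}}
    audit_log = []

    # Enforced model: dr-jones is blocked without consent or an emergency.
    try:
        fetch_record_enforced(records, consents, "patient-001", "dr-jones")
    except PermissionDenied as err:
        print("blocked:", err)

    # Audited model: the same access succeeds, but leaves a reviewable trace.
    fetch_record_audited(records, audit_log, "patient-001", "dr-jones",
                         reason="A&E attendance")
    print("audit entries:", len(audit_log))
```

In the first model the system itself refuses the request; in the second, the governance question is whether anyone actually reviews the log.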

The question now becomes whether that’s enough.

The company could, with some legitimacy, argue that regulatory goal posts have been shifted. That its set-up was good enough in 2012, and nothing has gone wrong since (and did we mention the lives saved?).

Certainly, even critics of the scheme are not advising GPs to turn off the sharing function. It really does save lives.

However, TPP’s position ignores the heightened sensitivity to the handling of patient data since the care.data fiasco.

The public's trust in the NHS, and by extension its IT systems, as a reliable custodian of their health data has been eroded.

Everyone has a part to play in restoring it.

Anonymous said...

So there have been mistakes in the past everywhere; I am sure they have all learned from these minor errors of judgement. I think we just need some more time before we judge. After all, SMD is the answer to the IoT in healthcare.