This appeared last week:
Tech dangers lurk in home quarantine trials
John Davidson Columnist
Sep 21, 2021 – 5.00am
Home quarantine systems that use facial recognition and artificial intelligence to police whether citizens stay at home are a dangerous incursion on civil rights and should only be used with the utmost caution, privacy and AI ethics experts have warned.
Worse still, they do not work nearly as well as more conventional methods of home detention, and could endanger public health while allowing AI-based surveillance to make dangerous inroads into Australia, other experts say.
Last week, the NSW government said it would begin trials of AI-based facial recognition apps, in the hope of allowing returned travellers and airline staff to quarantine for COVID-19 at home, rather than in the state’s controversial hotel quarantine system, which is being wound down as vaccination levels increase.
The Victorian government has also said it will conduct home quarantine trials, for residents trapped by border closures with NSW.
It is unclear whether Australia’s two most populous states will use the government-built home quarantine app already being trialled in South Australia, or the app already in use in Western Australia, G2G Now, made by the small, Perth-based software company Genvis.
(Genvis itself added a privacy page to its G2G Now website, which states that “G2G Now is provided by NSW Police as part of the State of NSW’s response to the public health risk of COVID-19”.)
But regardless of which system is used, they are both dangerous because they both use facial recognition technology to ensure people are abiding by their home quarantine orders, technology ethicists and privacy advocates warn.
The apps require people to take selfies within minutes of being sent a request at random times during the day, and then they use AI and geolocation to determine whether the quarantined person was at home when the selfie was taken.
Dr Catriona Wallace, a leading expert in AI ethics, said facial recognition systems were well known to exhibit bias against women and non-Caucasians, so care needed to be taken to ensure the system did not unfairly flag people as being in breach of home quarantine orders just because it struggled to properly recognise their faces.
Another of “the many troubling questions” that arose from such technology was whether those photos could be used for anything other than home quarantine, she said.
“The facial recognition system picks up the whole image, it doesn’t just pick up the face,” Dr Wallace told The Australian Financial Review.
“What if I’m standing in my kitchen and there are some drugs on my counter, or there’s a book on terrorism that I’m reading. Or maybe, from my face, it looks like I’m high. Does the government take a note of that?
“There are real privacy concerns that arise when people are at home taking photos of themselves.”
As soon as you allow facial recognition to be used for any other purpose, or to be slightly expanded, we become accustomed to it.
— Professor Graham Greenleaf, privacy and law expert, UNSW
Whatever systems governments opted for, they needed to ensure that the software developers had proper training in the ethics of developing AI systems – training that was lacking in Australia, said Dr Wallace.
In the “Responsible AI Index” paper that she will release this week through her company, Ethical AI Advisory, only a “tiny fraction” of Australian companies get a passing grade.
“If we accept that this is a genuine emergency caused by COVID-19, then measures that we would not normally tolerate, that we would regard as an affront to our civil liberties, can be tolerated under certain circumstances,” said Professor Graham Greenleaf, a privacy and law expert at the University of NSW.
But the circumstances under which Australians might tolerate facial recognition and geolocation in their homes must include strict laws that prevented the apps or their data being used for any other purpose, and there must be a “sunset clause” that forced all data gathered by the apps to be destroyed the moment it was no longer needed, he said.
When a citizen had completed home quarantine without incident, all their records needed to be immediately scrubbed from the system.
“There also needs to be a legislative requirement that the whole thing be completely dismantled. Destroy the whole thing, once the COVID emergency is over,” Professor Greenleaf said.
“The big danger is, as soon as you allow facial recognition to be used for any other purpose, or to be slightly expanded, we become accustomed to it.
“Facial recognition is one thing we should never be accustomed to, at all.”
More here:
https://www.afr.com/technology/tech-dangers-lurk-in-home-quarantine-trials-20210919-p58t18
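To make the mechanics a bit more concrete, here is a rough sketch in Python of the check-in flow the article describes – random prompt times, a geofence around the registered address, and a face-match score. All of the names, thresholds and the face matcher itself are my assumptions; neither the SA trial app nor G2G Now publishes its code.

```python
import random
from dataclasses import dataclass
from datetime import datetime, timedelta
from math import radians, sin, cos, asin, sqrt

HOME_RADIUS_M = 50          # assumed geofence tolerance
RESPONSE_DEADLINE_MIN = 15  # "within minutes", per the article
MATCH_THRESHOLD = 0.8       # assumed face-match cut-off

@dataclass
class CheckIn:
    prompted_at: datetime
    responded_at: datetime
    lat: float
    lon: float
    face_match_score: float  # 0.0-1.0, from whatever face model is used

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def schedule_prompts(day_start: datetime, n: int = 3):
    """Pick n random prompt times across a 14-hour waking window."""
    return sorted(day_start + timedelta(minutes=random.randrange(14 * 60))
                  for _ in range(n))

def is_compliant(c: CheckIn, home_lat: float, home_lon: float) -> bool:
    """Pass only if the selfie was prompt, taken near home, and the face matched."""
    on_time = (c.responded_at - c.prompted_at) <= timedelta(minutes=RESPONSE_DEADLINE_MIN)
    at_home = haversine_m(c.lat, c.lon, home_lat, home_lon) <= HOME_RADIUS_M
    face_ok = c.face_match_score >= MATCH_THRESHOLD
    return on_time and at_home and face_ok
```

Note that MATCH_THRESHOLD is exactly where Dr Wallace’s bias concern bites: if the model systematically scores some faces lower, those people fail the check through no fault of their own.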
The article goes on to point out that AI-enabled facial recognition is also not all that reliable, and I have to wonder why fingerprint recognition is not preferred, as it is less intrusive and well understood.
That said, there are still issues with retaining data for any period of time. I can’t see why the period should be any more than 1-2 days, after which the data is simply deleted. All that is kept is a note that at date and time “x” you were confirmed as compliant and were where you should have been – no names, no location and no pack drill kept <grin>
Once the period of quarantine is over, all that is needed is a tiny record noting that the required period has been served – a sketch of what that might look like follows.
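Here is an equally hypothetical sketch of that data model: the raw selfie and location fix live only for the 1-2 day window, after which nothing but a bare pass/fail note survives, and completion collapses even those into a single “period served” record. The `store` object stands in for whatever persistence layer a real app would use.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

RAW_RETENTION = timedelta(days=2)  # the 1-2 day window suggested above

@dataclass
class ComplianceNote:
    """All that is kept per check-in: a timestamp and a pass/fail flag.
    No name, no photo, no location."""
    checked_at: datetime
    compliant: bool

def purge_raw_data(store, now: datetime) -> None:
    """Replace raw selfies/location fixes past the retention window with
    stripped-down ComplianceNotes, then delete the raw data."""
    for record in store.raw_records():  # hypothetical persistence API
        if now - record.checked_at > RAW_RETENTION:
            store.save_note(ComplianceNote(record.checked_at, record.compliant))
            store.delete_raw(record.id)

def close_out(store, quarantine_id: str, served_from: datetime, served_to: datetime) -> None:
    """On completion, drop every remaining note and keep one tiny record
    saying the required period was served."""
    store.delete_notes(quarantine_id)
    store.save_final_record(quarantine_id, served_from, served_to)
```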
The pandemic seems to have triggered a frenzy of Governments taking liberties with all sorts of personal data, and we really need to claw these rights and protections back as soon as possible. It becomes harder and harder to remember just how much of what we used to expect to be private has gone up in smoke on the altar of Government / Admin convenience!
What clever technological approaches do you suggest can keep track of quarantine observance while really protecting privacy?
David.
1 comment:
I suggest that the biggest problem could result from the change in behaviour of people who do not like the whole approach, e.g. people might avoid reporting symptoms and/or getting tested.