Friday, August 27, 2021

From All We Read We Have A Similar Problem With Digital Health Apps Here In OZ.

This appeared last week:

The privacy problem with health-related apps is linked to insecure coding

In his next column for Digital Health, Davey Winder explores the privacy issues surrounding health-related apps.

Davey Winder 17 August, 2021

A study published in the British Medical Journal has served to confirm an inconvenient truth: mobile health apps may not be as private as you think. I’m not convinced that’s the biggest issue with mobile health apps, truth be told.

Only 47% of apps analysed complied with their own privacy policy

The cross-sectional study, authored by Gioacchino Tangari, Muhammad Ikram, Kiran Ijaz, Mohamed Ali Kaafar and Shlomo Berkovsky, set itself the objective of analysing what user data is collected by health and fitness related apps on Google Play, and thus revealing any associated risks to privacy.

The researchers performed their in-depth analysis on a total of 15,838 global apps from the Australian store, with 8,468 non-health apps used for the baseline comparison. Of these, the vast majority (88%) were using either cookies or some form of tracking identifier relating to user activity, and 28% of the apps didn’t have any privacy policy. Of those that did, only 47% of the apps complied with that policy.

What sort of data are we talking about here? The usual device identifiers and cookies plus contact information, mostly. The kind of thing that’s used for tracking and profiling by advertisers, in other words. The researchers concluded that, when compared with the baseline, the mobile health apps “included fewer data collection operations in their code, transmitted fewer user data, and showed a reduced penetration of third party services.” Which is good news. Digging into the data further, it became clear that health and fitness apps were more data-hungry than medical apps and more likely to share this data, with “integration of adverts and tracking services” more pronounced.

Most users are ill-equipped to make informed choices

Tom Davison, the technical director at mobile device security specialists Lookout, says that while apps do make use of the “robust permissions models” provided by both Apple and Google, “in order to use an app, users effectively have no choice but to accept permissions and agree to terms and conditions.”

This is as it’s always been, of course, and the decision is ultimately that of the user. But is that decision based on an understanding of the choices offered? Davison argues that the “awareness of users about how they are trading data for functionality remains woefully low.”

I’m inclined to agree, historically speaking, but the privacy labels introduced by Apple for iPhone and iPad users, at least, have gone some way to bringing clarity: they show what data is collected, whether it is used to track you, and whether it is linked to your identity. These labels give users the opportunity to opt for a less intrusive app before downloading. Android users are still waiting for this transparency nod, with apps on the Google Play store requiring the user to click through links to see the details.

Then there are the cookie notices when you start using an app or visit a site which are a different kettle of fishy smells altogether. Most are so convoluted in their nature that far from clarifying anything they almost seem, and I’m shocked I tell you, designed to direct the user to click ‘accept all’ and move on.

“Most users are not equipped or prepared to sift through the legalese to fully understand the trade-offs,” Davison says, “and other than by reading these lengthy privacy policies, users have very few ways to validate how apps access, store, transmit, secure or share data.”

A Google spokesperson told The Guardian newspaper, “Google Play developer policies are designed to protect users and keep them safe. When violations are found, we take action. We are reviewing the report.”

Privacy policies are the least of your mobile health app worries

OK, I lied: I’m not shocked at all about seeming attempts to obfuscate the whole data collection and usage process when it comes to health-related apps. I’m not actually convinced this is the biggest problem faced by users of them either, and here’s why.

That same study concluded that 23% of user data was transmitted using insecure communication protocols: HTTP rather than HTTPS. That’s the first cybersecurity red flag for me. Others come from an earlier report, published by Which? at the start of the year. This also looked at health and fitness apps and services, but from a security as well as privacy perspective.
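The HTTP-versus-HTTPS distinction matters because anything sent over plain HTTP is readable by whoever sits on the network path. A minimal sketch of the kind of check involved, using hypothetical placeholder URLs (not endpoints from the study), might look like this:

```python
# Flag endpoints that would transmit data over plain HTTP (unencrypted)
# rather than HTTPS. The URLs below are illustrative placeholders only.
from urllib.parse import urlparse

def insecure_endpoints(urls):
    """Return the subset of URLs whose scheme is plain http."""
    return [u for u in urls if urlparse(u).scheme == "http"]

endpoints = [
    "https://api.example-health.test/v1/steps",
    "http://tracker.example-ads.test/collect",  # plain HTTP: visible in transit
]
print(insecure_endpoints(endpoints))  # only the http:// URL is flagged
```

Anyone intercepting traffic to the flagged endpoint would see the payload in the clear, which is precisely why that 23% figure is alarming for health data.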

The Which? investigation found everything from apps that allowed the weakest of passwords, to passwords stored unencrypted on the device itself, to “more cookies than a bakery” in many cases, and uncertainty amongst lawyers as to whether they were General Data Protection Regulation (GDPR) compliant, at least in spirit. If you thought the red flags were flying already, it gets worse.
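Storing a password unencrypted on the device is an unforced error: the standard practice is to store only a salted, slow hash and compare against it at login. A minimal sketch of that approach (illustrative only, not code from any of the apps in the Which? report):

```python
# Store a salted PBKDF2 hash of the password instead of the plain text.
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Return (salt, digest) for the given password."""
    salt = salt or os.urandom(16)  # fresh random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the hash and compare in constant time."""
    return hmac.compare_digest(hash_password(password, salt)[1], digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

With this scheme, even an attacker who reads the stored salt and digest off the device cannot recover the password without a brute-force attack against a deliberately slow hash.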

Insecure APIs at the heart of the problem

Alissa Knight, a well-respected security researcher and industry analyst, has authored a report, published by mobile security specialists Approov, that exposed application programming interface (API) hacking risks in all 30 of the popular mobile health apps investigated. Thirty apps that, the report suggests, have exposed more than 20 million users to potential attacks from cybercriminals.

More details here:

https://www.digitalhealth.net/2021/08/the-privacy-problem-with-health-related-apps-is-linked-to-insecure-coding/

It is fair to say none of this is a pretty sight. All we poor users can do is ask plenty of questions and avoid apps that appear in any way dodgy or suspicious.

Anyone else have some other ideas for the unsuspecting user?

David.

 
