Published 1 September 2016
On 7 June 2016, the Commission's final draft code of conduct on privacy for mobile health applications (mHealth apps) (the "Code") was submitted to the Article 29 Working Party ("WP29") for approval.
The Code is aimed at app developers rather than programmers or app stores as app developers are responsible for determining the extent to which personal data will be processed. The Code has been drafted to ensure compliance with both the Data Protection Directive (Directive) and the GDPR.
Is it sensitive personal data?
The Code attempts to illustrate where data collected by mHealth apps might be mere lifestyle data or might constitute sensitive personal data (and therefore subject to more stringent requirements under the Directive and the GDPR). In some scenarios it will be clear when sensitive personal data is being processed. The example given in the Code is where an app allows a user to track whether prescribed medication has been taken in accordance with medical advice. This is clearly a case of processing of sensitive personal data as it provides information on the health of the user.
The number of steps recorded by a wearable device which is then collated in an app would not, itself, constitute sensitive personal data. This is merely lifestyle data. However, if that data is used to predict health risks then the mHealth app would be processing sensitive personal data.
In the age of Big Data and the Internet of Things, we are increasingly seeing insurance products that are linked to lifestyle. For example, some life insurance products offer benefits for, say, going to the gym or eating well. In these circumstances, the data relating to gym visits and eating your five-a-day would constitute more than lifestyle data. The insurer is making an assumption that the user's health is improved based on the lifestyle data input (and rewarding them for that improvement). This data would therefore constitute sensitive personal data.
To determine whether sensitive personal data is being processed, app developers should be considering not only what is collected but also what it is combined with and what it is ultimately used for.
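As a rough illustration of the "what is collected, what it is combined with, and what it is used for" test described above, the reasoning might be sketched as a toy decision function. This is a hypothetical sketch for illustration only, not legal advice and not part of the Code; the category names (`HEALTH_DATA`, `HEALTH_PURPOSES`) and examples are our own assumptions:

```python
# Hypothetical sketch of the sensitivity test discussed above.
# Category names and rules are illustrative assumptions, not
# definitions taken from the Code or the Directive.

def is_sensitive(data_collected: set, combined_with: set, purposes: set) -> bool:
    """Return True if the processing would likely involve sensitive
    (health-related) personal data under the reasoning above."""
    HEALTH_DATA = {"medication_log", "heart_rate", "diagnosis"}
    HEALTH_PURPOSES = {"predict_health_risk", "assess_health_for_insurance"}

    all_data = data_collected | combined_with
    # Directly health-related data is sensitive regardless of purpose.
    if all_data & HEALTH_DATA:
        return True
    # Mere lifestyle data becomes sensitive once it is used to draw
    # conclusions about health (e.g. the step-count example above).
    if purposes & HEALTH_PURPOSES:
        return True
    return False

# Step counts alone: lifestyle data only.
print(is_sensitive({"step_count"}, set(), {"show_activity_summary"}))  # False
# The same data used to predict health risks: sensitive personal data.
print(is_sensitive({"step_count"}, set(), {"predict_health_risk"}))    # True
```

The point of the sketch is simply that the classification turns on purpose and combination as much as on the raw data collected.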
Practical Guidelines
The Code goes on to give 11 practical guidelines for the app developer, each framed as a question.
For privacy professionals, the answers to these questions should be relatively clear. The ICO's words in respect of Big Data projects apply equally here: the processing of personal data in an mHealth app is 'not a game that is played by different rules'.1
However, we have looked at the advice given under two of these headings in a little more detail.
Practical Guidelines - how should I obtain consent of the users of my app?
The Code provides some guidance as to how the consent of users can be obtained. As the requirements for consent under the GDPR are more stringent than under the Directive, particular consideration should be given to this area to ensure apps are GDPR compliant.
Where sensitive personal data is being processed, consent must be explicit, i.e. it requires a clear and unambiguous action from the user (e.g. accepting the terms and conditions before being able to use the app, and again every time the consent requirements change).
The Code's advice on when consent must be given is contradictory. On the one hand, consent should be given before or as soon as users install the app; yet the Code suggests that privacy notices (where the request for consent is likely to be set out) should be given before installation. The Code then goes on to state that consent can be obtained at various times during use of the app.
Practically speaking, consent should be obtained before any personal data for which consent is required is processed. In most cases, this is likely to be prior to installation. However, consideration should be given to whether fresh or additional consent should be obtained where users use new functionality and there is a risk they may not be aware of how sensitive personal data will be processed. Where an mHealth app is a "paid for" app, ensure consent is obtained before payment is made; alternatively, users should be offered a full refund where consent to processing is not given. Under the GDPR, consent must be as easy to withdraw as to give. Usually, this will be through deletion of the app from the user's device. Ensure that where the app has been deleted, no personal information is retained.
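The consent lifecycle just described (obtain explicit consent before processing, allow withdrawal as easily as granting, and retain nothing after the app is deleted) might be sketched as follows. This is a minimal illustrative sketch; the class and method names are our own assumptions, not terminology from the Code or the GDPR:

```python
# Illustrative sketch of the consent lifecycle described above.
# All names are hypothetical; real implementations will also need
# audit trails, versioned consent text, and secure storage.

class ConsentManager:
    def __init__(self):
        self._consents = {}       # user_id -> set of consented purposes
        self._personal_data = {}  # user_id -> retained personal data

    def grant(self, user_id, purpose):
        # A clear, affirmative action by the user records consent.
        self._consents.setdefault(user_id, set()).add(purpose)

    def withdraw(self, user_id, purpose):
        # Withdrawal must be as easy as granting (GDPR).
        self._consents.get(user_id, set()).discard(purpose)

    def may_process(self, user_id, purpose):
        # No processing without prior, explicit consent for that purpose.
        return purpose in self._consents.get(user_id, set())

    def on_app_deleted(self, user_id):
        # Deleting the app removes both consents and retained data,
        # so no personal information survives the deletion.
        self._consents.pop(user_id, None)
        self._personal_data.pop(user_id, None)
```

The key design point mirrored from the text is that `may_process` is checked before processing, and that a new purpose (new functionality, a new use of the data) requires a fresh `grant` rather than relying on the original consent.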
The Code goes on to state that for consent to be valid, users need to have been provided with clear and comprehensible information first which shall not be 'embedded in lengthy legal text'. This requirement can be met by use of a layered approach to privacy notices. This is explored further below.
Where the insurance industry is providing mHealth apps at the request of customers as a 'freebie' or other add-on, our experience suggests that the data collected in an app is not used for anything other than provision of the features of the app. If this intention changes because a data scientist sees the untapped potential in the data collected, make sure that fresh consent is obtained before any new use is made of a user's personal data.
An example of a condensed notice with a long-form policy is given in the annex to the Code. Note, however, that the example given relates to an app with a narrowly defined purpose. In practice, the type of mHealth app currently being made available in the insurance industry will have a multitude of functions, so its condensed privacy notice may well run to more than two paragraphs.
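A layered privacy notice of the kind discussed above can be thought of as a short first layer shown in-app, with each item linking through to the relevant section of the full policy. The sketch below is purely illustrative; the field names, company name, and URL are placeholders of our own, not content from the Code's annex:

```python
# Hypothetical sketch of a layered privacy notice: a condensed first
# layer rendered in-app, backed by a link to the full long-form policy.
# All field names and values are illustrative placeholders.

condensed_notice = {
    "who_we_are": "ExampleHealth Ltd (data controller)",   # placeholder name
    "what_we_collect": "step counts, sleep data",
    "why_we_collect_it": "to show your activity trends",
    "full_policy_url": "https://example.com/privacy",      # placeholder URL
}

def render_condensed(notice: dict) -> str:
    """Render the short first layer; detail sits behind the policy link."""
    lines = [f"{key.replace('_', ' ').title()}: {value}"
             for key, value in notice.items()]
    return "\n".join(lines)

print(render_condensed(condensed_notice))
```

The layering is what lets the first screen stay clear and comprehensible, rather than 'embedded in lengthy legal text', while the long-form policy remains one tap away.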
Governance and Enforcement
Once the Code has been approved, compliance with and enforcement of the Code will be carried out by a monitoring body. The monitoring body will also implement an alternative dispute resolution and complaint handling process through which data subjects can lodge complaints against app developers, and may make a trust mark available which can be applied to apps on the register.
App developers can publicly declare compliance with the Code by completing a privacy impact assessment (PIA) (a form of which is annexed to the Code) and making a self-declaration to the monitoring body. The PIA and self-declaration will be reviewed by the monitoring body and those apps that have been approved will be entered onto a public register maintained by the monitoring body.
App developers will be required to ensure the declaration remains accurate and will be subject to random checks by the monitoring body. This allows the monitoring body to be proactive in ensuring adherence to the Code, rather than merely reacting to data subject complaints.
App developers found to be in breach of the Code will be removed from the register and will be required to remove any trust mark. Serious breaches would be subject to enforcement action under the Directive (or GDPR).
Recommendations
As this is the final draft code, no further major changes are expected. App developers would be well advised to ensure that their mHealth apps adhere to the Code.
1 Information Commissioner's Office report: 'Big Data and Data Protection', 28 July 2014