European Commission publishes final draft of a code of conduct on privacy of health mobile applications - DAC Beachcroft


Published 1 September 2016

On 7 June 2016, the Commission's final draft code of conduct on privacy for mobile health applications (mHealth apps) (Code) was submitted to the Article 29 Working Party ("WP29") for approval.

The Code is aimed at app developers rather than programmers or app stores as app developers are responsible for determining the extent to which personal data will be processed. The Code has been drafted to ensure compliance with both the Data Protection Directive (Directive) and the GDPR.

Is it sensitive personal data?

The Code attempts to illustrate where data collected by mHealth apps might be mere lifestyle data or might constitute sensitive personal data (and therefore subject to more stringent requirements under the Directive and the GDPR). In some scenarios it will be clear when sensitive personal data is being processed. The example given in the Code is where an app allows a user to track whether prescribed medication has been taken in accordance with medical advice. This is clearly a case of processing of sensitive personal data as it provides information on the health of the user. 

The number of steps recorded by a wearable device which is then collated in an app would not, itself, constitute sensitive personal data. This is merely lifestyle data. However, if that data is used to predict health risks then the mHealth app would be processing sensitive personal data. 

In the age of Big Data and the Internet of Things, we are increasingly seeing insurance products that are linked to lifestyle. For example, some life insurance products offer benefits for, say, going to the gym or eating well. In these circumstances, the data relating to gym visits and eating your five-a-day would constitute more than lifestyle data. The insurer is making an assumption that the user's health is improved based on the lifestyle data input (and rewarding them for that improvement). This data would therefore constitute sensitive personal data. 

To determine whether sensitive personal data is being processed, app developers should be considering not only what is collected but also what it is combined with and what it is ultimately used for. 
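That three-part test (what is collected, what it is combined with, what it is used for) can be sketched as a simple decision rule. This is purely illustrative; the Code prescribes no algorithm, and the category names and function below are hypothetical.

```python
# Hypothetical sketch of the Code's sensitivity test. Category names are
# illustrative assumptions, not terms defined in the Code.

LIFESTYLE_ONLY = {"step_count", "gym_visits", "diet_log"}

def is_sensitive(categories, used_for_health_inference):
    """Data crosses into sensitive personal data when health information
    is collected directly, or when lifestyle data is combined with other
    data or used to draw conclusions about the user's health."""
    collects_health_data = any(c not in LIFESTYLE_ONLY for c in categories)
    return collects_health_data or used_for_health_inference

# Raw step counts alone: mere lifestyle data.
assert not is_sensitive({"step_count"}, used_for_health_inference=False)
# The same step counts used to predict health risks: sensitive personal data.
assert is_sensitive({"step_count"}, used_for_health_inference=True)
# A medication-tracking log is health data from the outset.
assert is_sensitive({"medication_log"}, used_for_health_inference=False)
```

The point of the sketch is that sensitivity is a property of purpose as much as of the raw data collected.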

Practical Guidelines
The Code goes on to give 11 practical guidelines for the app developer under the following headings:

  • How should I obtain consent of the users of my app?
  • What are the main principles that I must respect before making an mHealth app available?
  • What information shall I provide to the users before they can use my app?
  • How long can I keep data?
  • Do I have to implement any security measures?
  • Can I show advertisements in an mHealth app?
  • Can I use personal data collected via my mHealth app for secondary purposes e.g. for 'big data' analysis?
  • What shall I do prior to disclosing data to a third party for processing operations?
  • Where can I transfer gathered data to?
  • What shall I do if there is a personal data breach?
  • How shall I treat any data gathered from children?

For privacy professionals, the answers to these questions should be relatively clear. The ICO's words in respect of Big Data projects apply equally here: the processing of personal data in an mHealth app is 'not a game that is played by different rules'1.

However, we have looked at the advice given under two of these headings in a little more detail.

Practical Guidelines - how should I obtain consent of the users of my app?

The Code provides some guidance as to how consent of users can be obtained. As the requirements for consent under the GDPR are more stringent than under the Directive, particular consideration should be given to this area to ensure apps are GDPR compliant.

Where sensitive personal data is being processed, consent must be explicit, i.e. it requires a clear and unambiguous action from the user, e.g. accepting the terms and conditions before being able to use the app, and again every time the consent requirement changes.

The Code's advice on when consent must be given is contradictory. On the one hand, consent should be given prior to, or as soon as, users install the app; on the other, the Code suggests that privacy notices (where the request for consent is likely to be set out) should be given before installation. The Code goes on to state that consent can be obtained at various times during use of the app.

Practically speaking, consent should be obtained before personal data is processed (where consent is required for that processing). In most cases, this is likely to be prior to installation. However, consideration should be given to whether fresh or additional consent should be obtained where users use new functionality and there is a risk they may not be aware of how sensitive personal data will be processed. Where an mHealth app is a "paid for" app, ensure consent is obtained before payment is made; alternatively, users should be offered a full refund where consent to processing is not given. Under the GDPR, consent must be as easy to withdraw as to give. Usually, this will be through deletion of the app from the user's device. Ensure that where the app has been deleted, no personal information is retained.
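The mechanics described above (purpose-specific consent before processing, withdrawal as easy as giving, and no retention after deletion) can be sketched as follows. The class and function names are hypothetical and not drawn from the Code or any real SDK; this is a minimal sketch of the gating pattern, not a compliance implementation.

```python
# Hypothetical consent-gating sketch. Purpose names and the PermissionError
# convention are illustrative assumptions.

class ConsentStore:
    def __init__(self):
        self._granted = {}  # purpose -> bool

    def grant(self, purpose):
        # Record an explicit, unambiguous user action per processing purpose.
        self._granted[purpose] = True

    def withdraw(self, purpose):
        # Withdrawal must be as easy as giving consent (GDPR).
        self._granted[purpose] = False

    def allows(self, purpose):
        return self._granted.get(purpose, False)


def record_medication_taken(store, user_data, entry):
    # Gate every processing operation on current, purpose-specific consent.
    if not store.allows("medication_tracking"):
        raise PermissionError("consent not given for medication tracking")
    user_data.append(entry)


def delete_app(store, user_data):
    # On deletion, retain no personal data and treat consent as withdrawn.
    user_data.clear()
    store.withdraw("medication_tracking")
```

The design choice worth noting is that consent is keyed by purpose rather than stored as a single flag, so adding new functionality (a new purpose) naturally forces a fresh consent request.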

The Code goes on to state that for consent to be valid, users need to have been provided with clear and comprehensible information first which shall not be 'embedded in lengthy legal text'. This requirement can be met by use of a layered approach to privacy notices. This is explored further below.

Where the insurance industry is providing mHealth apps at the request of customers as a 'freebie' or other add on, our experience seems to suggest that the data collected in an app is not used for anything other than provision of the features of the app. If this intention changes because a data scientist sees the untapped potential in the data collected, make sure that fresh consent is obtained before a new use is made of a user's personal data.

Practical Guidelines - what information shall I provide to the users before they can use my app?
The Code recommends taking a layered approach to privacy notices, as already advocated by the ICO, i.e. a short form or condensed privacy policy is provided in the first instance which contains key information, with a link through to the full privacy policy. The key information required for the condensed notice is:

  • the identity of the app developer. In practice this means the data controller, not, for example, an outsourced service provider developing the app in accordance with the data controller's instructions;
  • a brief description of data processing purposes, how data will be used and how it fits in with the data controller's products and services to ensure fair processing;
  • precise categories of personal data being processed;
  • whether personal data will be transferred to third parties and, if so, who;
  • a description of the user's right to access and correct personal data (in most cases this will be by the user via the app) and to delete personal data. Deletion is likely to be achieved through deletion of the app itself. However, this assumes that the data controller will not retain any personal data after deletion. If data is retained following deletion of the app, a mechanism for users to request deletion of their data needs to be set out;
  • a statement that use of the app is voluntary but requires consent to permit the processing of personal data;
  • contact information for data protection queries;
  • a link to the full privacy policy.
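The condensed notice is, in effect, a fixed set of required fields, so its completeness can be checked mechanically. The sketch below models it as a record; all field names are our own shorthand for the bullet points above, not terms from the Code.

```python
# Illustrative only: a condensed privacy notice modelled as a record, with a
# completeness check against the Code's minimum content. Field names are
# hypothetical shorthand for the Code's requirements.
from dataclasses import dataclass, fields

@dataclass
class CondensedNotice:
    controller_identity: str       # the data controller, not an outsourced developer
    processing_purposes: str       # brief description of purposes and uses
    data_categories: tuple         # precise categories of personal data
    third_party_recipients: tuple  # empty tuple if no third-party transfers
    rights_description: str        # access, correction and deletion rights
    voluntary_use_statement: str   # use is voluntary but requires consent
    dp_contact: str                # contact for data protection queries
    full_policy_url: str           # link to the full privacy policy

def is_complete(notice):
    # Every field of the condensed notice must be populated (an empty tuple of
    # third-party recipients is itself meaningful information, so it passes).
    return all(getattr(notice, f.name) not in ("", None) for f in fields(notice))
```

A check like this is no substitute for legal review, but it makes the condensed notice a structured artefact that can be validated in the build pipeline rather than a block of free text.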

An example of a condensed notice with a long form policy is given in the annex to the Code. Beware: the example given relates to an app with a narrowly defined purpose. In practice, the type of mHealth apps currently being made available in the insurance industry will have a multitude of functions, so the condensed privacy notice may well be more than two paragraphs long!

As a minimum the condensed notice should be made available before app installation and the full privacy policy should be available through the app following installation (although note comments above in relation to when consent should be obtained).

Governance and Enforcement

Once the Code has been approved, compliance with and enforcement of the Code will be carried out by a monitoring body. The monitoring body will also implement an alternative dispute resolution and complaint handling process through which data subjects can lodge complaints against app developers, and may make a trust mark available which can be applied to apps on the register.

App developers can publicly declare compliance with the Code by completing a privacy impact assessment (PIA) (a form of which is annexed to the Code) and making a self-declaration to the monitoring body. The PIA and self-declaration will be reviewed by the monitoring body and those apps that have been approved will be entered onto a public register maintained by the monitoring body.

App developers will be required to ensure continuous accuracy of the declaration and will be subject to random checks from the monitoring body. This allows the monitoring body to be proactive and not only ensure adherence to the Code by reacting to data subject complaints.

App developers found to be in breach of the Code will be removed from the register and will be required to remove any trust mark. Serious breaches would be subject to enforcement action under the Directive (or GDPR).

As this is the final draft code, no further major changes are expected. App developers would be well advised to ensure that their mHealth apps adhere to the Code.


1 Information Commissioner's Office report: 'Big Data and Data Protection', 28 July 2014



Key Contacts

Rhiannon Webster
London - Walbrook
+44 (0)20 7894 6577
