GDPR Deep Dive: Profiling in the Insurance Industry
Published 21 June 2016
The new rules on profiling are likely to be one of the areas of the General Data Protection Regulation (GDPR) which will significantly affect the insurance industry, particularly in respect of big data projects. In the first of DAC Beachcroft's "deep dives" into the GDPR, we examine this new right for data subjects and the potential impact it will have on the insurance industry.
A new concept
This is a new concept under data protection law and covers:
“any form of automated processing of personal data consisting of using those data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that person’s performance at work, their economic situation, health, personal preferences, interests, reliability, behaviour, location or movements” (Article 4 GDPR).
The scope of the definition is wide enough to capture almost any analysis of an individual carried out by automated (electronic) means. In the insurance industry, this will include any underwriting, direct marketing, targeted advertising and e-recruitment processes which are performed electronically, rather than by a human being.
The Data Protection Directive (95/46/EC) ("Directive"), transposed into UK law through the Data Protection Act 1998, contained a more limited right to object to automated processing. The new concept of profiling and the related right have more far-reaching consequences.
A new right
The GDPR introduces a new right not to be subject to a decision based solely on profiling which produces legal effects or has a similar, significant effect (Article 22).
That right can be broken down into three important elements:
- "the right not to be subject to a decision…";
The insurance industry makes decisions based on automated means every day. For example, underwriting platforms are designed to price risks and allocate premiums instantaneously and automatically. These are decisions based on automated processing. However, automated processing does not always result in a decision. From our experience, big data projects are often divided into at least two stages: (i) data from numerous sources is input into a big data "lake" and analysed; (ii) analysts determine what use, if any, they can make of the combined data, e.g. underwriting pricing or targeted marketing. At the initial stage, they might discover the prevalence of a certain type of person with a particular policy but not make any decisions based on that data. However, if they then proceed to use the results of that analysis to make a decision, e.g. to increase the premium for a particular class of policyholder or to target a particular policyholder with specific marketing, a decision will have been made;
- "…based solely on profiling…"; if there is any form of human intervention, the right will not apply;
- "…which produces legal effects or has a similar, significant effect". It's not clear what will constitute a "legal effect" or "similar significant effect" but, based on the examples provided in the recitals of the GDPR of the automatic refusal of an online credit application or e-recruitment, many of the decisions taken in the insurance industry are likely to meet this threshold. A decision to underwrite a risk, the amount of premium to charge and an automated fraud analysis are all likely to be decisions which will have a legal or significant effect on an individual.
All three of the above elements must apply in order for a data subject to benefit from the right.
Exemptions for profiling
The right does not apply to profiling using personal data if any resulting decision is:
- necessary for entering into, or performance of, a contract between the data controller and data subject.
We believe the insurance industry has strong arguments that profiling by automated means to rate the risk of a consumer lines policy is necessary for the purposes of the insurance contract with that consumer. It is not practical for a human to carry out this analysis in every case.
However, the use of this exemption is limited. Note the requirement that the contract needs to be between the data controller and the data subject. Many insurance policies involve the processing of a third party's personal data, in the form of a beneficiary under an insurance policy. This could be a second driver under a motor policy or an employee under a corporate health insurance policy. To the extent a third party's personal data is analysed, this exemption would not apply.
- authorised by Union or Member State law to which the data controller is subject and which also lays down suitable measures to safeguard the data subject's rights and freedoms and legitimate interests; or
This means that the UK would need to enact legislation which expressly permits a certain kind of profiling. We are not aware of any legislation which expressly authorises the profiling currently undertaken by the insurance industry, whether it be for underwriting, claims analysis, fraud prevention or targeted marketing.
- has the explicit consent of the data subject.
The requirements for obtaining explicit consent are now stricter under the GDPR. We will be releasing a "deep dive" in relation to consent but, in summary, valid consent must be a freely given, specific, informed and unambiguous indication of the data subject's wishes given by an affirmative action. There are also "conditions for consent" which must be observed, such as presenting a request for consent in a way which is easily distinguishable from other matters (e.g. agreeing to policy terms and conditions).
The GDPR acknowledges that consent can be a condition of entering into a contract, but states that "utmost account" should be taken as to whether the consent has been freely given if the processing in question is not necessary for the performance of the contract.
If a data controller seeks to rely on the first or third exemption (contractual necessity or explicit consent), it must implement suitable measures to safeguard the data subject's rights and freedoms and legitimate interests and, at a minimum, give the data subject the right to obtain human intervention, to express his or her point of view and to contest the decision. This could result in insureds questioning the reasons for policy eligibility decisions or premium pricing.
Exemptions and the Insurance Industry
In practice, insurers and brokers could ask data subjects for consent to the profiling they undertake. The profiling which is strictly necessary for the performance of the contract (i.e. the original underwriting risk analysis) could be a prerequisite of entering into the insurance contract. Any additional profiling, whether for market analysis or targeted advertising, would need to be an optional choice for the data subject, as it is foreseeable that requiring consent to this additional profiling will lead to challenges as to how "free" the data subject is to give his or her consent (and therefore whether the consent is valid at all). Similar issues have arisen in the context of an employee providing consent to his or her employer; it is now widely acknowledged that such consent is rarely valid due to the imbalance of power.
However, there is much profiling undertaken by the insurance industry which falls somewhere in between the profiling which is necessary for the contract and that for which the insurance industry can feasibly (albeit reluctantly) ask for consent. Of particular concern is the profiling that the insurance industry undertakes for fraud analysis. It is difficult to say that this profiling is strictly necessary for the contract, but nor can the insurance industry feasibly ask for consent if data subjects must be given the option to say no. As a matter of public policy, it seems a strange effect of the GDPR that such fraud analysis, which is in the public interest and keeps insurance premiums affordable, should be made essentially unlawful.
In the absence of any legislation over the next two years which expressly permits the insurance industry to undertake such automated fraud analysis, the only argument open to the insurance industry is that this profiling is necessary for the insurance contract: without it, premiums would be too high. We would hope that, with appropriate explanations given to data subjects by the insurance sector before policy inception, the regulators would agree with this approach.
We have no such solution for any profiling undertaken on third party beneficiaries, who are not party to a contract with the data controller but merely benefit from it. To undertake any kind of profiling on these third parties, the insurance industry will need explicit consent. The burden of being able to demonstrate that such consent has been obtained from the third party will be another problem for the insurance industry, which will not necessarily have a direct line of communication with those data subjects.
Profiling using sensitive personal data
Profiling using sensitive personal data is prohibited unless the data subject has given explicit consent or the processing is necessary for reasons of substantial public interest.
This will leave certain sectors of the insurance industry with no option but to require consent to profiling as a condition to obtaining cover. An obvious example is health insurers who inevitably must profile using sensitive personal data in order to underwrite a health policy. Less obvious examples include profiling using sensitive personal data relating to criminal convictions (e.g. for a motor policy).
In the event that a data controller is able to profile in compliance with a data subject's rights, there are yet more obstacles to overcome.
A privacy notice must refer to the existence of profiling and provide meaningful information about the logic involved, as well as the significance of it and the envisaged consequences.
Data controllers will need to craft privacy notices carefully. A balance will need to be struck between (i) providing enough information to meet the requirements of the GDPR and (ii) providing information which is meaningful (and therefore not over-detailed or technical).
There are additional obligations on data controllers who carry out profiling activities which are spread throughout the GDPR. These include obligations to:
- use adequate mathematical or statistical procedures to guard against inaccuracies and discrimination;
- apply the principles of privacy by default and privacy by design; and
- carry out a privacy impact assessment in respect of any profiling activities.
The Practical Effect
In practice, these restrictions on profiling could lead to multiple consent boxes on proposal forms. For example, a broker selling a travel policy may need to include separate consent tick boxes for its customer's agreement to:
(i) the policy terms and conditions;
(ii) profiling using health data as part of the underwriting process; and
(iii) profiling using health data for marketing of other insurance policies.
To the extent profiling is undertaken on third party beneficiaries, we anticipate that they too will need to sign and return such consent notices.
Insurers and brokers should consider which consents they require and start implementing GDPR-compliant consent requests well in advance of 25 May 2018.
One way to achieve greater engagement from customers will be to highlight the "value exchange" which occurs when their personal data is collected. Numerous studies have shown that the reason individuals share data with social media sites or loyalty schemes is that they see the value they get in return. If they share personal data with a social media site they will be able to communicate with their contacts; if they share data with a retailer they will receive tailored discount offers. The data subject is providing their data in return for a readily identifiable benefit.
Unfortunately, insurance doesn't always have the same attraction. However, there is work that can be done to educate customers about the uses that insurers and brokers make of their data, to illustrate the value exchange. One area of particular concern around the use of profiling in big data projects is that underwriters end up with so much information about a particular person or class of persons that those individuals become uninsurable.
However, there are examples of exactly the opposite occurring, with big data being used to make the uninsurable insurable again: for example, a telematics box designed specifically for individuals with criminal convictions who might not otherwise be able to obtain motor insurance readily.
Publicity of these sorts of profiling activities will serve to highlight the value exchange that insurers and brokers can offer.
Leniency from the UK legislature?
Member States may restrict the scope of data subjects' rights in a number of circumstances.
Those of particular relevance to the insurance industry include:
- prevention, investigation, detection and prosecution of criminal offences; the prevention and detection of fraud is likely to fall within this;
- other objectives of general public interest, in particular an important economic or financial interest; this could potentially be relied upon to permit profiling using sensitive personal data in the health insurance sector; and
- prevention, investigation, detection and prosecution of breaches of ethics for regulated professions; this could be used to permit profiling in the e-recruitment of FCA authorised persons.
It will be for the UK government to provide data controllers with such additional exemptions via national legislation. The insurance industry would be well advised to lobby for such exemptions now.
What does all this mean for profiling post GDPR and what should you be doing now?
Profiling under the GDPR will undoubtedly be more difficult, although not impossible. There are a number of steps that insurers and brokers can take now which will either take profiling outside the scope of the right or ensure compliance.
Steps to take profiling activities outside of scope
- Anonymise data wherever possible so as to take it outside the scope of the GDPR entirely.
- Reduce profiling to "background profiling only" (e.g. refining algorithms and underwriting models).
- Insert some form of human intervention into any decision-making process.
Steps to ensure compliance
- Start recording the data that is being put into your big data environment so that you are able to track the particular privacy notice and consents that each data subject has given.
- Ensure systems have the technical capabilities to extract personal data in the event of subject access or withdrawal of consent.
- Start educating data scientists and marketing teams about the new rules. You will need to build time for obtaining consents into any new project plan so that they can be incorporated into renewals. You can, of course, send new privacy policies and requests for consent mid-term, but take-up is likely to be low.
Submitted by Rhiannon Webster, Partner and Jade Kowalski, Solicitor