
Five years of GDPR – standing the test of time?


By Hans Allnutt, Jade Kowalski & Stuart Hunt


Published 25 May 2023

Overview

On 25 May 2018, the General Data Protection Regulation, widely known as the GDPR, fundamentally reshaped data protection regimes across Europe and influenced legislation worldwide. On its fifth anniversary and amid an ever-changing data protection landscape, we reflect on its impact, lessons learned and emerging challenges.

Our key conclusions are:

  • The fundamental data protection principles have stood the test of time and have the flexibility to adapt to new technologies.
  • Public engagement and education are key for good outcomes.
  • The ICO continues to (intentionally?) use the carrot rather than the stick, although a range of sticks, both big and small, remains available.
  • It is the courts that are shaping the compensation landscape.

The fundamental data protection principles have stood the test of time

The fundamental data protection principles long pre-date the GDPR. Concepts such as fairness, purpose limitation and data minimisation have been part of UK law since the Data Protection Act 1984.

Brexit created parallel regimes in the UK and EU. As the ICO notes, "the [EU] GDPR is retained in domestic law as the UK GDPR, but the UK has the independence to keep the framework under review," with the key principles, rights and obligations remaining the same. We have, however, seen differences in interpretation between the ICO and its peers in the EU.

Some further divergence is now very likely following the introduction of the Data Protection and Digital Information (No.2) Bill. Our detailed assessment can be found here. The changes do not represent a wholesale shredding of the existing GDPR system in the UK, focusing instead on targeted changes and clarifications to a risk-based approach. This can be viewed as a significant vote of confidence that the existing UK and EU GDPR frameworks are already standing the test of time.

This sentiment is echoed by the Irish Council for Civil Liberties, which recently issued a report to mark the fifth anniversary, stating that the "GDPR provides strong investigation and enforcement powers to protect people from the misuse of data that enables much of the digital world’s problems."

Nonetheless, the past five years have seen the introduction of new technologies which raise challenges in the context of data protection. Facial recognition technology ("FRT") and artificial intelligence ("AI") in various forms are now issues which must be considered within the GDPR framework. Does the GDPR have the flexibility to address these challenges? Our view is yes. As data protection and other authorities across the UK and EU grapple with these issues as a matter of urgency, considering the scope and form of legislation and/or regulation, the GDPR framework features heavily in those discussions. Indeed, in the short term at least, it is the key piece of legislation governing these technologies as they exist today.

In our view, principles-based regulation is the only way to ensure that laws do not quickly become outdated. There have already been efforts on the part of European data protection authorities to ensure that ChatGPT, the pre-eminent generative AI system, is operated and used in compliance with the GDPR, and the European Data Protection Board (“EDPB”) is seeking to foster cooperation and information exchange on possible national enforcement actions.

This move to harmonise Member States’ policies towards ChatGPT was preceded by the Italian data protection authority's decision to temporarily block ChatGPT from operating in Italy in April 2023, prompted by a report of a data breach affecting user conversations and payment details. Similar complaints are being investigated by the French and Spanish data protection authorities. In the UK, the ICO has reminded organisations using AI software that there are no exceptions to the rules when it comes to using personal data.

The accelerating pace of change increases the risk of prescriptive legislation quickly becoming outdated and regulators struggling to adapt their approach. Continuing to apply fundamental principles such as fairness and transparency will chart a sensible course.  

Public engagement and education are key for good outcomes

The implementation of the GDPR brought about a seismic change in public awareness of data protection and privacy rights. The European Commission reported in 2020 that “69% of the population above the age of 16 in the EU have heard about the GDPR and 71% of people heard about their national data protection authority”.[1] We expect these figures to have only increased since then, among those both under and over 16.

This increase in awareness means that the public is better able to question how personal data is used and is empowered to hold organisations to account.  

However, whilst general awareness is high, there is still much to be done to educate the public further: both to avoid expectations of rights which go beyond those set out in the law and, conversely, to ensure that certain rights are not simply "agreed away". This presents an opportunity for data protection authorities and governments, as well as for organisations. We consider two examples, data subject access requests ("DSARs") and cookies, below.

In our experience, there is now a good understanding of the existence of the fundamental right of access. This right is one of the cornerstones of data protection law and, when exercised correctly, provides the data subject with details of the personal data processed by an organisation and certain information about how that data is processed. However, the extent of the right and its application is often misunderstood. For example, we frequently deal with requests in which the data subject insists on access to non-personal data or to documents, and does not understand that the right is subject to a number of exemptions (including the balancing of third party privacy rights). This can cause frustration for data subjects, who often view refusals as deliberate obstruction of their rights, and can lead to extensive exchanges of correspondence and subsequent unfounded complaints (including complaints to the ICO).

Whilst the DPDI (No.2) Bill currently proposes tweaks to this right, particularly in relation to the refusal of 'vexatious or excessive' requests, this is unlikely to curb the existing challenges of dealing with DSARs. In fact, it could increase the number of complaints: the greater the number of refusals, the greater the prospect of complaints. Whilst the Bill should help to avoid the weaponisation of DSARs, further public education on their scope and exemptions remains crucial to ensure both their proper use and public trust that any limitation is applied compliantly.

In a similar manner to DSARs, the majority of the general public will now have some familiarity with cookies. A website user will likely click through multiple consent requests every day. Arguably, this is good news: organisations are aware of their responsibilities under both the GDPR and the Privacy and Electronic Communications Regulations 2003 ("PECR"). Yet how many users actually know what they are agreeing to? We would suggest that the answer is "not very many". Of course, companies will argue that details regarding their use of cookies are available, that the required consents are presented in a compliant manner, and that the responsibility for ensuring this is properly understood lies with the user. Whilst that is arguably correct (at least in part), the reality is that many users simply do not have the time or the inclination to read and consider this information. There is much more that could be done, both by organisations and by the regulators, to ensure that the public is educated to make an informed decision about consent to the use of cookies.

The ICO: More carrot, a different sized stick

The Information Commissioner's Office is tasked with upholding data protection rights under the UK GDPR. As well as through enforcement, it does this by providing extensive and user-friendly guidance on rights and on the responsibilities of organisations. It has, rightly, developed a reputation as one of the most pragmatic regulators.

Historically, the ICO has been reticent to make public details of all reported breaches, partly to encourage transparent reporting from organisations. This approach was very much welcomed by those in breach, who could report an incident to the ICO safe in the knowledge that it would not be made public unless enforcement action followed. However, as part of a planned change to public transparency set out in the ICO25 strategy, the ICO has started publishing details of complaints about data protection concerns and self-reported personal data breaches. Information on data security trends is also made available. Commenting on the change, the current Information Commissioner, John Edwards, stated that "every regulatory action must be a lesson learned by the rest of the economy and play a role in behaviour change."[2]

The ICO has also started to publish lists of the reprimands it hands out where the threshold to issue a fine has not been met. This is certainly an easier enforcement option for the ICO, given that the prospect of costly appeals is more limited, whilst ensuring that the ICO is seen to be carrying out enforcement. For organisations, whilst a reprimand avoids a financial penalty, the unintended consequence could be greater claims for compensation from individuals who rely on the ICO's finding of wrongdoing.

In the public sector, there has been a two-year initiative to raise standards, alongside an approach designed to ensure that fines do not unnecessarily impact the provision of public services and budgets. For example, in June 2022, fines in excess of £750,000 issued to the Tavistock and Portman NHS Trust and NHS Blood and Transplant were, respectively, reduced by 90% and replaced with a reprimand.

What is clear is that the ICO is attempting to strike a delicate balance between encouraging companies to report breaches and fostering accountability through appropriate and proportionate transparency. Breaches reported to the ICO (and any other notifications required) are important for organisations, the public and national authorities. The data collected by the ICO will help it to understand cybersecurity trends, identify common weaknesses and modes of attack, and provide guidance to prevent similar attacks from occurring in the future.

One of the benefits of the pan-European nature of the GDPR is that comparisons can be made between jurisdictions in respect of how the GDPR is interpreted and the sanctions subsequently levied. Of course, there will be financial outliers, particularly where 'BigTech' companies are involved. The €1.2 billion fine recently handed down to Meta Ireland, albeit subject to appeal, is now the largest under the GDPR, with the remainder of the top ten fines also levied against major corporations such as Meta and its associated companies, Google and Amazon.

The ICO has handed down 13 GDPR-related fines as at mid-May 2023[3]. By way of comparison, the Spanish data protection authority has issued 646 fines within the same period, with the French data protection authority issuing 35.  These wildly different totals demonstrate the differences in approach to enforcement. However, we have seen the ICO increasingly look to use other powers including orders to cease processing which, in some instances, are likely to have a far greater impact.

Should a consistently tougher approach be considered? The Irish Council for Civil Liberties criticised data protection authorities across Europe for failing to adequately use the 'shield' that the GDPR provides. The Irish Data Protection Commissioner was condemned for providing draft decisions which were routinely criticised by other European bodies as being too weak.

We will watch with interest to see whether the revised ICO approach of publishing all decisions will have the intended impact.

The courts are shaping the compensation landscape

The courts have been left to shape the compensation claim landscape for data breaches. Concerns that data breach claims could become the 'new whiplash' have proved unfounded, but there has certainly been a significant rise in the volume of claims.

In 2021, the Supreme Court comprehensively dismissed the claim in Lloyd v Google, closing the door on the prospect of the group litigation process being widely used for data breach claims. The recent decision of the High Court to dismiss the claim in Prismall v (1) Google (2) DeepMind has reiterated that the door currently remains firmly shut.

Low value data breach claims pursued on an individual basis have proved more resilient, but a number of judicial decisions will help to ensure the proper allocation of these claims. Our team has been at the forefront of defending data breach actions comprising a number of overlapping or inadequately pleaded causes of action, with breach of data protection legislation, breach of confidence and misuse of private information being common examples. Often incorrectly issued in the High Court, these claims frequently included costs claims inflated far beyond the value of the claim itself.

Judicial decisions indicating that data breach claims should be dealt with on the small claims track, together with the further introduction of fixed recoverable costs, will likely limit the prospect of many claimant practitioners viewing data breach claims as a viable income stream.

Some claimant practitioners may look positively on the recent Austrian Post decision of the European Court of Justice, albeit it has no direct impact on UK law. The ECJ confirmed that not every GDPR infringement gives rise to a right to compensation on its own, but that there is no threshold of seriousness for non-material damage claims. This both mirrors and contrasts with the UK position, which equally recognises that compensation is only payable in respect of damage shown to have been caused by a breach of the UK GDPR, but which does apply a de minimis threshold to that damage. Although the decision is unlikely to trigger a wave of GDPR litigation in the UK, data breach claims under the GDPR are still very much evolving, and further precedents may emerge in time.

Is there the prospect of an alternative method to deal with data breach claims? Possibly. Although there is currently no scope for the ICO to deal with data breach claims, the UK Information Commissioner last year suggested that such claims could form part of the ICO's remit, based on his own experience as Privacy Commissioner in New Zealand. However, there have been no subsequent formal proposals, and such a move would be exceptionally difficult with the resources available to the ICO. As noted above, however, the ICO's reprimand lists could provide an oven-ready finding of wrongdoing on which to base compensation claims.

[1] https://ec.europa.eu/commission/presscorner/detail/en/qanda_20_1166

[2] https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2022/11/how-the-ico-enforces-a-new-strategic-approach-to-regulatory-action/

[3] https://www.enforcementtracker.com/
