
Playlists and Policies: The Legal Dimensions of UK Music Streaming


By Kelsey Farish | Published 29 September 2023

Overview

In 1999, a group of college students released Napster, a computer program that allowed users to share, over the Internet, electronic copies of music stored on their personal computers. At a time when Britney Spears's ...Baby One More Time was top of the charts in the United Kingdom, Napster's arrival marked the emergence of online music sharing. The platform sparked landmark legal battles over digital rights management and online copyright piracy, and was effectively shuttered only two years later.

Much has changed since Metallica v. Napster, Inc. (2000), the American case in which, for the first time, an artist sued a software company for copyright infringement. In the nearly 25 years since, the music industry has matured rapidly. Tech companies, artists, and record labels now work together – although not always in perfect harmony – and music streaming platforms have firmly established their dominance in the entertainment industry worldwide.

Nevertheless, these platforms, whilst revolutionising the way we consume music and podcasts, continue to navigate a complex landscape of intellectual property rights, data protection laws, and increasing regulatory oversight. This article provides insight into some of those issues.

What type of day-to-day legal issues do UK-based music streaming platforms encounter?

With regard to IP, the platform must ensure it has licences to stream the music, which will be protected by copyright. A large portion of the platform's work therefore involves negotiating agreements with various rights holders, typically record labels, publishers, and independent artists. Furthermore, displaying or otherwise using other audio or visual content, for example album art or trade marks owned by music labels, must also be accounted for in the applicable contracts. In such cases, the UK's Copyright, Designs and Patents Act 1988 will be of key importance.

Compliance with data protection legislation presents another challenge. As platforms gather, analyse, store, and share vast amounts of consumer information – ranging from basic personal data to listening habits and other forms of behavioural analytics – they must do so in accordance with Retained Regulation (EU) 2016/679 (UK GDPR), the Data Protection Act 2018, and other applicable data protection and privacy laws. Moreover, data misuse or breaches may also come under the purview of the Online Safety Bill ("OSB"), discussed below.

From a regulatory perspective, music streaming platforms fall within the remit of bodies like the UK's Competition and Markets Authority and the communications regulator, Ofcom. Here, the rules and regulations concerning competition and digital markets are relevant, as are those concerning consumer rights: platform terms and conditions should comply with the Consumer Rights Act 2015.

The business models and practices of streaming platforms have come under increased scrutiny in the UK in recent years, most notably through the House of Commons Digital, Culture, Media and Sport Committee's inquiry into the economics of music streaming, and this could lead to additional regulatory measures.

What particular activities or business models of a music streaming platform might bring them into scope of the UK OSB?

The OSB, which is expected to come into force in early 2024, is designed to impose a duty of care on companies to improve the safety of their users online, especially with regard to illegal or otherwise harmful content.

As a general observation, most music platforms limit the ability of users to upload content (in contrast to, for example, a social media platform or community forum), and control over what audiovisual content ultimately goes onto the platform will therefore likely rest with the platform itself. That said, the OSB is relevant to consider, particularly if uploading, sharing, or commenting functionalities are enabled.

If the platform allows users to comment on songs or artists, create playlists, or upload their own music, it will be responsible for moderating such user-generated content to ensure it does not contain harmful or illegal material, such as defamatory or harassing content or hate speech. Similarly, the platform could potentially facilitate cyberbullying or harassment through its social features, such as messaging or other user interaction elements.

It is also worth considering the use of behavioural analytics and algorithms which personalise or recommend content, noting that such recommendations could potentially amplify harmful content. For more on this topic, albeit from a US perspective, see, for example, Reynaldo Gonzalez, et al v Google LLC, 598 US ___ (2023), a case concerning YouTube's algorithmic recommendation systems, which allegedly pointed viewers to terrorist videos.

What sort of practical steps will a music streaming platform need to take in order to discharge the safety duties under the OSB?

The OSB has not yet been approved in a final form, and further amendments may be made to the current draft. However, based on information available as of July 2023, a music streaming platform will likely be compelled to take the following steps when the law comes into force. Please note the list below is not exhaustive:

  1. robust content moderation — the platform is likely to be obliged to implement mechanisms to monitor, detect, and remove harmful or illegal content. Users should have easy and straightforward ways to report such content, and reports should be dealt with promptly and effectively
  2. clear policies — the platform's terms and conditions and other user policies should clearly and accessibly outline what content and behaviour is unacceptable, and the consequences of breaching these rules. A platform should also consider educating its users about relevant risks as appropriate
  3. safety-by-design — the platform's user interface, user experience, and other design elements and features should be built with safety in mind, especially for vulnerable users, children, and those with special needs or accessibility issues
  4. transparency — transparency is a major theme throughout the OSB and is directly linked to accountability. Among other things, platforms will be required to publish annual transparency reports and to be transparent about their use of algorithms which amplify or suppress certain types of content

Is a music streaming platform which is within scope of the OSB likely to be subject to the free speech duties thereunder? If so, how might this be the case?

The UK's OSB is designed to improve online safety while protecting free speech.

In brief, platforms are obliged to ensure that freedom of expression is not unduly restricted, but this must be balanced against their obligation to moderate and, where necessary, remove content which is harmful or illegal. If a platform permits users to leave comments, send messages to other users, or upload user-generated content, it must consider its duties regarding the protection of freedom of expression. Considerations to weigh alongside free speech include privacy, criminal activity such as fraud, harassment, and image-based sexual abuse, and the protection of vulnerable users.

In what context might the activities of a UK-based music streaming platform fall within the scope of Regulation (EU) 2022/2065, the EU Digital Services Act (EU DSA)?

Notwithstanding the UK's exit from the EU, a UK-based music streaming platform will be caught within the scope of the EU DSA if it offers services to recipients who are established or located in the EU, irrespective of where the platform itself is based, as per Article 2(1) of the EU DSA.

Like the UK's proposed OSB, the EU DSA introduces several obligations for online platforms, including music streaming platforms, particularly if they host user-generated content such as user-created playlists, comments, or audiovisual files.

Among other things, the platform would be required to provide an electronic reporting mechanism under the EU DSA's ‘notice and action’ regime, allowing users to report illegal content. The platform must act on such reports quickly, for example by removing the content or suspending the offending account. The platform must also have a complaint-handling procedure in place to permit users to dispute content moderation decisions, and such procedures and decisions must be transparent. This also applies to algorithmic or AI-assisted decisions.

Each EU Member State is obliged to have one or more competent authorities responsible for enforcing the EU DSA, one of which will be designated as that country's 'Digital Services Coordinator' under Article 38 of the EU DSA. A platform must co-operate with the relevant Digital Services Coordinator(s), which have powers to investigate, audit, and enforce the EU DSA.

Platforms with at least 45 million average monthly active users in the EU are designated 'very large online platforms' and will face more extensive obligations given their size and influence on consumers. These enhanced requirements include risk assessments, external audits, and additional transparency obligations. Only a relatively small number of music platforms, such as Spotify, Apple Music, Amazon Music, and YouTube Music, are likely to meet this threshold.

Kelsey Farish, an associate in DAC Beachcroft's technology and media team, was commissioned by LexisNexis to write this article. It was originally published on LexisNexis PSL Legal Insights in August 2023 and is shared here with kind permission from LexisNexis.
