Published 27 May 2021
There have been mounting concerns as to the potential harm which the internet can cause to adults and children alike, with growing calls for social media companies in particular to protect users from harmful online content. Health Secretary Matt Hancock warned companies (including Facebook, Google and Twitter) as to their inaction in relation to the removal of inappropriate content. This followed the suicide in 2017 of 14-year-old Molly Russell, whose father has said that social media was partly to blame after her Instagram account was found to contain distressing material about depression and suicide. Despite Theresa May's government proposing legislation to counter online harm in 2019, it took until the most recent Queen's Speech on 11 May 2021 for the draft Online Safety Bill ("OSB") to be announced.
The proposed legal framework seeks to introduce robust new measures to protect the safety of online users, including granting Ofcom the power to impose significant fines of up to £18m or 10% of global turnover, whichever is the higher, should companies fail to comply with the new rules. The OSB covers a wide range of content which may cause harm, including terrorism, racism, child sexual exploitation, suicide, eating disorders and revenge porn. Although much of that content is to be expected, there have been some surprises, such as the addition of measures tackling online scams (such as fake investment opportunities), which are on the rise.
The OSB imposes extensive duties of care on providers of internet services that allow users to share user-generated content (user-to-user services) and providers of search engines (search services) (together, “regulated services”). It also imposes duties on “senior managers” of regulated providers, being individuals who play a significant role in either “(a) the making of decisions about how the entity’s relevant activities are to be managed or organised, or (b) the actual managing or organising of the entity’s relevant activities.”
Companies will be categorised into two tiers according to the size of their online presence and the level of risk posed on the platform. Those companies falling within the scope of the regime will owe a multi-layered duty of care towards users of their platform. Category 2 service companies will be required to take action against illegal content on their platforms. Companies posing a higher risk to users are classed as Category 1 service companies. It is thought that fewer companies will fall within the scope of the Category 1 definition; however, those which do will also need to take action against content which is legal but harmful.
As will be evident from the preceding paragraph, a distinction is drawn in the OSB between illegal content and content which is legal but harmful. In an effort to provide clarity, the OSB includes definitions of content which is harmful to children and that which is harmful to adults (see sections 45 and 46 OSB). Despite those relatively lengthy definitions, critics have suggested the meaning of what would be deemed harmful content is currently too vague for companies to properly interpret.
Whilst the OSB tries to strike a balance between the duty to make the internet safer and the importance of safeguarding free speech and journalistic content, free speech campaigners have criticised the fact that the OSB brings newspaper and magazine content on social media sites in scope, meaning that there is the potential for free speech to be stifled. However, further detail, including secondary legislation and Codes of Practice, is required before the true impact of the legislation becomes clear.
The OSB does not expressly prohibit insurers offering cover for fines issued by Ofcom under the new regulatory regime. Whilst it is possible that Ofcom may later issue rules expressly prohibiting the offering of insurance for OSB regulatory fines, as it is currently drafted, the OSB introduces the possibility of insured technology companies (offering regulated services) submitting claims in relation to fines imposed by Ofcom. Given the size of possible fines, the potential exposure for Insurers is significant.
It is noteworthy that there is also the potential for "senior managers" to be subject to criminal proceedings for what are termed "information offences" under clause 73. Consequently, conduct exclusions will likely need to be considered in the event of an unsuccessful defence of information offence proceedings.
Notably, the OSB also applies to providers of regulated services based outside the UK, which will no doubt mean that the content of the draft OSB will have been brought to the attention of Silicon Valley. Whilst we now have some understanding of what the future regime may look like, the OSB still has a long journey ahead before it is enacted. The draft OSB will now be scrutinised by a joint committee of MPs, which will likely result in many changes before any final version is formally introduced to Parliament. Champions and critics of the OSB alike can likely agree that it has noble ambitions.