
MediaWrites

By the Media, Entertainment & Sport group of Bird & Bird


Online Harms in the UK: Significant new obligations for online companies and fines of up to 10% of annual global turnover for breach

Yesterday the UK government published its long-anticipated final response to the Online Harms White Paper. Partner Bryony Hurst and Associate Theo Rees-Bidder set out the headline points, and what we can expect next.

Yesterday was a busy day for Europe in terms of tech regulation. In addition to the ground-breaking announcements made by the European Commission of the new Digital Services Act and the Digital Markets Act, the UK government also published its long-anticipated final response to the Online Harms White Paper. The Online Harms consultation ran from 8 April 2019 to 1 July 2019 and received over 2,400 responses from stakeholders across the technology industry, including online platforms, charities, think-tanks, publishers, individuals and small and medium-sized enterprises.

The government’s response contains a significant amount of detail and gives us a real insight into the stance the UK will adopt post-Brexit on digital regulation.

As the lie of the land becomes clearer, in future articles we will be breaking down the detail into practical points businesses should be aware of.  But, for now, here are some of the headline points that jump out when reading the government’s response:

  • Content in scope: The UK has opted to legislate against both a) illegal content and b) harmful (but not illegal) content. This remains a controversial decision; by contrast, the EU stated explicitly in its Digital Services Act that it would not legislate against this sort of content, explaining that this is “a delicate area with severe implications for protection of freedom of expression”.
  • What is “harmful”?: For content not classified as illegal (illegal content currently covers terrorist content, child sexual abuse material and content encouraging or assisting suicide, with a question mark over whether material promoting self-harm will be made illegal too), platforms will have to assess whether there is a “reasonably foreseeable risk” of the content “causing physical or psychological harm to adults”. Financial harms are explicitly excluded from the scope of the proposed legislation.
  • Companies in scope: This remains broad, covering any site which hosts user-generated content or allows users to talk to one another online (either publicly or, controversially, privately – meaning private messaging services will be caught). In practice this means social media companies, video-sharing platforms, instant messaging platforms, online forums, search engines, online marketplaces, dating apps, commercial porn sites, consumer cloud storage sites, video games which enable user interaction, peer-to-peer services and even some forms of advertising (organic and influencer ads, for example) will be caught by the legislation.
  • Jurisdiction: One feature which has raised eyebrows is the jurisdictional test adopted; companies in scope will be subject to the regime if they are accessed by users in the UK, without more. Companies without a physical presence in the UK/EU have only just started getting comfortable with the idea of being subject to European laws (e.g. GDPR) based on the fact that they target citizens who live here – but this mere accessibility test stretches things even further.
  • Layered approach to regulation: The government appears to have taken on board the concerns voiced in the consultation responses that legislation in this area risked having a disproportionate effect on smaller businesses with less reach than the larger platforms. It has proposed categorising companies according to their reach and the nature of the activities/features on their sites. Category 1 companies (which the response indicates will include large social media sites) will be subject to more extensive obligations than Category 2 companies and below (Category 2 is likely to include dating apps and private messaging services).
  • What are the obligations?: The government is sticking with its concept of a “duty of care” to be imposed upon companies in scope. How companies fulfil this duty of care will be down to the regulator, which has been tasked with issuing Codes of Practice in relation to specific content/activities. These Codes will presumably set out the organisational/technological steps companies are expected to take to protect users from harmful content. It appears that Category 1 companies will have to tackle both illegal content and harmful but not illegal content via clear terms and conditions and consistent enforcement of the same (evidenced by regular transparency reporting), whereas Category 2 companies and below will be obliged only to tackle illegal content and to carry out an assessment of the risk other content on their sites poses to children (implementing additional protection measures for children where necessary). Two interim Codes of Practice were announced alongside the government’s response (one for terrorist content, another for child sexual abuse materials); these will give some insight into how prescriptive the Codes will be and the sorts of measures to be taken.
  • Significant regulatory enforcement powers: The response confirms that Ofcom (the current UK broadcasting regulator) will take Online Harms regulation within its remit. It has been granted real powers of enforcement for non-compliance: fines of up to 10% of annual global turnover or £18 million (whichever is higher) and the ability to order that access to sites be blocked in the UK. The government has also reserved the right to introduce criminal liability for senior managers if companies do not take compliance with the new regime seriously enough. The level of fines is potentially eye-watering for the very large platforms, and unprecedented (fines under the GDPR were capped at 4% of annual global turnover, which pales into insignificance by comparison with this new proposal).

What next?

Unsurprisingly, the devil will be in the detail. The draft Online Safety Bill is expected in the new year, and there are sure to be further twists and turns as it is developed and scrutinised by Parliament during 2021. Look out for further articles over the coming weeks in which we’ll be analysing what the government’s response means for businesses and how to prepare for what’s to come.

Tags

other, social media, digital regulation, online harms, united kingdom