There is broad consensus that a greater degree of regulation of online content is necessary, but the aims of the proposed new Online Safety Bill in the UK could be undermined by a lack of clarity over the way the legislation is to be implemented and enforced.

The Bill will introduce a new and very broad regulatory framework in the UK which will impose extensive obligations on providers of certain classes of services that share content generated by users. Duties will also be imposed on search engines.

The proposals cover services that have links to the UK: services with a significant number of UK users, services for which the UK is one of the target markets, or services capable of being accessed in the UK where there are reasonable grounds to believe there is a material risk of significant harm to individuals in the UK from content on the service.
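To make that scoping test easier to follow, here is a minimal sketch in Python of the "links to the UK" test as summarised above. The class, field and function names are our own illustrative shorthand, not terms defined in the Bill.

```python
from dataclasses import dataclass

@dataclass
class Service:
    # Hypothetical summary of the facts relevant to the scoping test
    significant_uk_users: bool             # a significant number of UK users?
    uk_is_target_market: bool              # is the UK one of the target markets?
    accessible_in_uk: bool                 # capable of being accessed in the UK?
    material_risk_to_uk_individuals: bool  # reasonable grounds to believe a material
                                           # risk of significant harm to UK individuals

def has_links_to_uk(s: Service) -> bool:
    """Illustrative restatement of the scoping test summarised above."""
    return (
        s.significant_uk_users
        or s.uk_is_target_market
        or (s.accessible_in_uk and s.material_risk_to_uk_individuals)
    )

# Example: a service with no UK user base that is nonetheless UK-accessible
# and poses a material risk of harm would still be in scope.
print(has_links_to_uk(Service(False, False, True, True)))  # True
```

Note how the third limb is conjunctive: mere accessibility from the UK is not enough on its own without the accompanying risk of harm.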

A move towards greater intervention

Implementing online safety regulation is a difficult task and policymakers across the world are grappling with it in different ways. The new, much more interventionist approach represents a dramatic shift away from the position that has prevailed in the internet age, which we have explored previously.

A concept of safe harbours for internet intermediaries has existed since the late 1990s. In the EU, that consensus is reflected in the E-Commerce Directive which sets out a “notice and take down” regime for internet intermediaries and which, crucially, prohibits the imposition of any obligation on intermediaries to generally monitor the content which crosses their services. After Brexit, the default position remains that general monitoring cannot be required in the UK, but the government is free to legislate away from this in any specific area of its choosing. That is what it is planning to do with the Online Safety Bill. However, any change of this significance needs to be handled extremely carefully.

The obvious approach in this scenario would be for the government to proceed on a staged basis, and to avoid trying to do too much too quickly. In reality, though, the government is taking a different tack, with an expansive approach.

This tendency to try to “fix everything” all at once persists in the Bill. There are problems with the government's approach that we hope can be resolved as the Bill is scrutinised as it passes through parliament.

Duty of care

The government continues to trumpet the idea that the Bill creates some sort of “duty of care” owed by service providers to users. This is unhelpful messaging: the Bill does no such thing. The concept of a duty of care belongs primarily in the tort of negligence, where a private right of action arises if Person A breaches their duty of care to Person B and Person B suffers damage as a result – Person A carelessly injuring Person B in a road traffic accident is a simple example. The concept should be dropped from the government’s messaging.

Indeed, a more suitable approach to the legislative scheme as a whole would be to set out duties around a concise set of principles – like the principles relating to processing of personal data under Article 5 of the General Data Protection Regulation (GDPR). The Bill includes online safety objectives in Schedule 4, which regulator Ofcom must follow when setting out codes of practice on how the duties are complied with. An obligation to comply with a concise set of objectives, set out at the beginning of the Bill, would be a more suitable approach to creating a regulatory framework.

Awful but lawful

One of the more problematic aspects of the Bill is the proposed regime to regulate lawful but harmful content, colloquially referred to as “awful but lawful” content. Here the government is in a difficult position. Undoubtedly there is much public concern about the proliferation of content which is potentially damaging to some users but which is not in itself unlawful. By choosing to take on this challenge now, the government has produced legislation which is very complex. It seems unlikely that any additional benefit obtained through these provisions will outweigh the questions they raise.

The government’s approach targets different content by reference to the type of service (in terms of its functionality), the scale of the service (in terms of the number of users), and the demographic of users (particularly as to their age). As a consequence, it is more or less impossible to produce an easily accessible diagram or scheme of how different types of content are to be regulated in different contexts. Service providers are left having to navigate all of these levels of complexity in the Bill. 
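As a purely illustrative sketch of this layering, the following Python function shows how duties might stack up along those three axes. The duty names are placeholders we have invented for illustration; the actual obligations are distributed across many clauses and schedules, and the category thresholds are left to secondary legislation.

```python
def applicable_duties(service_type: str, category: str,
                      likely_accessed_by_children: bool) -> set:
    """Illustrative only: duty names are our placeholders, not the Bill's drafting."""
    # Baseline duties assumed for all in-scope services
    duties = {"illegal_content_risk_assessment", "illegal_content_safety_duties"}
    if service_type == "search":
        duties.add("search_specific_duties")
    if likely_accessed_by_children:
        duties |= {"children_risk_assessment", "children_safety_duties"}
    if category == "category_1":  # thresholds to be set by secondary legislation
        duties |= {
            "adult_lawful_but_harmful_duties",
            "user_verification",
            "user_empowerment_tools",
            "fraudulent_advertising",
        }
    if category == "category_2A":  # large search services
        duties.add("fraudulent_advertising")
    return duties

# Even this toy model needs three separate inputs before a single duty
# can be identified, which is precisely the complexity described above.
print(sorted(applicable_duties("user_to_user", "category_1", True)))
```

Even in this simplified form, a provider cannot know its obligations without first resolving its service type, its (as yet undefined) category and its likely audience.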

An example of the new and difficult-to-fathom concepts introduced by the Bill is “content of a kind which presents a material risk of significant harm to an appreciable number of adults in the UK”. Harm in this context means physical or psychological harm. Navigating concepts like this will be a real challenge: service providers will be making very difficult assessments about the potential harm associated with particular types of speech, in contexts which may or may not assist them, and for the most part at scale.

Clearly the government is aware that there are other ways of targeting content that could be considered harmful. The most recent version of the Bill introduces new communications-based criminal offences relating to certain types of content online, which we can assume the government considered the most egregious. It also adds a new verification requirement and requires services to give users more control over the other users they interact with.

Time will tell whether a better approach would have been to tackle lawful but harmful content in a second phase, offering the prospect of learning from what works well or badly with unlawful content in phase one.

Systems or content?

There remains an ongoing tension in the Bill’s focus: is it primarily concerned with the systems which service providers are to put in place, perhaps accepting that these will be imperfect, or is it primarily concerned with the type of content which is to be addressed?

Of course, systems and content are not mutually exclusive: it is impossible to put in place systems without some clarity on the type of content to be addressed. Still, despite assertions that the Bill will target systems, it is unclear from reviewing the Bill how that will work in practice. Most of the duties in the Bill, and most of the debate around it, concern types of content. This perhaps appeals to the political desire for real-world examples of the types of harm that will be addressed. We already know there will be difficulties identifying content that is covered by the Bill.

We think a focus on systems would be far more practical and useful to the service providers who will have to work out how to comply with this legislation. The larger platforms already have systems and processes around flagging or removing certain content. The Bill is a missed opportunity to provide some clarity and some consistency around what is required. These duties alone could form the basis for a Bill that would impose meaningful obligations on platforms and search engines around putting in place governance mechanisms to keep users safe.

New in-scope items

Late additions have been made to the Bill, adding to its unwieldiness. One of these is a requirement for age verification measures on pornography sites that are not user-to-user services. Something similar was previously attempted and, frankly, bungled, under the Digital Economy Act 2017. It is now just another far-from-straightforward task the Bill must accomplish, on top of everything else.

A further requirement has been imposed on providers of ‘Category 1’ services to provide adults with the option to verify their identities. Although the Bill does not specify how this verification process is intended to work or what type of information the service must collect, it does say that verification need not require documentation. Separate provisions require providers of Category 1 services to give adult users tools to manage the content they encounter and to avoid content presented by unverified users. These provisions are intended to address anonymous abuse online, but they will be challenging for service providers to implement in a way that does not create a barrier to some users getting online.
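A hedged sketch of how such a user empowerment tool might operate in practice is set out below. The Bill does not prescribe an implementation, so the field and preference names here are purely illustrative assumptions.

```python
def visible_posts(posts: list, prefs: dict) -> list:
    """Hedged sketch of a 'user empowerment' tool: an adult user who opts out
    of content from unverified users sees only posts from verified accounts.
    The field names are illustrative; the Bill does not prescribe how
    verification or filtering must be implemented."""
    if not prefs.get("hide_unverified_users", False):
        return posts
    return [p for p in posts if p.get("author_verified", False)]

# Illustrative use: a user who has switched the tool on
posts = [
    {"id": 1, "author_verified": True},
    {"id": 2, "author_verified": False},
]
print(visible_posts(posts, {"hide_unverified_users": True}))  # only post 1 remains
```

Even a filter this simple presupposes a working verification scheme behind the `author_verified` flag, which is where the implementation difficulty, and the potential barrier to access, really lies.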

Along similar lines, the Bill now includes provisions applicable to Category 1 services and large search services requiring their providers to address online advertising which facilitates fraud. 

Fraudulent advertising is a complex topic in its own right. Indeed, the government launched a consultation on online advertising on 9 March this year. The consultation document was updated on 17 March and mentions that the review “will work in conjunction with the measures being introduced through the forthcoming Online Safety Bill” which introduces “a standalone measure for in-scope services to tackle the urgent issue of fraudulent advertising”. The process of debating and implementing these provisions, however, might well have been more straightforward if done within the context of a consultation effort focused specifically on online advertising. 

Leaving difficult issues until later

Reading the Bill, there is a sense that the government has tried to tackle so much that it must inevitably postpone engaging with much of the detail until after the Bill becomes law. The government has acknowledged that the Bill is complicated, but has asserted that this does not mean it will be overly complicated for service providers to comply with, since Ofcom’s codes of practice and guidance will provide detail and clarity on how services can meet their legislative duties. That remains to be seen and, as things currently stand, providers are ill-equipped to prepare for the new obligations because the detail of the duties is not set out in the Bill itself.

The Bill also leaves the crucial area of setting out the threshold conditions for Category 1, Category 2A, and Category 2B service providers to secondary legislation. Many of the Bill’s duties apply based on which of these categories a service provider falls under. At this stage, service providers are therefore unable to get a complete picture of which duties will apply to them, as well as being in the dark about how they will be expected to comply if they are caught by the thresholds. 

Citizen journalism

The mainstream media have understandably lobbied hard for exceptions to the new regime and have largely been successful in doing so. This has, however, left a notable divide between the way the Bill treats what might be termed traditional journalism and the way it treats citizen journalism.

It is worth reflecting on just how much the internet has changed over the past 20 years. Web 2.0 did not exist in a mainstream sense in the early 2000s and yet the ability of everyone to participate in social sites/platforms, video sharing and the like is now a central part of how the internet is used. In the first Gulf War in the early 1990s, CNN reported live on US air-strikes on Baghdad from a hotel in the Iraqi capital. The US military effectively acknowledged that they were obtaining much of their immediate intelligence from the channel. Fast forward 30 years to the Russian invasion of Ukraine and much near real-time information is actually provided through citizen journalism on social media sites, not by the traditional media.

These changes come with significant challenges, from how to determine whether information is authentic to where to draw the line on posting graphic content. Yet the importance of public participation should not be underestimated, especially in an online world in which it has become difficult to work out who is genuinely offended and who is feigning offence. How the risk of over-blocking plays out will be of real democratic importance.

The risks to freedom of expression

Currently the Bill imposes duties on service providers to protect freedom of expression, but it is unclear how providers are required to balance this against the other duties set out in the Bill. The Bill’s European Convention on Human Rights Memorandum sets out general commentary on why interference with Article 10 rights is said to be justified under the Bill. 

The expansion in the harms covered, as discussed above, may lead to a trend towards providers removing content at scale out of concern for contravening their duties. The Bill does impose duties on all providers of user-to-user services to put in place complaint procedures which allow users to complain to the provider in relation to a decision to take down or restrict access to content. Similar complaints processes also apply to regulated search services. 

According to the Bill’s factsheet, “the largest social media platforms will no longer be able to arbitrarily remove harmful content”. Whether this is happening currently may be contested. In any event, smaller providers caught by the legislation with fewer resources to moderate content may be more likely to take a conservative approach to taking down content.

Next steps

The Bill has had its second reading in the Commons, with the Committee stage scheduled to conclude no later than the end of June. The report and third reading stage are then likely to be held together in the second week of July. Consideration by the Lords is unlikely to begin until after the summer recess, in September. Depending on the extent of any “ping-pong” between the Houses, the Bill could receive Royal Assent by February 2023. If important detail is to be left to secondary legislation and codes of practice, it may be a further year or more before enforcement activity can begin.

From a business perspective, numerous details need to be decided before we have an Act and accompanying secondary legislation, if applicable, that Ofcom can enforce. Parliament now has the opportunity to scrutinise this legislation and how it will work in practice. If the goal is to have an enforceable regime on a relatively quick timescale, legislators should look to simplify and to resist the temptation to respond to a difficult task with a complex solution.  

Co-written by Rosie Nance of Pinsent Masons.
