Out-Law News

Online Safety Bill: ‘legal but harmful’ content emphasis shifts to Ts and Cs

Amendments made to the UK Online Safety Bill in relation to so-called ‘legal but harmful’ content place a greater emphasis on both the contents and enforcement of online platforms’ terms and conditions.

In recent days, the UK government has published a raft of amendments to the Bill, which is currently before parliament. Arguably the most significant changes relate to proposed duties on the largest service providers around legal but harmful content.

Under the previous version of the Bill, providers of regulated user-to-user services or search services would have been subject to overarching duties to address both illegal content and other harmful content on their platforms. Under the latest amendments, however, the Bill will no longer impose duties on the largest services in relation to content that is not illegal but could be harmful to adults. The government has said the change was necessary to protect free speech and to prevent overly zealous moderation of “legitimate posts” by platforms seeking to avoid sanctions – the Bill makes provision for substantial fines to be issued for non-compliance.

Included in the amendments, however, are replacement provisions addressing legal but harmful content that Michelle Donelan, secretary of state for digital, culture, media and sport, described as a “triple shield”. Three rules make up that shield.

The first is the duty, retained under the Bill, for service providers to remove illegal content from their platforms.

The second is the principle, as Donelan put it, that “legal content that a platform prohibits in its own terms of service should be removed, and legal content that a platform allows in its terms of service should not be removed”.

The third concerns user empowerment, placing an onus on service providers to ensure users have the tools to decide what content they engage with.

Donelan said: “Adults should be empowered to choose whether or not to engage with legal forms of abuse and hatred if the platform they are using allows such content. So the ‘third shield’ puts a duty on platforms to provide their users with the functionality to control their exposure to unsolicited content that falls into this category. These functions will, under no circumstances, limit discussion, robust debate or support groups’ ability to speak about any of these issues freely.”

“The user empowerment tools will allow adults to reduce the likelihood that they will see certain categories of content if they so choose. The duty will specify legal content related to suicide, content promoting self-harm and eating disorders, and content that is abusive or incites hate on the basis of race, ethnicity, religion, disability, sex, gender reassignment, or sexual orientation. This is a targeted approach that reflects areas where we know adult users, in particular vulnerable users, would benefit from having greater choice over how they interact with these kinds of content,” she said.

Technology law expert Meghan Higgins of Pinsent Masons said: “In relation to illegal content, the government has announced that it is introducing new offences aimed at online content such as creating an offence of sending a communication that encourages self-harm, new offences relating to sharing an intimate image without consent, and epilepsy trolling. We don’t know yet if there are further offences to be introduced that could address other classes of content. This could to a certain extent address some of the concerns about the removal of legal but harmful content.”

“The government has not made any announcements that suggest the duties in respect of harmful content where a service is likely to be accessed by young people will change,” she said.

Lottie Peach, also of Pinsent Masons, added that she expects the government and Ofcom, which will be responsible for regulating under the UK’s online safety regime when it takes effect, to play an important role in due course in helping service providers understand exactly what their obligations are under the new requirements – in particular the duty to implement user empowerment tools.

Other amendments introduced would empower Ofcom to take enforcement action against individual directors, managers and officers at service providers that are subject to the online safety regime, as well as the providers themselves.

New minimum age requirements, and associated age verification obligations, are also expected to be mandated for providers’ terms of service.

Out-Law understands that MPs will return to scrutinising the Online Safety Bill next week. The Bill is expected to pass from the House of Commons to the House of Lords for further scrutiny either later this month or in early 2023.
