Under the draft Online Safety Bill, some online service providers – those providing regulated user-to-user services or search services – would be subject to a series of new duties, including “safety duties” that aim to minimise the presence of material such as terrorist content and child sexual exploitation and abuse content on their platforms, as well as the time it takes for such content to be removed. Specific duties within the Bill are also aimed at addressing the risk of children accessing harmful content.
Certain types of content would be exempt from the proposed new laws under the government’s plans. These include emails, text messages, one-to-one live aural communications, paid-for adverts, user reviews of products, and content from recognised news publishers.
Among the other duties providers would face if the Bill is enacted in its current form are duties to respect free speech and privacy rights, as well as duties designed to protect journalistic content and content of “democratic importance”.
To help providers meet their prospective new duties, the Bill makes provision for the creation of new statutory codes of practice. Those codes would be drafted by the UK’s media and communications regulator Ofcom, though they would be subject to the approval of government ministers and the UK parliament. According to the Bill, however, there would still be scope for providers to comply with their duties under the legislation without necessarily following the steps recommended in the Ofcom codes.
If the Bill is passed as drafted, Ofcom would gain a raft of other new powers, including the right to issue a “technology warning notice” to providers where it suspects they are failing to comply with their legal duties. Under those notices, the regulator would be able to order providers to use “accredited technology” to identify terrorist content or child sexual exploitation and abuse content and “swiftly” take it down or, in the case of search services, remove the material from their search results. Where that technology is already being deployed, Ofcom would be able to order the provider to take specific steps in using it so that it is implemented “more effectively”.
Providers that fail to meet their duties under the Bill would be subject to potential enforcement action. Ofcom would have the power to impose fines of up to £18 million or 10% of a provider’s annual global revenue, whichever is higher.
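For illustration, the maximum fine works out as the greater of the fixed £18 million cap and 10% of annual global revenue. A minimal sketch of that arithmetic follows; the function name and the revenue figure used in the example are purely illustrative and not drawn from the Bill itself.

```python
def max_fine_gbp(annual_global_revenue_gbp: float) -> float:
    """Illustrative only: the greater of the £18m fixed cap
    or 10% of a provider's annual global revenue."""
    FIXED_CAP_GBP = 18_000_000   # fixed £18 million cap
    REVENUE_SHARE = 0.10         # 10% of annual global revenue
    return max(FIXED_CAP_GBP, REVENUE_SHARE * annual_global_revenue_gbp)

# Example: a hypothetical provider with £500m annual global revenue
print(max_fine_gbp(500_000_000))  # 50000000.0, i.e. £50 million
```

In this example the revenue-based figure exceeds the fixed cap, so the 10% calculation determines the maximum fine; for smaller providers, the £18 million cap would apply instead.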
The regulator would also be able to seek court orders to disrupt the activities of non-compliant providers or prevent access to their services altogether where it is deemed there is a risk of significant harm to individuals in the UK.
The draft Bill also provides the government with deferred powers to create a new criminal offence for senior managers, which it said it could bring into force if it decides that action is necessary to drive compliance.
The government has estimated that the cost of compliance to businesses under the proposed Bill would be almost £1.7 billion over a 10-year period. Additional content moderation costs alone would exceed £1.2 billion over that period, it said.
The government said that the draft Bill would be subject to pre-legislative scrutiny by a joint committee of MPs during the current parliamentary session, before a final version of the legislation is introduced to parliament.
Technology law expert David Barker of Pinsent Masons, the law firm behind Out-Law, said: “The Bill is a far-reaching attempt to regulate numerous aspects of online activity. It will need to be scrutinised closely over the coming months to test whether the new regulatory approach will be workable in practice”.
The publication of the Online Safety Bill comes after the government consulted on proposals for how online intermediaries might address what the government previously termed “online harms”. In February 2020, the government published an initial response to the feedback it had received, and late last year published its full consultation response.
The European Commission last December set out its own plans for a revised framework for the removal of illegal content as part of a proposed new EU Digital Services Act.