Out-Law News 4 min. read
17 Dec 2024, 8:54 am
The outlining of new measures to address ‘illegal harms’ “fires the starting gun” for service providers’ compliance with the UK’s Online Safety Act (OSA), experts have said.
Meghan Higgins and Lottie Peach of Pinsent Masons were commenting after new guidance and codes of practice on illegal content were issued by Ofcom on Monday. The guidance advises providers of ‘user-to-user’ (U2U) services and search services on how to assess the risks of harm arising from illegal content or illegal activity on their services. The codes, which are subject to parliamentary approval, provide services with practical recommendations on how to manage and mitigate the risks of illegal harms.
The recommendations in the codes fall under broad categories. For U2U services, those categories are: governance and accountability; content moderation; reporting and complaints; recommender systems; settings, functionalities and user support; terms of service; user access; and user controls. A similar set of categories applies under the illegal content code for search services.
Providers have until 16 March 2025 to complete their illegal harms risk assessments under the OSA and, assuming the codes complete the parliamentary process, until 17 March 2025 to implement the recommended measures for their services in the codes – or use other effective measures to protect users. It is open to providers to adopt different measures from those Ofcom recommends, though they must explain how their chosen approach fulfils the duties outlined in the Act. Providers that implement the recommendations in the codes will automatically be deemed to have complied with the legislation.
Duties around illegal harms form just part of the OSA regime. The Act requires providers of U2U services, search services, and services on which pornographic content is published or displayed to remove illegal content; requires in-scope services that are likely to be accessed by children to remove content that is legal but harmful to children; and, for certain types of “priority” illegal content and content that is harmful to children, obliges services to proactively monitor their platforms and remove that content before users encounter it.
The greatest obligations will fall on certain high-risk and high-reach services, under a new categorisation system for regulated services. The draft categorisation regulations have been laid before parliament and it is expected that Ofcom will release its final categorisation advice to the Secretary of State imminently. Ofcom, the regulator under the OSA, has previously estimated that at least 100,000 online services, based in the UK and overseas, will be in scope of the OSA.
The illegal harms duties are the first area of the Act that Ofcom has focused its attention on. It consulted on draft illegal content codes last winter and their publication now in ‘final’ form has been accompanied by a raft of related guidance and other papers. These are designed, among other things, to help service providers better understand the codes, how they relate to the duties they face under the OSA, and Ofcom’s intended approach to enforcement.
Meghan Higgins said: “There is a considerable amount of detail for in-scope service providers to have to digest and only a short three-month window for them to do this and then complete their illegal harms risk assessments under the OSA and take steps to address the risks identified. This is a tall order even for the largest online service providers in-scope of the OSA, let alone smaller providers that lack well-resourced compliance teams. Ofcom intends to make a new digital tool available to providers in early 2025 to help them comply with their illegal content duties.”
While finalised codes of practice and guidance relevant to the protection of children are expected to be issued in April 2025, as part of ‘phase two’ of Ofcom’s OSA implementation roadmap, the new illegal content codes already include measures to tackle online grooming.
Ofcom said: “These will mean that, by default, children’s profiles and locations – as well as friends and connections – will not be visible to other users, and non-connected accounts cannot send them direct messages. Children should also receive information to help them make informed decisions around the risks of sharing personal information, and they should not appear in lists of people users might wish to add to their network. This will make it harder for perpetrators of grooming activity to identify and contact vulnerable children.”
“Our codes set an expectation that high-risk providers use an automated tool called hash matching to detect child sexual abuse material (CSAM). This will help prevent the circulation of this damaging material, disrupting offenders, and flagging to services to report these offences. In response to feedback on our consultation, we have expanded the scope of our CSAM hash matching measure to capture smaller file hosting and file storage services. These services are at particularly high risk of being used to distribute CSAM,” it added.
Ofcom has said it will be “proactively driving compliance” with the OSA’s duties on illegal harms in early 2025, including through “supervisory engagement with the largest and riskiest providers to ensure they understand our expectations and come into compliance quickly, pushing for improvements where needed; gathering and analysing the risk assessments of the largest and riskiest providers so we can consider whether they are identifying and mitigating illegal harms risks effectively; [and] monitoring compliance and taking enforcement action across the sector if providers fail to complete their illegal harms risk assessment by 16 March 2025”.
Lottie Peach of Pinsent Masons said: “While the codes are not binding on providers, their publication represents the firing of the starting gun for Ofcom’s oversight of OSA compliance. Providers must take their duties seriously – they could be liable for heavy fines or criminal sanction if they do not.”
The OSA provides for significant powers of enforcement. These include the power for Ofcom to issue fines of up to £18 million, or 10% of a company’s annual global revenue, whichever is higher. Failure to meet certain child protection duties could also give rise to criminal liability for senior managers – including a risk of imprisonment of up to two years.