A regulator has outlined specific measures that online service providers can implement to meet their child safety obligations under the UK’s Online Safety Act (OSA), including “safer algorithms”, effective content moderation systems, and designating someone accountable for compliance with the children’s safety duties under the Act.
The measures are set out in new codes of practice published by Ofcom and include steps services in scope of the OSA can take to prevent children from encountering certain types of harmful content online.
The OSA provides for a higher level of protection online for children than adults. In-scope services are required to have completed a children’s access assessment to assess whether children are likely to access their service or part of it. As a first step, services should consider whether it is possible for children to access the service. In the absence of measures such as highly effective age assurance or other controls that will prevent children from accessing a service, services should consider whether they have or are likely to attract significant numbers of child users. Services were required to have completed children’s access assessments by 16 April 2025.
Ofcom has advised that most services not using highly effective age assurance are “likely to be accessed by children” and must therefore go on to carry out a children’s risk assessment. These assessments need to be completed by 24 July and are separate from the illegal content risk assessments that all services in scope of the OSA should have completed by 16 March. In carrying out the risk assessment, services should review the specific risk factors associated with particular service types and functionalities, along with more general risks such as the age profile of their user base and the business model and commercial profile of their service.
If a service is a user-to-user service that is likely to be accessed by children, proportionate measures will need to be taken to prevent children of any age from encountering ‘primary priority content’ such as pornography and content relating to suicide, self-harm and eating disorders. Providers also need to protect children in age groups judged to be at risk of harm from encountering other harmful content, classed as ‘priority content’ – including, but not limited to, bullying content and content which depicts serious violence or challenges and stunts.
Search services likely to be accessed by children face similar obligations. They must take proportionate measures to minimise the risk of children of any age encountering the search content most harmful to children – pornography and content relating to suicide, self-harm and eating disorders – as well as to minimise the risk of children in age groups judged to be at risk of harm encountering other harmful content, including bullying content and content which depicts serious violence or challenges and stunts.
Both types of services also have an obligation to address other types of content that present a material risk of significant harm to an appreciable number of children in the UK, which is referred to in the OSA as non-designated content.
The children’s safety codes that Ofcom has now issued, which are subject to parliamentary approval, set out the steps and measures providers of in-scope services can take to meet those obligations and to address the specific risks identified in their children’s risk assessment.
Adherence to the codes is not mandatory – providers of in-scope services can use alternative methods to meet their obligations under the OSA, but services will need to keep accurate records of the steps taken in order to evidence them.
Provided the codes obtain parliamentary approval, providers of in-scope services will either need to adhere to the codes or implement other effective compliance measures from 25 July 2025.
The safety measures set out in the codes are required to be proportionate. Factors relevant to the proportionality of measures include: the type of service – whether it is a user-to-user service or a search service; the outcome of the latest risk assessment and the risk of harm posed to children; the size of the service’s UK user base – Ofcom has said that ‘large’ services, which it considers to be those with more than seven million monthly UK users, approximately 10% of the UK population, are expected to go further than smaller services; and the functionalities and other characteristics of the service.
There are 40 safety measures set out in the codes for user-to-user and search services that can be implemented to mitigate the risks to children across broad areas. These include: carrying out robust age checks using recommended assurance measures; using safer algorithms and ensuring recommender systems are designed to protect children from harmful content; implementing effective content moderation practices at an appropriate scale; establishing strong governance and accountability; and providing more choice and support for children.
Some measures apply to all services, including having someone accountable for compliance with the children’s safety duties, making sure that reporting and complaints functions are easy to use, having content moderation systems and processes in place, and ensuring that terms of service or publicly available statements are clear and accessible.
Additional measures apply to services that pose significant risks to children – those rated as posing a medium or high risk to children based on the children’s risk assessment.
Services that do not comply with the Act risk fines of up to £18 million or 10% of the global revenue of the service, whichever is greater. Senior managers of services that fail to comply with duties under the Act also face potential criminal liability.
In addition to publishing its children’s safety codes, Ofcom has opened a consultation on proposed amendments to its illegal content codes of practice. Under those proposals, all providers of in-scope services under the OSA that seek to adhere to the illegal content codes would be expected either to use highly effective age assurance to offer child users the option to block and mute other users and disable comments on their content, or to offer these controls to all users on the parts of the service that are accessible to children.
Currently, the illegal content codes only impose such requirements on providers of large user-to-user services with the relevant risks and functionalities. Ofcom said, however, that it now considers it proportionate to extend the scope of those measures so that certain smaller services likely to be accessed by children apply them too, where they seek to rely on compliance with the codes to demonstrate compliance with the OSA’s illegal harms duties. Ofcom has requested stakeholder comments on the proposals by 22 July.