Out-Law News 4 min. read
10 Apr 2025, 11:39 am
Technology companies and other online platforms operating in the UK still have time to respond to Ofcom’s consultation on its proposed guidance on better protecting women and girls online, an expert has said.
The draft guidance (60 pages/1.1 MB), published on 25 February under the Online Safety Act (OSA), sets out ambitious but practical measures to be implemented to improve the safety of women and girls online. It has been issued at a time of increasing concerns about and media coverage of the prevalence of misogynistic content online. The guidance is intended to supplement the existing safety duties on service providers in scope of the OSA to provide a holistic approach to addressing the harms to women and girls online.
The guidance recognises both that women and girls face unique risks online and that they are disproportionately affected by certain types of online harm. It also explains that the spread of harmful gender-based content online can normalise harmful gender dynamics, and in acute cases radicalise boys and men.
The guidance focuses on four major issues that impact the experiences of women and girls online: online misogyny; ‘pile-ons’ – where coordinated groups target specific individuals or groups – and online harassment; online domestic abuse; and intimate image abuse. It also outlines nine action points for online service providers to use as a framework when considering how to improve online safety for women and girls, including measures to take responsibility for online gender-based harms, to take action to minimise the risk of such harm before it occurs, and to better support women and girls by addressing online gender-based harms when they do occur.
The recently published guidance draws on industry research to help explain the risks and harms that women and girls currently face on online services. For example, research found that young people searching for friends, advice or shared groups are served increasingly misogynistic content through their recommender feeds. As a case study, the guidance highlights an Open University project aimed at detecting ‘misogynoir’ – hateful content directed at Black women and girls. However, the guidance cautions that such techniques will only succeed if they account for intersectional hate: safety measures that treat different kinds of abuse, such as racism and misogyny, in isolation will fail to do so.
Additionally, a 2020 report by Plan International found that 58% of girls and young women surveyed had personally experienced some form of online harassment. The guidance confirms that harassment is often highly personal and can involve patterns of behaviour aimed at isolating the survivor or victim. Ofcom suggests that providers should offer a wider range of reporting options, and that reporting systems must be updated to reflect any changes providers make to their services and cover all types of content and interaction supported on a service.
For each of the nine action points, Ofcom has recommended practical steps that online service providers can take in response which are categorised as either ‘foundational’ or ‘good practice’ steps.
‘Foundational steps’ represent the minimum standards online service providers are expected to meet in order to comply with their duties to protect users under the Online Safety Act. These are not new requirements – the ‘foundational steps’ refer to a range of expectations Ofcom has already set out for service providers in its Illegal Content and Protection of Children Codes of Practice and related guidance. However, this guidance sets out how online service providers can fulfil those requirements in ways that address specific but prevalent online gender-based harms.
The ‘foundational steps’ apply to different services based on functionality, risk, and size. They include appointing an ‘accountable individual’ for compliance with online safety duties, preparing a written statement of responsibilities for senior managers who make decisions related to the management of online safety risk, maintaining an internal monitoring and assurance function, and having a code of conduct that sets standards and expectations for individuals working for the provider.
The draft guidance also proposes ‘good practice steps’. These are classed as additional steps that can be taken but are ‘less commonly used’. However, Ofcom says that “we do not expect all service providers to need to – or be able to – implement all of the foundational steps or good practice steps we have set out under each action” but that it strongly encourages “providers to implement relevant good practice steps in addition to taking the action required to meet their enforceable duties”. Good practice recommendations include prompting users before they post potentially harmful content, enhancing visibility settings so users can retrospectively make content private, and removing geolocation data by default.
Examples of changes set out in the guidance that providers can make include better moderation practices and enhanced reporting mechanisms.
Sadie Welch, technology law expert at Pinsent Masons, said: “Ofcom has explained that it considers it core to women and girls’ safety online that they have greater control over who contacts them, what they see, and the information about them that is visible to others. One of the foundational steps to support this is the implementation of user-friendly reporting tools which allow users to easily report harmful content. The tools should be accessible, intuitive and provide clear instructions on how to report different types of abuse, such as image-based sexual abuse. There is also a suggestion that real-time support should be implemented, including automated detection techniques and automated content moderation. However, the guidance outlines that it is important to ensure that the systems are accurate, effective, contextually nuanced and that they minimise bias whilst continuously improving.”
Online services in scope of the Online Safety Act were required to have completed their Illegal Content Risk Assessment and Children’s Access Assessment by March 2025 and may have done so without fully engaging with the content set out in this draft guidance. However, Ofcom is clear that it sees the application of the actions outlined above as an ongoing exercise through which providers can continually assess and improve the experiences of women and girls on their service.
Responses to the consultation must be submitted by 5pm on 23 May. Following the consultation, Ofcom intends to publish the final guidance by the end of the year. Additionally, Ofcom stated that it will publish an assessment of what technology companies have implemented to protect women and girls online around 18 months after finalising the guidance.