Mum's the Word!
“With just over a month left for in-scope services to carry out illegal content risk assessments by the 16 March 2025 deadline, after which they will be required to start implementing measures to address those risks and comply with their illegal harms safety duties, the targeting of the Mumsnet forum with illegal content demonstrates the importance of sites and apps identifying their vulnerabilities and ensuring they have compliance measures in place. Egregious breaches face fines of up to the greater of £18m or 10% of qualifying worldwide revenue.”
The UK online safety regulator, Ofcom, published the first edition of its Illegal Harms Code under the Online Safety Act 2023 (‘OSA 2023’) on 16 December 2024. Every site and app offering user-to-user or search services falling within the scope of the OSA 2023 (such as social media firms, search engines, messaging, forums, gaming and dating apps, and pornography and file-sharing sites) therefore has until Sunday 16 March 2025 to conduct an illegal harms risk assessment, also known as an illegal content risk assessment.
Regulated user-to-user services are required by section 9(2) OSA 2023 to “carry out a suitable and sufficient illegal content risk assessment” within 3 months of the publication of Ofcom’s guidance, and the same obligation is imposed on regulated search services by section 26(2) OSA 2023.
An illegal content risk assessment, or illegal harms risk assessment, is an assessment of the risk posed to users of your service by content relating to the one hundred and sixty-eight (168) priority offences identified in Schedules 5, 6 and 7 OSA 2023. Ofcom has collated these offences into seventeen (17) categories of illegal harms: terrorism; harassment, stalking, threats and abuse offences; coercive and controlling behaviour; hate offences; intimate image abuse; extreme pornography; child sexual exploitation and abuse (CSEA); sexual exploitation of adults; unlawful immigration; human trafficking; fraud and financial offences; proceeds of crime; assisting or encouraging suicide; drugs and psychoactive substances; weapons offences (knives, firearms and other weapons); foreign interference; and animal welfare. In-scope user-to-user services are required to consider the risk of illegal content being present on or disseminated via their service, and the risk of their service being used for the commission or facilitation of a priority offence. In-scope search services, by contrast, are required to consider the risk of individuals encountering illegal content in their search content, i.e. in or via search results.
Once they have conducted their illegal harms risk assessment, services will be required to comply with the illegal content safety duties, having regard to the risks identified in that assessment.
In a worrying portent that should act as a catalyst for in-scope services to start addressing the risk of illegal harms on their platforms, BBC News has reported that Mumsnet was targeted with the posting of "Several sets" of child abuse images between 23:00 GMT on Sunday 02 February 2025 and 03:00 on Monday 03 February 2025. Mumsnet describes itself as the UK’s most popular site for parents, offering a network with 9 million visitors per month and 100 million page views. It was reported that Mumsnet uses overseas volunteer moderators outside its usual UK business office hours. Mumsnet Limited’s founder and director Justine Roberts was reported as having informed the BBC that “Most of the images were removed within an hour of being posted and all were taken down by 04:00 on Monday”.
Among the actions it took in response to the incident, Mumsnet reported the matter to the Metropolitan Police Service, temporarily suspended its photo upload functionality, and stated that it plans to implement artificial intelligence (AI) filters to pre-moderate content by flagging "illegal" and "disturbing" images before they appear.
The obligation on services to comply with their illegal content safety duties will apply from 17 March 2025. Ofcom has, however, indicated that it intends to afford in-scope services a grace period of six (6) months during which it will exercise “forbearance”, provided services are acting responsibly and taking steps toward compliance, while reserving the right to enforce in the case of “egregious” breaches of the OSA 2023.
Ofcom’s guidance indicates that an illegal harms risk assessment requires the service to consider both the likelihood and the impact of illegal content being encountered, and that in determining likelihood regard can be had to previous evidence of harm occurring, based on user complaints and reports and other relevant evidence.
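For illustration only, the sketch below shows one way a service might combine likelihood and impact ratings into an overall risk level. The four-point scales and thresholds are hypothetical assumptions made for the purpose of the example, not figures drawn from Ofcom’s guidance.

```python
# Hypothetical likelihood x impact scoring sketch. The scales and the
# thresholds below are illustrative assumptions, not Ofcom's methodology.

LIKELIHOOD = {"rare": 1, "possible": 2, "likely": 3, "almost_certain": 4}
IMPACT = {"minimal": 1, "moderate": 2, "significant": 3, "severe": 4}

def risk_level(likelihood: str, impact: str) -> str:
    """Combine a likelihood rating and an impact rating into a risk band."""
    score = LIKELIHOOD[likelihood] * IMPACT[impact]
    if score >= 12:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

# Example: a forum that has never seen CSAM (lower likelihood) but whose
# image-sharing functionality could enable severe harm.
print(risk_level("possible", "severe"))  # -> "medium"
```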
The Mumsnet incident raises questions about the efficacy of the online safety regime in light of Ofcom’s guidance and, we anticipate, could well lead to a more stringent approach in future iterations of the Illegal Harms Code.
At its recent ‘The Online Safety Act Explained: How to Comply’ event, held from 03 to 05 February 2025, Ofcom gave a hypothetical example of a website that it considered could legitimately determine it was at low risk of CSAM URLs and image-based CSAM.
Ofcom described a health charity website that includes a forum, dedicated to a particular condition, with 10,000 monthly UK users, which allows users to share their experiences by posting images, text and hyperlinks and responding to other users, and which operates post-moderation review and a complaints system. Ofcom indicated that, while the service would have some risk factors for CSAM URLs and image-based CSAM, if it had never previously encountered such content it could conclude that it was at low risk for that content.
Based on Ofcom’s guidance, it would appear that a forum such as Mumsnet, which, depending on its number of UK users, could be considered a large service (Ofcom has set a threshold of an average user base of greater than 7 million monthly active UK users), could (certainly before last weekend) have considered itself to be low risk. The consequence would be that it was only required to have “foundational protections for users in place”, and would not be required to implement automated content moderation, including hash matching to detect and remove CSAM and the detection and removal of content matching listed CSAM URLs.
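To illustrate the kind of measure involved, the sketch below shows the principle behind hash matching, assuming a hypothetical list of hashes of known child sexual abuse images. Real deployments use perceptual hashes (such as Microsoft’s PhotoDNA) supplied by designated bodies like the Internet Watch Foundation, which survive resizing and re-encoding; the simple cryptographic hash used here is for illustration only.

```python
import hashlib

# Hypothetical list of hashes of known illegal images. In practice these
# lists are supplied by designated bodies (e.g. the Internet Watch
# Foundation) and use perceptual hashing rather than SHA-256.
KNOWN_HASHES: set[str] = {
    "0" * 64,  # placeholder entry, not a real hash
}

def matches_known_content(image_bytes: bytes) -> bool:
    """Return True if the uploaded image's hash appears on the list."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

def handle_upload(image_bytes: bytes) -> str:
    """Pre-publication check: block and escalate matches before display."""
    if matches_known_content(image_bytes):
        return "blocked_and_escalated"
    return "published"
```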
While it appears Mumsnet is now proposing to take steps that go beyond the legal requirements applying from 17 March, the incident demonstrates the danger of placing too much weight, when conducting risk assessments, on previous experience of harm as opposed to the potential of a service’s functionality to enable harm, and the impact this has on the efficacy of the online safety regime in preventing users from encountering such content.
Should you require support in understanding whether the Online Safety Act 2023 applies to you, in conducting an illegal harms risk assessment or a children’s access assessment, or in implementing safety measures to comply with your safety duties, contact us.
Find out more about our Online Safety, Online Harms, Content Moderation and Content Regulation services.
Our Online Safety & Online Harms Resources page has links to relevant documents regarding the passage and implementation of the Online Safety Act 2023.