Time's up
“Ofcom is wasting no time in taking steps to enforce the Online Safety Act 2023, having sought copies of illegal content risk assessments using its enforcement powers even before yesterday’s deadline for completing them.
Now Ofcom is focusing its immediate efforts on the highest risk content, announcing investigations to ascertain compliance with the requirement to tackle child sexual abuse material (CSAM).”
The deadline has arrived for all regulated user-to-user services and regulated search services to have completed their illegal harms risk assessment, also known as an illegal content risk assessment, under the Online Safety Act 2023.
Regulated user-to-user services are services hosting user-generated content which have a significant number of United Kingdom users, for which UK users are the, or one of the, target markets, or which are capable of being used by UK users and pose a material risk of significant harm to them.
Regulated search services are search engines, or services incorporating search engines, which have a significant number of United Kingdom users, for which UK users are the, or one of the, target markets, or which are capable of being used by UK users and pose a material risk of significant harm to them.
Such regulated services are obliged by s.9 Online Safety Act 2023 and s.26 Online Safety Act 2023, respectively, to undertake an illegal content risk assessment and to maintain records in accordance with s.23 Online Safety Act 2023 and s.34 Online Safety Act 2023, respectively.
It would be wrong to think that only Big Tech organisations such as Meta (which provides Instagram and Facebook), Google, TikTok, X, Snap and Reddit are affected by these legal obligations. The government’s own impact assessment on the enactment of the Online Safety Act 2023 anticipated that some 21,500 small and medium sized businesses would fall within the scope of the OSA, and that compliance would cost each of them thousands of pounds to carry out their assessments and implement mitigations, such as user reporting functionality.
On 3 March 2025, even before yesterday’s deadline, the online safety regulator Ofcom announced that it had opened an enforcement programme in respect of illegal content risk assessments, issuing information notices pursuant to s.100 Online Safety Act 2023 to require “the largest social media platforms, as well as smaller but risky sites” to submit their illegal content risk assessments to Ofcom by 31 March 2025. The programme is intended to “monitor whether providers are complying with their illegal content risk assessment duties and record keeping duties under the Online Safety Act”, with the main objectives being to “monitor compliance with the relevant duties in the Act, to monitor how our risk assessment guidance and record keeping guidance are being applied by industry and support the adoption of best practice”.
Now that the deadline to complete the illegal content risk assessment has passed, and platforms are expected to have identified and assessed the risks of illegal harms and identified appropriate mitigations, in-scope platforms are obliged to start implementing those mitigations to prevent illegal content from appearing on their platforms, and to act quickly to remove it when it slips through the net, in accordance with the illegal content safety duties under s.10 Online Safety Act 2023 in respect of regulated user-to-user services and s.27 Online Safety Act 2023 in respect of regulated search services.
Ofcom has today (17 March 2025) launched a further enforcement programme into the measures being taken, or due to be taken, by file-sharing and file-storage services to prevent users from encountering or sharing child sexual abuse material (CSAM). The programme will “assess the measures being taken by providers of file-sharing and file-storage services that present particular risks of harm to UK users from image-based CSAM to ensure users do not encounter, and offenders are not able to disseminate, such content on their services”. Ofcom states that it has given several services prior notification of its intention to issue information notices requiring them to provide copies of their illegal content risk assessments, as well as details of the measures they already have in place or plan to put in place. Failure to comply, or any concerns arising from the responses, could lead to investigations being opened in respect of individual services.
We anticipate that this will prove to be merely the first of several investigations that Ofcom will undertake to assess compliance, and that it will want to be seen to act robustly against large or high-risk services that it perceives to be failing to comply or not taking their obligations seriously.
At Handley Gill, we have significant experience advising regulated entities on responding to regulatory enquiries and in successfully representing organisations under investigation by regulators, including Ofcom. If we can support you and your organisation, contact us and book a free initial consultation.
Should you require support in understanding whether the Online Safety Act 2023 applies to you, in conducting an illegal harms risk assessment, a children’s access assessment, or in implementing safety measures to comply with your safety duties, contact us.
Find out more about our Online Safety, Online Harms, Content Moderation and Content Regulation services.
Our Online Safety & Online Harms Resources page has links to relevant documents regarding the passage and implementation of the Online Safety Act 2023.