Child Development
“The processing of children’s personal data continues to be an area of strategic focus for the Information Commissioner, and is a rare area in which the ICO has proactively investigated and taken enforcement action. Despite this, the ICO’s own enquiries reveal that non-compliance with its Age Appropriate Design Code, also known as the Children’s Code, remains widespread among popular social media platforms, raising questions about the efficacy of the ICO’s approach to enforcement, whereby it grants multiple opportunities to address its concerns.
As Ofcom begins its enforcement of the Online Safety Act 2023, and the regulators continue to co-operate, we anticipate further investigations and enforcement action in this area, particularly in relation to age assurance, and an expansion of the ICO’s focus to incorporate AI tools.”
The statutory code of practice on the processing of children’s personal data in the context of information society services under section 123 of the Data Protection Act 2018, the ‘Age Appropriate Design Code’, also commonly referred to as the ‘Children’s Code’, was issued on 12 August 2020, came into force on 2 September 2020 and required controllers in scope to comply by 2 September 2021. The Code establishes 15 standards with which providers of information society services that are based in the UK, or which offer services to the UK, and which are likely to be accessed by children must comply, some of which go beyond the requirements of the UK GDPR and the Data Protection Act 2018. In particular, the Code incorporates the obligation under Article 3(1) of the United Nations Convention on the Rights of the Child that the best interests of the child shall be a (but not the) primary consideration.
Even before the Code came into force, the ICO trumpeted its success in securing changes to the provision of online services (albeit that some of these successes related to online safety rather than data protection), including:
the introduction of age gating by Meta based on self-declaration, parental supervision tools and limiting the personal data used for the purposes of targeted advertising in relation to under 18s to their age and location;
Google turning off autoplay on YouTube and introducing break and bedtime reminders by default, as well as disabling location history on Google accounts for under 18s, expanding safeguards on age-restricted ads and enabling under 18s or their parents to request the removal of their images from Google search results in accordance with the Article 17 UK GDPR right to erasure.
The Children’s Code warned that “If you do not follow this code, you may find it difficult to demonstrate that your processing is fair and complies with the GDPR or PECR” and that it would “target our most significant powers, focusing on organisations and individuals suspected of repeated or wilful misconduct or serious failure to comply with the law”.
In April 2024, the ICO published its Children’s Code Strategy, reporting that “In the two years since the Children’s code was introduced, we have empowered many organisations through advice and guidance” and that it had “delivered impactful interventions to drive compliance”, including auditing 11 of the world’s largest games and social media platforms, assessing the conformity of 44 organisations, investigating and issuing a Preliminary Enforcement Notice against Snap, Inc.’s ‘My AI’ generative AI chatbot (albeit that investigation concluded with the ICO issuing a notice detailing its findings of non-infringement of Article 35 and Article 36 UK GDPR after Snap carried out an updated data protection impact assessment (DPIA)), and imposing a £12.7m monetary penalty notice on TikTok in respect of breaches of Articles 5, 8, 12 and 13 UK GDPR in relation to its processing of the personal data of under 13s. The ICO committed to focusing on four priority areas throughout 2024-25 for the purposes of evidence gathering, engagement, supervision and enforcement, specifically: default privacy and geolocation settings; profiling children for targeted advertising; the use of children’s information in recommender systems; and the use of information pertaining to children under 13 years of age.
In August 2024, the ICO issued a progress update on its Children’s Code Strategy, stating that it had carried out a review of a sample of 34 social media and video sharing platforms (although it was only able to complete its observations for 30 due to account creation requirements). This found non-compliance by at least some platforms in every priority area identified in the strategy. Particular concerns included: children’s profiles being made public by default; children being able to receive messages and friend requests from strangers by default; nudge techniques and other functionality which encouraged children to share geolocation; the granularity of geolocation data; the inability to switch off geolocation sharing; excessive data collection for the purposes of profiling and targeted advertising, and the inability to control advertising preferences; privacy notices that were unclear as to what personal data would be used for recommender systems and how; and the use of self-declaration as a method of age assurance, despite the ICO having identified in its Opinion on Age Assurance that this was unlikely to be a suitable method of confirming that children were aged 13 or over where processing posed high risks, including profiling, tracking or targeting for marketing.
As a consequence, the ICO noted that it:
had written to 5 platforms in relation to default privacy settings, putting them “on notice that, if they do not bring themselves into compliance or demonstrate a compelling reason for their current approach within eight weeks, we will conclude investigations with a view to formal enforcement action, where appropriate” and that it planned “to name the organisations we have written to, both platforms that improve their practices and those in respect of which we will conclude investigations with a view to formal enforcement action where appropriate” (they were subsequently identified as Dailymotion, Twitch, Frog, Discord and SendIt);
was engaging with several other platforms calling on them to improve their practices in relation to default privacy settings;
was writing to 4 platforms in relation to geolocation settings (later updated to 5: BeReal, SendIt, Soda, Frog and Vero);
had established a “programme of active engagement” with platforms where it had identified potential concerns in relation to profiling and targeted advertising, “verifying their approach” and indicating that it “will follow up to secure improvements”;
was issuing a call for evidence in relation to the use of children’s personal data in social media and video sharing platform recommendation systems and the use of age assurance by those platforms to identify under 13s;
was engaging with social media and video sharing platforms to “better understand” their approach to the use of children’s personal data in recommendation systems;
was writing to 4 platforms to “clarify the lawful bases they rely on to process children’s personal information and their approach to age assurance” (later identified as Vero, Imgur, Fruitlab and Vimeo);
was “considering the practices” of platforms relying on consent to process personal data of non-logged in users.
In September 2024, the ICO joined 10 other global data protection and privacy regulators in signing a Joint Statement on a Common International Approach to Age Assurance, which reiterates the application of several principles and requirements of data protection law in the context of age assurance but builds on these by importing concepts of international human rights law and online safety, including at principle 3, which states that “Any age assurance implemented should be in the best interests of the child, while guaranteeing all users' fundamental right to access information from the internet”, and at principle 7, which provides that “Providers should balance the data protection risks posed by the age assurance method(s) implemented against the best interests of the child, including their rights to safely access diverse information online while being protected from harmful material”. The Joint Statement also lends international recognition to the Information Commissioner’s position that “Self-declaration alone should be used only in situations where there is little to no data protection risk to children”.
In what appears to be the conclusion of its 2024-25 Children’s Code Strategy, in March 2025 the ICO published its most recent Children’s Code progress update, announcing that it had opened 3 investigations and had secured several practice improvements from platforms:
TikTok is once again under investigation, this time in relation to how it uses the personal information of 13-17 year olds to make recommendations to them and deliver suggested content to their feeds (TikTok continues to appeal the monetary penalty notice imposed by the Information Commissioner in April 2023 in relation to its processing of the personal data of under 13s);
Imgur is under investigation in relation to how it assessed the age of child users;
Reddit is under investigation in relation to how it assessed the age of child users;
X, formerly Twitter, stopped serving adverts to users under 18, removed the ability for under 18s to opt in to geolocation sharing, improved the public transparency materials available for under 18s and created a dedicated help centre for child users and parents;
Sendit stopped automatically including geolocation information in children’s profiles;
BeReal stopped allowing children to post their precise location online;
Dailymotion implemented new privacy and transparency measures encouraging children not to share personal information;
Viber committed to turn off personalised advertising for children, ensuring that children’s default advertising experience is not based on their behavioural data or profiles.
The ICO also announced it was writing to platforms to better understand their approach to age assurance where this combined self-declaration and profiling. This coincides with the imminent deadline for children’s access assessments to be completed under the Online Safety Act 2023, which will be enforced by Ofcom.
The ICO noted that its priority would be “Driving improvements” and that it would “focus on services where the risks to children are likely to be higher”. We anticipate that the ICO will ramp up its regulatory co-operation with Ofcom and that the scope of its efforts is likely to expand beyond social media companies to AI tools.
At Handley Gill, we support organisations to comply with the ICO’s Age Appropriate Design Code / Children’s Code standards, implement appropriate age assurance and draft compliant and accessible privacy notices. We also have significant experience successfully representing organisations under investigation by regulators including the Information Commissioner and Ofcom. If we can support you and your organisation, contact us and book a free initial consultation.
Find out more about our data protection and data privacy services.
Find out more about our Online Safety, Online Harms, Content Moderation and Content Regulation services.