LEGAL, REGULATORY & COMPLIANCE CONSULTANTS

Handley Gill Limited

Our expert consultants at Handley Gill share their knowledge and advice on emerging data protection, privacy, content regulation, reputation management, cyber security, and information access issues in our blog.

Not so instant compliance

Meta’s proposals to improve children’s privacy and safety when using Instagram are to be welcomed, but we shouldn’t be fooled into thinking that the introduction of Instagram Teen Accounts has been driven by the Information Commissioner’s Children’s Code or the threat of its enforcement.
— Handley Gill Limited

02 September 2021: The Information Commissioner’s Office’s Age Appropriate Design Code, now more commonly referred to as the Children’s Code, takes effect (having been issued in August 2020). The Code requires online services to “follow a set of standards when using children’s data”, including that information society services likely to be accessed by children, such as social media sites, must make the “best interests of the child” a primary consideration, “Take a risk-based approach to recognising the age of individual users and ensure you effectively apply the standards in this code to child users”, impose “‘high privacy’ by default” and provide appropriate transparency in relation to any parental controls and monitoring.

September 2024: Meta announces that it will introduce Teen Accounts on Instagram for users in the US, UK, Canada and Australia initially, with Teen Account protections “turned on automatically”, including “default private accounts”, “messaging restrictions”, limits on interactions, “sensitive content restrictions”, “time limit reminders” every hour and sleep mode enabled, and with under 16s requiring parental consent to diminish or remove those protections.

Despite the ICO’s warning in the Children’s Code that “If you do not follow this code, you may find it difficult to demonstrate that your processing is fair and complies with the GDPR or PECR” and that it would “target our most significant powers, focusing on organisations and individuals suspected of repeated or wilful misconduct or serious failure to comply with the law”, Instagram’s operator Meta has not faced any public enforcement action or criticism in the three-year period since the Children’s Code came into force. Indeed, the ICO published a statement in response to Meta’s announcement that “We welcome Instagram’s new protections for its younger users following our engagement with them”, apparently seeking to take credit for playing a role in bringing about these changes, but without raising any concern regarding the three-year delay in compliance.

Meta proposes to expand its Teen Account offering to Instagram users in the EU later in 2024, and globally in 2025, and has committed to delivering similar restrictions on its other platforms, such as Facebook, in 2025.

Meta’s proposals follow its announcement in January 2024 that it would prompt teens to update their privacy settings on Instagram and would implement stricter controls on content and search recommendations to protect child users from age-inappropriate content.

Perhaps the more likely scenario is that Meta has been spurred on by the US Senate’s passage of the Kids Online Safety Act (‘KOSA’) in July 2024, following which President Biden called on Congress to submit the Bill to him for signature without delay. The Act would impose a duty of care on platforms in relation to users under the age of 17, together with specific requirements such as limiting the ability to communicate with minors, restricting features that encourage children to spend more time on the service, imposing limits on algorithms that deliver personalised recommendations, and offering stronger parental controls and tools. A version of the Bill was advanced by the House Committee on Energy and Commerce and can now proceed to a vote on the floor of the House of Representatives.

KOSA would supplement the US’ Children’s Online Privacy Protection Act (‘COPPA’), which currently restricts the processing of personal data of children under the age of 13, and could be complemented by the Children and Teens’ Online Privacy Protection Act (‘COPPA 2.0’), which was also progressed to the House floor, and which would increase the age threshold to 17, require the implementation of functionality to delete children’s data from platforms, and ban targeted advertising to children.

The Information Commissioner’s Office is currently consulting on elements of its Children’s Code Strategy ‘Protecting Children’s Privacy Online’, which was published in August 2024, specifically the use of the personal data of children under the age of 13 and the identification of such individuals through age assurance or age verification, and the use of children’s personal data in algorithmic recommender systems by social media platforms (SMPs) and video sharing platforms (VSPs). Ofcom identified recommender algorithms as children’s main pathway to online harms in its ‘Protecting children from harms online’ consultation on its implementation of the Online Safety Act 2023, published in May 2024, and proposed in its draft Children’s Safety Codes to require certain user-to-user and search services to configure their algorithms to remove or reduce the visibility of harmful content.

While the position of the UK government and its regulators (as well as activists in this area) in advancing online safety and children’s rights has provided a foundation for global action, it appears to be a combination of the Online Safety Act 2023, Ofcom’s guidance and pending international legislation that has driven action.

The ICO’s laissez-faire attitude towards data protection enforcement merely signals to the world’s biggest companies that non-compliance will not lead to opprobrium or punishment, and assures them that they will be able to engage privately over a period of years before ultimately choosing whether or not to comply at their leisure. This is likely to be reflected in the approach of regulated entities in other areas, such as the application of the law to artificial intelligence (AI). Only last week, Meta announced that it would resume training its AI models on users’ personal data, albeit restricted to data shared publicly, after a temporary pause, having “engaged positively with the Information Commissioner’s Office (ICO)”, whose “constructive approach” it welcomed. This forced the ICO to issue a statement in response that “The ICO has not provided regulatory approval for the processing and it is for Meta to ensure and demonstrate ongoing compliance”. One suspects Meta’s executives won’t be losing any sleep.

If your organisation requires support in complying with the ICO’s Children’s Code, or understanding how the Online Safety Act 2023 applies and how compliance can be implemented, please contact us.

Find out more about our data protection and data privacy services.

Find out more about our Online Safety, Online Harms, Content Moderation and Content Regulation services.

Our Online Safety & Online Harms Resources page has links to relevant documents regarding the passage and implementation of the Online Safety Act 2023.