Snap Out of It
“The ICO’s decision in respect of Snap’s MyAI chatbot emphasises the importance of ensuring that a thorough DPIA is conducted prior to the commencement of processing, with an understanding of the implications of its conclusions, but also demonstrates that even where large organisations fail to comply with their legal obligations in respect of vulnerable data subjects for significant periods, and only take remediating action under threat of enforcement, the imposition of a monetary penalty notice is unlikely.”
In late February 2023, Snap, Inc. launched its ‘MyAI’ chatbot, based on OpenAI’s generative pre-trained transformer (GPT) large language model (LLM) via a custom API, for Snapchat+ subscribers, and on 19 April 2023 made MyAI available to all users.
Snap’s MyAI
The personal data collected or generated in the course of using the MyAI chatbot included a My AI Bio, age bucket, non-granular geolocation (city or region), IP address, country code, region, city and user ID, timestamps and conversation IDs for individual queries submitted to My AI, interaction data, keywords extracted from interactions to identify commercial intent and summaries of previous interactions used to inform future interactions, referred to as ‘memories’. Queries, summaries of previous interactions and user attributes would be shared with OpenAI. Where commercial intent was identified, the query, the user’s age bucket, a unique user ID and other metadata would be shared with Microsoft to deliver a “contextual advertisement” incorporated into the MyAI response to the relevant prompt.
Snap indicated that it had measures in place to filter certain types of responses to produce an error message and others would “trigger a response containing vetted information and self-help resources”. Similarly, content moderation filtering was in place where certain types of images were uploaded. The specifics of this information were redacted from the ICO’s decision.
Snap ultimately identified the purpose(s) of processing as being “providing a personalised My AI experience, improving the service, delivering contextual advertisements and providing a safety and security-oriented feature”.
While user figures are redacted from the decision, it does reveal that as at March 2023, over a quarter of users of Snap’s MyAI tool were aged 13-17.
The ICO’s Investigation
In March 2023, after MyAI was released to Snapchat+ users but before it was released to all users, Snap had requested a meeting with the ICO to discuss matters including MyAI.
That meeting was held on 20 April 2023, subsequent to which the ICO made an informal request (i.e. not in the exercise of its powers under s.142 Data Protection Act 2018 to issue an information notice) that Snap confirm: its lawful basis for processing; whether it was a controller or processor; whether it had conducted a data protection impact assessment (DPIA); its transparency measures; its mitigations of security risks; its approach to data minimisation; its mechanisms to comply with data subject rights requests; and whether AI was being used to make solely automated decisions within the meaning of Article 22 UK GDPR, these being the ‘eight questions that developers and users need to ask’ identified by the ICO’s Executive Director of Regulatory Risk in a blog on generative AI. This was followed by a request for a copy of the DPIA, which Snap declined to provide.
On 19 May 2023, the ICO utilised the powers under s.142 Data Protection Act 2018 to require Snap to provide all versions of its DPIA in relation to MyAI.
On 21 May 2023, Snap provided the ICO with redacted copies of three earlier versions of the DPIA, the earliest of which was dated 24 February 2023, and on 26 May 2023 it provided its most recent DPIA in respect of MyAI, dated that same day.
On 02 June 2023, the ICO advised Snap that it had failed to comply with the information notice because it had not provided unredacted copies of its DPIAs. While Snap originally maintained that the redacted portions of the DPIAs were not “material or substantive” to the ICO’s inquiry, by 05 June 2023 it had capitulated and provided unredacted copies.
On 23 June 2023, Snap was informed that the ICO was opening an investigation into its compliance with Article 35 and Article 36 UK GDPR.
The ICO simultaneously issued a second information notice under s.142 Data Protection Act 2018 relating to the introduction of the MyAI feature and the risk assessment process, which also required the provision of documents referenced in the earlier DPIAs.
The Preliminary Enforcement Notice Issued By the Information Commissioner Against Snap
On 06 October 2023, the ICO issued Snap with a Preliminary Enforcement Notice in which it indicated that it had reached the provisional conclusions that Snap had failed to carry out a data protection impact assessment (DPIA) that met the requirements of Article 35(7) UK GDPR (i.e. that contained: a systematic description of the envisaged processing operations and the purposes of the processing; an assessment of the necessity and proportionality of the processing operations in relation to the purposes; an assessment of the risks to the rights and freedoms of data subjects; and, mitigations, including safeguards, security measures and mechanisms to ensure the protection of personal data) and, that it had failed to comply with the obligation under Article 36(1) UK GDPR of prior consultation in respect of processing that the DPIA reveals would result in a high risk in the absence of measures taken by the controller to mitigate the risk “despite appearing to conclude in the first four iterations of its DPIA relating to My AI that such processing would result in a high risk to the rights and freedoms of users aged 13-17 in the absence of measures taken to mitigate the risk”. The notice required Snap, Inc. to “carry out a revised data protection impact assessment in relation to My AI”.
In relation to the DPIA, in the Preliminary Enforcement Notice the ICO provisionally concluded that, in its First, Second, Third and Fourth DPIAs, Snap had failed to meet the required standards in accordance with the ICO’s DPIA guidance and that, in particular, Snap had:
failed adequately to describe how it intended to process personal data in connection with My AI and, in particular, had failed to: adequately explain how it uses personal data and/or OpenAI’s GPT technology to display advertisements to users; explain how information or inferences extracted from user queries sent to My AI are used for the purposes of personalisation or to serve targeted advertisements on My AI and across other parts of the Snapchat platform; clearly identify the categories of personal data that are used for the purposes of advertising; and clearly identify the retention periods for the different categories of personal data identified as being processed in connection with My AI;
failed to “describe the wider context in which it processed personal data in connection with MyAI. Specifically, the Commissioner provisionally found that Snap had failed to consider issues of public concern relating to the use of generative AI and individuals’ expectations in respect of the use of geolocation data”. In addition, the ICO had provisionally concluded that Snap “had inadequately assessed the necessity and proportionality of the processing operations performed in connection with My AI”;
failed to incorporate an assessment of the “risks to the rights and freedoms of data subjects in relation to: (a) the targeting of users aged 13-17 for advertising purposes; (b) the processing of special category data on a large scale; and (c) the impact of the use of generative AI technology and the risk that, due to its novelty and complexity, users, particularly those aged 13-17, would be less likely to understand the manner in which and the purposes for which their personal data is processed” and included only “a cursory and higher-level risk assessment, in tabular format, which failed to include any explanation of the basis for its conclusions in respect of the likelihood and severity of harm, and the overall risk level”;
“failed to consider the risks associated with the processing of special category data on a large scale”;
“failed to comply with the requirements of Article 35(7)(d) UK GDPR as some of the mitigatory measures listed by Snap were inaccurate and / or did not address the risks that Snap had identified”; and,
not incorporated “any child-specific mitigatory measures”.
These facts were revealed in the ICO’s decision of 21 May 2024 detailing its “Findings of Non-Infringements” of Article 35 UK GDPR and Article 36 UK GDPR.
The ICO made an announcement that it had “issued Snap, Inc and Snap Group Limited (Snap) with a preliminary enforcement notice over potential failure to properly assess the privacy risks posed by Snap’s generative AI chatbot ‘My AI’”, particularly in relation to children.
While purporting to emphasise that “No conclusion should be drawn at this stage that there has, in fact, been any breach of data protection law or that an enforcement notice will ultimately be issued”, the Information Commissioner John Edwards issued a statement that “The provisional findings of our investigation suggest a worrying failure by Snap to adequately identify and assess the privacy risks to children and other users before launching ‘My AI’” and “We have been clear that organisations must consider the risks associated with AI, alongside the benefits. Today's preliminary enforcement notice shows we will take action in order to protect UK consumers' privacy rights.”
Having been issued with the preliminary enforcement notice, on 03 November 2023 Snap submitted written representations to the ICO.
On 22 November 2023, Snap complied with the requirement in the preliminary enforcement notice to carry out a revised DPIA, which it submitted to the ICO.
Snap’s written representations were subsequently supplemented with a skeleton argument on 11 December 2023.
An oral hearing was held on 14 December 2023.
Subsequent to that hearing, Snap published revised ‘just in time’ privacy notices to users in respect of MyAI, issuing a new notice to iOS users on 18 December 2023 and to Android users on 22 December 2023.
On 15 January 2024, the ICO requested responses from Snap to several questions arising following the oral hearing, to which Snap replied on 04 February 2024.
On 13 February 2024, Snap advised the ICO of further changes made to its Privacy Policy, just in time privacy notice, its ‘Family Centre’ and to the most recent fifth DPIA.
The ICO’s Decision Notice Detailing Findings of Non-Infringement
On 21 May 2024, the ICO published its Decision Notice in respect of its investigation into Snap, detailing its findings of non-infringement in respect of Articles 35 and 36 UK GDPR.
Article 35
The Decision Notice states that the “Commissioner has concluded that Snap has, as of 22 November 2023, carried out an assessment of the impact of the envisaged processing operations on the protection of personal data relating to My AI that complies with Article 35 UK GDPR” in a manner that “contains a systematic description of the processing operations performed by Snap and its processors for the purposes of providing the My AI feature to Snapchat users”.
Despite the ICO’s guidance on DPIAs stating that “You should seek and document the views of individuals (or their representatives) unless there is a good reason not to” as part of the DPIA process, there is no indication that any such consultation was carried out, nor is there any censure for failing to do so.
Article 36
The Information Commissioner found no infringement of the Article 36 UK GDPR obligation to carry out prior consultation with the Information Commissioner in respect of processing that the DPIA reveals would result in a high risk in the absence of measures taken by the controller to mitigate the risk.
This might be surprising in light of the Commissioner’s earlier assertion that Snap’s own DPIAs had appeared “to conclude in the first four iterations of its DPIA relating to My AI that such processing would result in a high risk to the rights and freedoms of users aged 13-17 in the absence of measures taken to mitigate the risk”, which was based on the DPIAs recording that Snap “assessed the “likelihood of harm” as “probable”, the “severity of harm” as “significant” and the “overall risk” as “high”” and that this risk had been “accepted” in respect of 13-17 year olds.
The ICO’s conclusion was stated to have been reached as a consequence of the ICO’s acceptance of Snap’s representation that “this was an error and did not reflect Snap’s true assessment of the risk posed to such users once its mitigatory measures were taken into consideration” and that in fact “Snap had formed the view when carrying out the first four DPIAs for My AI that the risks to data subjects were not high, with mitigations in place” and that the residual risk was, in fact, “medium”.
Enforcement and penalties
As a consequence of the finding that Snap had completed a satisfactory DPIA by 22 November 2023, albeit some nine months after its MyAI product launched and only in response to a Preliminary Enforcement Notice, the ICO concluded that “there are no grounds to issue an Enforcement Notice which requires Snap to cease processing the personal data of Snapchat users in the UK for any purpose connected to My AI”.
Section 149 Data Protection Act 2018 provides that where the Commissioner is satisfied that a person has failed or is failing to comply with their obligations, including to complete a data protection impact assessment (DPIA), it can issue an enforcement notice requiring them to take or refrain from taking the steps specified in the notice, being requirements appropriate for the purpose of remedying the failure.
The Decision makes no reference to section 155 Data Protection Act 2018, which grants the Information Commissioner the power to issue a monetary penalty notice where there is or has been a failure to comply with the UK GDPR, having regard to the factors in Article 83(2) UK GDPR, i.e.:
the nature, gravity and duration of the infringement, taking into account the nature, scope or purpose of the processing concerned as well as the number of data subjects affected and the level of damage suffered by them;
the intentional or negligent character of the infringement;
any action taken by the controller or processor to mitigate the damage suffered by data subjects;
the degree of responsibility of the controller or processor, taking into account technical and organisational measures implemented by them pursuant to Articles 25 and 32;
any relevant previous infringements by the controller or processor;
the degree of cooperation with the Commissioner in order to remedy the infringement and mitigate its possible adverse effects;
the categories of personal data affected by the infringement;
the manner in which the infringement became known to the Commissioner, in particular whether (and if so to what extent) the controller or processor notified the infringement;
where measures referred to in Article 58(2) have previously been ordered against the controller or processor concerned with regard to the same subject matter, compliance with those measures;
adherence to approved codes of conduct pursuant to Article 40 or approved certification mechanisms pursuant to Article 42; and,
any other aggravating or mitigating factor applicable to the circumstances of the case.
What can we learn?
The ICO’s decision:
Emphasises the importance of carrying out a compliant and thorough data protection impact assessment (DPIA) prior to commencing processing and in good time to ensure that any further obligations, such as prior consultation, can be carried out in advance of commencement;
Underlines that a thorough DPIA that makes the relevant assessments correctly first time (subject to periodic review) will prevent the potential risk arising from comparisons of multiple versions of the DPIA;
Reiterates the need to understand the implications of the conclusions of DPIAs when they are being prepared and signed off;
Demonstrates that the ICO will prioritise the processing of children’s personal data for investigation and associated publicity, in accordance with its ICO25 strategic plan;
Suggests that the ICO won’t hesitate to use its powers to compel the provision of information in the context of a potential investigation and declining informal requests is unlikely to be well-received;
Indicates that, even in the face of significant and prolonged non-compliance by major organisations in relation to vulnerable data subjects such as children, in respect of whom higher standards are supposed to apply in accordance with the ICO’s Age Appropriate Design Code / Children’s Code, and which the Information Commissioner himself described as a “worrying failure”, the ICO will be reluctant to utilise its power to issue penalties and organisations are likely to be afforded several opportunities to remediate their approach without fear of monetary penalty.
At Handley Gill, we support organisations to undertake Data Protection Impact Assessments and wider AI Impact Assessments or AI Conformity Assessments. We also have significant experience successfully representing organisations under investigation by regulators including the Information Commissioner. If we can support you and your organisation, contact us and book a free initial consultation.
Find out more about our data protection and data privacy services.
Find out more about our responsible and ethical artificial intelligence (AI) services.