Deep Impact on Data Protection Impact Assessments

The Court of Appeal recently determined that South Wales Police’s trial of the overt use of facial recognition technology in public places was unlawful and in breach of data protection legislation, on grounds including that the force’s Data Protection Impact Assessment was inadequate, notwithstanding its finding that the processing of personal data was proportionate.

In R (on the application of Bridges) v The Chief Constable of South Wales Police and others [2020] EWCA Civ 1058, an individual believed that his personal data had been processed on two separate occasions, once in central Cardiff and once at a defence exhibition, through the capture of his image during the overt deployment of vans equipped with live facial recognition technology (known as AFR Locate). He alleged that the processing was unlawful and in breach of data protection legislation, Article 8 of the European Convention on Human Rights and the Public Sector Equality Duty under s.149 Equality Act 2010. Mr Bridges was supported in his action by the civil liberties organisation Liberty.

South Wales Police used AFR Locate to capture digital images of members of the public at a rate of up to 50 faces per second and to compare them with digital images of persons on a watchlist compiled by the force for the purposes of the deployment, comprising, for example, people wanted on a warrant or unlawfully at large. The AFR Locate software would process this special category biometric data and generate a similarity score; where the score passed the relevant threshold, the potential match would be referred to a human operator for review.
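For readers unfamiliar with how such systems operate, the match-and-refer logic described above can be sketched in code. The sketch below is a minimal illustration under stated assumptions: the function names, data structures and threshold value are hypothetical and do not reflect the actual AFR Locate implementation, whose matching engine and operating threshold are not public.

```python
# Illustrative sketch only: names, structures and the threshold value are
# hypothetical assumptions, not the actual AFR Locate implementation.
from dataclasses import dataclass

SIMILARITY_THRESHOLD = 0.85  # hypothetical; the operational threshold is not public


@dataclass
class Face:
    face_id: str
    embedding: tuple  # biometric template derived from the captured image


@dataclass
class Referral:
    face_id: str
    watchlist_id: str
    similarity: float


def similarity(a: Face, b: Face) -> float:
    """Stand-in for the biometric comparison; returns a score in [0, 1]."""
    matches = sum(x == y for x, y in zip(a.embedding, b.embedding))
    return matches / max(len(a.embedding), 1)


def process_frame(detected: list[Face], watchlist: list[Face]) -> list[Referral]:
    """Compare each detected face against the watchlist and refer
    above-threshold candidates to a human operator for review."""
    referrals = []
    for face in detected:
        for entry in watchlist:
            score = similarity(face, entry)
            if score >= SIMILARITY_THRESHOLD:
                referrals.append(Referral(face.face_id, entry.face_id, score))
    # Biometric data for faces with no above-threshold match is
    # deleted from the system instantaneously (see below).
    return referrals
```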

The Court recognised that the vast majority of individuals whose facial images would be processed using AFR Locate would not be suspected of any wrongdoing and would not be relevant to the purpose for which the data was processed. Their data would be deleted from the system instantaneously, with suspected matches retained for 24 hours and the CCTV feed itself retained for 31 days.
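These retention rules can be expressed as a simple schedule. The sketch below is illustrative only: the category names and structure are our assumptions, though the periods are those recorded in the judgment.

```python
# Retention periods as recorded in the judgment; the category names and
# structure are illustrative assumptions, not the force's actual system.
from datetime import timedelta

RETENTION_SCHEDULE = {
    "unmatched_biometric_data": timedelta(0),   # deleted instantaneously
    "suspected_match": timedelta(hours=24),     # retained for 24 hours
    "cctv_feed": timedelta(days=31),            # retained for 31 days
}


def is_expired(category: str, age: timedelta) -> bool:
    """True once data in the given category exceeds its permitted retention period."""
    return age > RETENTION_SCHEDULE[category]
```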

A Data Protection Impact Assessment (DPIA) had been conducted by South Wales Police prior to the commencement of the pilot. The force had also prepared and implemented a Standard Operating Procedure, and the Home Office had established an Oversight and Advisory Board comprising representatives of the police, the Home Office, the Surveillance Camera Commissioner, the Information Commissioner, the Biometrics Commissioner and the Forensic Science Regulator.

While much of the judgment was dedicated to whether there was a sufficient legal framework in place to govern the use of AFR Locate (the Court determined that there was not) and whether the deployment of the technology was proportionate (the Court concluded that it was capable of being so, given the negligible impact of the momentary processing of innocent individuals’ personal data), the Court also considered the adequacy of the DPIA.

The DPIA was criticised by the parties, including the Information Commissioner, on grounds including that:

- it failed to recognise that the deployment of AFR Locate would involve the processing of personal data relating to innocent individuals;

- it failed to acknowledge, or adequately acknowledge, that individuals’ rights under Article 8 ECHR and other privacy rights might be engaged, or to explain how any risks could be mitigated;

- it did not refer to any other risks to rights affected by the deployment;

- it did not recognise that processing would take place on a blanket and indiscriminate basis;

- it did not sufficiently recognise the scale of processing, particularly when it concerned special category biometric data;

- it had not considered the consequences of a false positive match; and,

- it failed to address the risk of gender and racial bias.

Not all of these criticisms were upheld by the Court. The Court did, however, find that the DPIA had been conducted improperly: it proceeded on the basis that Article 8 ECHR was not infringed by the processing, which was inappropriate given the Court’s finding that the lack of an adequate legal framework rendered the processing not in accordance with the law, and it thereby failed to address the risks to data subjects and appropriate mitigation. The Court did not reject the Divisional Court’s test that "If it is apparent that a data controller has approached its task on a footing that is demonstrably false, or in a manner that is clearly lacking, then the conclusion should be that there has been a failure to meet the section 64 obligation" under the Data Protection Act 2018. The DPIA was therefore held not to comply with the Act. Furthermore, the Court found that there had been a failure to conduct the ongoing assessment, whether recorded as part of the DPIA or elsewhere, of whether the software introduced unacceptable bias on the basis of protected characteristics, as required by the Public Sector Equality Duty.

One issue that this judgment raises for data controllers is that, if they conducted their DPIA by following the ICO’s own template, they will not have considered and recorded all of the matters that must be addressed for a DPIA to comply with the requirements of the GDPR and the Data Protection Act 2018, or to ensure that their activities comply with wider legal requirements.

In light of this decision, data controllers should revisit their DPIAs, prioritising those relating to the processing of special category data and criminal conviction and offence data, to ensure that all relevant matters have been taken into account. The technical legal nature of the assessment required, which goes beyond pure data protection compliance, means that many data controllers will lack the knowledge and resources to conduct compliant DPIAs themselves. Data controllers may also wish to consider creating their own template DPIA form to ensure that all relevant matters are considered, as sketched below. You can access our Data Protection Impact Assessment (DPIA) template here.
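By way of illustration only, the matters criticised in Bridges could be encoded as fields in such an internal template. The checklist below is a hypothetical sketch drawn from the criticisms listed above; it is not a compliant DPIA template and is no substitute for legal advice.

```python
# Hypothetical checklist drawn from the criticisms raised in Bridges;
# illustrative only, not a compliant DPIA template.
DPIA_CHECKLIST = [
    "Does the processing involve personal data of individuals who are not of interest?",
    "Are Article 8 ECHR or other privacy rights engaged, and how are risks mitigated?",
    "What other rights and freedoms of data subjects are affected?",
    "Is the processing blanket or indiscriminate in nature?",
    "What is the scale of processing, particularly of special category data?",
    "What are the consequences of a false positive, and how are they handled?",
    "Could the system introduce gender or racial bias, and how is this assessed on an ongoing basis?",
]


def outstanding(answers: dict[str, bool]) -> list[str]:
    """Return the checklist items not yet recorded as addressed."""
    return [item for item in DPIA_CHECKLIST if not answers.get(item, False)]
```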

Should you require assistance in preparing an appropriate DPIA template for your organisation, reviewing and revising your DPIAs, or conducting DPIAs, please contact us: info@handleygill.com.

Find out more about our data protection and data privacy services.