LEGAL, REGULATORY & COMPLIANCE CONSULTANTS

Handley Gill Limited

Our expert consultants at Handley Gill share their knowledge and advice on emerging data protection, privacy, content regulation, reputation management, cyber security, and information access issues in our blog.

Face Off

Notwithstanding the multiple laws, regulations and guidance applicable to police use of facial recognition technology, chief officers continue to enjoy considerable latitude when deploying live facial recognition (LFR) and must establish their own policies and procedures to meet the Court of Appeal’s requirements as set out in Bridges. That latitude, which has led to significant concerns being expressed across the political divide, could be reduced if the new Labour government’s promised programme of engagement leads to new legislation or regulation.
— Handley Gill Limited

Sir John Whittingdale MP speaking during debate on ‘Police Use of Live Facial Recognition Technology’ in Westminster Hall, House of Commons, Wednesday 13 November 2024

Chief officers of police forces proposing to deploy live facial recognition technology (LFR) for law enforcement purposes must not only procure suitable equipment and software but must also comply with a panoply of laws, regulations and emerging best practice on artificial intelligence (AI), from the Human Rights Act 1998 to the Data Protection Act 2018, the Equality Act 2010, the Police and Criminal Evidence Act 1984 and the College of Policing’s Authorised Professional Practice (APP) on Live Facial Recognition, all subject to oversight by the Information Commissioner’s Office, the Biometrics and Surveillance Camera Commissioner, the Equality and Human Rights Commission, Police and Crime Commissioners, the Independent Office for Police Conduct and His Majesty's Inspectorate of Constabulary and Fire & Rescue Services. It should not, therefore, be suggested that the use of LFR is unregulated.

Nevertheless, in the absence of specific legislation governing the use of LFR by police and wider law enforcement, significant criticism has been directed at LFR, and individual deployments have been subject to legal challenge.

Yesterday (13 November 2024), Parliament debated police use of live facial recognition technology for the first time, after Sir John Whittingdale, the Conservative MP for Maldon, Essex, secured the debate.

MPs from across the political divide raised concerns relating to accuracy, discrimination and bias, the criteria for inclusion on a watchlist, the source of watchlist images, data retention and destruction, whether deployment should be subject to judicial oversight, system configuration, transparency, human involvement, deployment locations, the implications for the presumption of innocence and the burden of proof, the potential for ‘mission creep’, the ability to scrutinise ‘black box’ AI technologies, the impact of surveillance on personal privacy and autonomy and the scope for a chilling effect, the impact on freedom of assembly and freedom of expression, and even the very efficacy of LFR as a law enforcement tool.

At Handley Gill, we have sought to address all of these issues when advising on the procurement and deployment of this controversial AI technology. This is a complex area of law, and one in which chief officers currently retain significant latitude in their approach, which leads to regional inconsistencies, prevents efficiencies of scale from being achieved, and inhibits the collation and comparison of data to monitor and assess efficacy and accuracy.

We were delighted that during the Westminster Hall debate Sir John Whittingdale MP acknowledged Handley Gill’s work in this area and reflected our view that “it is undesirable for individual Chief Officers and PCCs to have to engage in the wide ranging review and preparation of the necessary documentation, and that a move toward a common national approach (and choice of technology provider) would secure efficiencies and also enable closer monitoring…to ensure their efficacy and lawfulness.”

While not the subject of the debate, concern was also raised regarding the private use of live facial recognition, for example by retailers using schemes such as Facewatch, and regarding the previous Conservative government’s proposals to utilise the Driver and Vehicle Licensing Agency and/or Passport Office databases of photographs as sources of watchlist images in connection with Retrospective Facial Recognition technology.

At the conclusion of the debate, the new Labour Minister of State for Policing, Fire and Crime Prevention, Dame Diana Johnson, committed to a “programme of engagement” and “series of roundtables” to understand “how much support the police may require from Government and Parliament to set and manage the rules for using technologies such as facial recognition”, to ensure that “we protect the public from potential misuse of those technologies”, and to establish appropriate mechanisms for the scrutiny of the “application of the rules and regulations”.

Unlike the previous Conservative government, the new Labour government has not closed off the prospect of further legislation or regulation explicitly addressing police or wider use of LFR, but it does not appear to intend to address this as part of the Data (Use and Access) Bill, which is due to have its second reading on Tuesday 19 November 2024.

Should you require support in procuring or deploying facial recognition technologies please contact us.

Find out more about our responsible and ethical artificial intelligence (AI) services.

Find out more about our data protection and data privacy services.

Find out more about our ESG and human rights services.