LEGAL & REGULATORY COMPLIANCE CONSULTANTS

Handley Gill Limited

Our expert consultants at Handley Gill share their knowledge and advice on emerging data protection, privacy, content regulation, reputation management, cyber security, and information access issues in our blog.

The Secret Sauce

The new Algorithmic Transparency Recording Standard, initially mandatory only for central government departments but expected to be extended to all public sector bodies, is necessary to support compliance with public law obligations. It will, however, require AI developers not only to give up the secret sauce behind their AI models to their own clients, but to accept that this information will be made publicly available, potentially revealing unauthorised data gathering and processing practices and exposing them to liability.
— Handley Gill Limited

On 17 December 2024, the government published its mandatory scope and exemptions policy for the Algorithmic Transparency Recording Standard (ATRS). This makes clear that all ministerial departments, non-ministerial departments, and arm's-length bodies (ALBs), meaning executive agencies and non-departmental public bodies, which provide public or frontline services, or routinely interact with the general public, are now mandated to complete an Algorithmic Transparency Recording Standard report in respect of all algorithmic tools in the beta/pilot or production phase which either have a significant influence on (in the sense that they meaningfully assist, supplement, or fully automate) a decision-making process with public effect, or which directly interact with the general public.

The Algorithmic Transparency Recording Standard requires the relevant organisation to complete a spreadsheet identifying: 

  • the relevant algorithmic tool;

  • the tool’s status (i.e. whether it is currently being piloted, is live or is retired);

  • the nature of the tool and its purpose;

  • any external suppliers;

  • how the tool was procured;

  • the terms on which any external supplier is granted access to data;

  • the benefits of the tool;

  • a detailed description of the tool’s operation including its architecture and the models incorporated into the tool (including the datasets on which they were trained and developed and how they were used and the model’s performance); 

  • alternatives considered;

  • the source data to which the model will be applied (including the volume of data, sensitive attributes, the completeness of data and whether it is representative, how data was collected, data cleansing processes, any data sharing agreements in place, data access and storage);

  • what information the tool provides to the decision maker;

  • how the tool fits within the decision making process;

  • frequency and scale of use of the tool;

  • required training for deployers/users of the tool;

  • the role of humans in the decision making process;

  • mechanisms for appeal/review of decisions reached using the tool;

  • a summary of the outcomes of impact assessments conducted (assessments suggested are a data protection impact assessment (DPIA), equality impact assessment and an algorithmic impact assessment); and,

  • the associated risks and mitigations; together with

  • an option to provide further information in relation to specific datasets.
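The reporting categories above can be pictured as a single structured record per tool. The sketch below is purely illustrative: the ATRS is published as a spreadsheet template, and the field names here paraphrase the categories listed above rather than reproducing the official column headings.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of an ATRS report as a record type. Field names
# paraphrase the reporting categories described above; they are NOT the
# official ATRS spreadsheet headings.
@dataclass
class ATRSRecord:
    tool_name: str
    status: str                        # e.g. "pilot", "production", "retired"
    purpose: str
    external_suppliers: list = field(default_factory=list)
    procurement_route: str = ""
    supplier_data_access_terms: str = ""
    benefits: str = ""
    technical_description: str = ""    # architecture, models, training data, performance
    alternatives_considered: str = ""
    source_data: str = ""              # volume, sensitive attributes, provenance, storage
    information_to_decision_maker: str = ""
    role_in_decision_process: str = ""
    frequency_and_scale: str = ""
    required_training: str = ""
    human_oversight: str = ""
    appeal_mechanisms: str = ""
    impact_assessments: str = ""       # DPIA, equality and algorithmic impact summaries
    risks_and_mitigations: str = ""
    further_dataset_information: str = ""

# Illustrative entry loosely modelled on the HM Treasury example discussed below.
record = ATRSRecord(
    tool_name="Correspondence triage automation",
    status="production",
    purpose="Route incoming correspondence to the relevant team",
)
print(record.status)  # → production
```

Framing the report this way makes the compliance point concrete: every field must be populated (or a justified exemption recorded) before the tool can be said to meet the Standard.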

These records are then published on the government’s Algorithmic Transparency Recording Standard Hub repository, albeit that as at 17 December 2024 only 23 records had been published, including some from public sector bodies not currently mandated to comply with the Standard, such as the Information Commissioner’s registration inbox AI tool. This is despite the ATRS having been introduced in 2022, the policy stating that compliance has been mandatory for all government departments since 06 February 2024, and government departments having made announcements about their use of AI tools for tasks including the review of consultation responses. The Algorithmic Transparency Recording Standard report in respect of HM Treasury’s use of AI to manage correspondence via a correspondence triage automation tool was published on 17 December 2024. As the government is expected imminently to publish the AI Opportunities Action Plan prepared by Matt Clifford CBE, promising to unleash artificial intelligence (AI) across the public sector, public bodies must conduct the due diligence necessary to meet their transparency obligations.

In practice, many AI model developers have proven reluctant to disclose detailed information relating to the development and performance of their AI models even to prospective and current customers, and may not publish or routinely make AI model cards available to them. AI model cards are intended to summarise a model’s development, intended use, training data, performance, potential limitations and wider ethical considerations, providing transparency and enabling potential users, affected individuals and the general public to understand how the model works, its impact and any potential issues. AI developers hoping to supply the UK government and wider public sector should now expect to be pressured to be far more open regarding the development and performance of their models and tools.

The Algorithmic Transparency Recording Standard mandatory scope and exemptions policy is clear that, while it imposes new transparency requirements on government departments and other public sector bodies, it does not displace the Freedom of Information Act 2000 (FOIA) or other legislative or common law obligations. If and in so far as information may be withheld under these provisions, for example on account of the information constituting a trade secret (section 43(1) FOIA), disclosure prejudicing commercial interests (section 43(2) FOIA) or the information having been provided in confidence by a third party (section 41 FOIA), it should not be included in the ATRS report or should be redacted from it prior to publication.
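The withhold-or-redact step described above amounts to filtering a draft report against the applicable FOIA exemptions before publication. The sketch below is a simplified illustration of that workflow only; the exemption labels and field names are hypothetical, and real exemption decisions involve legal tests (such as the section 43(2) prejudice and public interest tests) that no script can apply.

```python
# Hypothetical sketch: filter a draft ATRS report so that fields flagged as
# exempt under FOIA are replaced by a redaction notice before publication.
# Exemption labels and field names are illustrative, not an official workflow.

FOIA_EXEMPTIONS = {
    "s41_confidential",      # information provided in confidence (s.41 FOIA)
    "s43_1_trade_secret",    # trade secrets (s.43(1) FOIA)
    "s43_2_commercial",      # prejudice to commercial interests (s.43(2) FOIA)
}

def redact_for_publication(report: dict, exempt_fields: dict) -> dict:
    """Return a copy of the draft report with exempt fields redacted."""
    published = {}
    for field_name, value in report.items():
        exemption = exempt_fields.get(field_name)
        if exemption in FOIA_EXEMPTIONS:
            published[field_name] = f"[Withheld: FOIA {exemption}]"
        else:
            published[field_name] = value
    return published

draft = {
    "tool_name": "Correspondence triage automation",
    "model_architecture": "Supplier-proprietary model pipeline",
}
# The supplier has flagged its architecture description as a trade secret.
exemptions = {"model_architecture": "s43_1_trade_secret"}
print(redact_for_publication(draft, exemptions)["tool_name"])
```

The design point for developers is that the exemption flags have to originate with them: unless confidential material is identified as such when it is disclosed to the customer, nothing marks it for redaction before the report is published.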

From an AI developer’s perspective, therefore, while they will be under pressure to be far more open with UK government purchasers of AI and other algorithmic tools, consideration should be given not only to what information is disclosed to customers, but also to the terms on which it is disclosed, whether it should explicitly be identified as confidential or commercially sensitive, and what rights of review and/or approval the developer should have over the publication of transparency information, including the ATRS report.

If your organisation requires support in understanding whether the ATRS applies to you, in conducting a DPIA, algorithmic impact assessment and/or equality impact assessment, in completing or reviewing an Algorithmic Transparency Recording Standard report or in negotiating contractual obligations relating to transparency, please contact us.

Find out more about our responsible and ethical artificial intelligence (AI) services.