LEGAL & REGULATORY COMPLIANCE CONSULTANTS

Handley Gill Limited

Our expert consultants at Handley Gill share their knowledge and advice on emerging data protection, privacy, content regulation, reputation management, cyber security, and information access issues in our blog.

Tell me why!

Data protection law does not necessarily require data controllers to disclose the secret sauce of the algorithms behind automated decision making processes.
Transparency obligations can be satisfied by disclosing details of the personal data underlying the decision together with details of the procedure and principles applied and an illustration of how the outcome would be altered by different inputs.
Where there continues to be a dispute as to whether the information provided is sufficiently meaningful while protecting confidentiality and trade secrets, supervisory authorities and the courts are entitled to scrutinise the relevant information.
— Handley Gill Limited

Where automated decision-making based on personal data is used to make decisions having a legal or similarly significant effect, both the EU General Data Protection Regulation (GDPR) and the UK GDPR provide that individuals should be proactively provided with “meaningful information about the logic involved”, and that controllers are obliged to provide such information in response to a data subject access request under Article 15 GDPR / Article 15 UK GDPR. Such information may be necessary in order to facilitate the right of individuals under Article 22(3) GDPR / Article 22(3) UK GDPR to contest decisions made solely using automated decision-making.

As algorithmic decision making and artificial intelligence become more prevalent, organisations are often concerned about the scope of these obligations and the potential impact on the secret sauce comprising trade secrets and other confidential information.

When, to borrow the Little Britain phrase, the “computer says no”, what information is a data controller obliged to disclose to individuals to comply with the obligations under Articles 13(2)(f), 14(2)(g) and 15(1)(h) GDPR?

The European Data Protection Board (EDPB), which comprises the European data protection supervisory authorities and, at the relevant time, included the UK regulator, the Information Commissioner, adopted an updated version of the guidelines on automated decision-making issued by its predecessor, the Article 29 Working Party. The guidelines suggest that controllers are not required to provide “a complex explanation of the algorithms used or disclosure of the full algorithm”, but that the information provided should be “sufficiently comprehensive for the data subject to understand the reasons for the decision”. The guidelines suggest that it would be acceptable to provide the “main characteristics considered in reaching the decision, the source of this information and the relevance”, and that controllers should offer “real, tangible examples of the type of possible effects”. The Information Commissioner’s guidance on automated decision-making and profiling is aligned with that of the EDPB.

The CJEU has today (27 February 2025) ruled on a request from the Austrian courts for a preliminary ruling in Case C‑203/22 CK v Magistrat der Stadt Wien (commonly referred to as the Dun & Bradstreet case), in the context of a dispute pertaining to a refusal to issue or renew a low-value mobile phone contract, purportedly based on credit scoring.

The Court rejected any suggestion that controllers should disclose “a complex mathematical formula, such as an algorithm, or by the detailed description of all the steps in automated decision-making, since none of those would constitute a sufficiently concise and intelligible explanation” [59]. Instead, the Court determined that individuals are entitled to “a genuine right to an explanation as to the functioning of the mechanism involved in automated decision-making of which that person was the subject and of the result of that decision” [57], which should comprise “the procedure and principles actually applied in such a way that the data subject can understand which of his or her personal data have been used in the automated decision-making at issue” [61]. The Court emphasised that individuals must be able to verify the accuracy of the personal data underlying the decision and suggested that it would be sufficient to demonstrate “the extent to which a variation in the personal data taken into account would have led to a different result” [62].
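
The kind of explanation the Court describes at [62] can be illustrated with a short sketch: show the data subject how varying one item of their personal data would have changed the outcome, without disclosing the underlying formula itself. The scoring rule, weights and threshold below are entirely hypothetical and exist only to demonstrate the "variation in inputs" approach.

```python
# Hypothetical illustration of the "variation" approach to explaining an
# automated decision: the controller demonstrates how a change in one item
# of personal data would have altered the result, rather than disclosing
# the full algorithm. All figures here are invented for illustration.

def credit_decision(income: int, missed_payments: int) -> str:
    """Toy credit-scoring rule (hypothetical, for illustration only)."""
    score = income / 1000 - 15 * missed_payments
    return "approved" if score >= 20 else "declined"

# The decision actually made on the data subject's personal data
actual = credit_decision(income=30_000, missed_payments=1)

# Counterfactual: the same data, except with no missed payments
varied = credit_decision(income=30_000, missed_payments=0)

print(f"actual decision: {actual}; with varied input: {varied}")
# Shows that the recorded missed payment was decisive, so the data subject
# can check the accuracy of that item of personal data and contest it.
```

A controller could present such a comparison in a privacy notice or DSAR response without revealing the weights or threshold, which remain confidential.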

It is important to note that, as a consequence of Brexit, UK courts are not bound to follow this ruling of the European court, but they are entitled to have regard to it and, subject to any changes to the UK’s data protection legislation in relation to this issue, we anticipate that the UK’s approach will remain in alignment.

Contact us and book a free consultation if you require support:

  • drafting privacy notices or responding to data subject access requests (DSARs) relating to your use of algorithms in automated decision-making; 

  • determining what your obligation to provide meaningful information requires; 

  • ensuring that customers using your algorithm or AI protect your confidential information and trade secrets; or, 

  • understanding how an automated decision was made about you and whether you can challenge it.

Find out more about our data protection and data privacy services.

Find out more about our responsible and ethical artificial intelligence (AI) services.