Out-Law News 4 min. read

Disclosure of algorithms envisaged under EU GDPR


Businesses that deploy technologies such as AI to make decisions that have legal or similarly significant effects on individuals could be required to disclose to data protection authorities or courts commercially sensitive details of their algorithms under a scenario envisaged by a legal adviser to the EU’s highest court.

That element of the opinion expressed by Jean Richard de la Tour, an advocate general at the Court of Justice of the EU (CJEU), is likely to be viewed as “controversial”, according to data protection law expert Malcolm Dowden of Pinsent Masons. He said it highlights how the use of AI raises potential tensions between rights arising under the EU General Data Protection Regulation (GDPR) and protections for trade secrets that are also enshrined in EU law.

Stephanie Lees, also of Pinsent Masons, said, though, that the opinion does provide some helpful clarity on the information organisations need to disclose to data subjects regarding their automated decision-making.

The CJEU has been asked to determine the extent of organisations’ disclosure obligations in cases where they make decisions about individuals based solely on automated processing of their data and are then asked by those individuals to share information about that activity.

Article 15 of the GDPR gives data subjects the right to access the personal data that controllers hold about them. Where individuals have been subject to solely automated decision-making, it also entitles them to “meaningful information about the logic involved” and about “the significance and the envisaged consequences of such processing” for them.

A court in Austria has asked the CJEU to clarify what ‘meaningful information about the logic involved’ means in this context, to help it resolve a dispute between an Austrian consumer and the City Council of Vienna concerning an automated credit assessment which resulted in her being refused a mobile phone contract due to an apparent lack of financial creditworthiness on her part.

The consumer’s bid to gain an insight into the logic of the decision made about her was endorsed by Austria’s data protection authority and she subsequently won a court ruling that entitled her to enforce her right to information against the credit assessor. However, the enforcing authority, City Council of Vienna, said the credit assessor had already met its disclosure requirements – a view that the consumer is now challenging before the Administrative Court in Vienna.

The Vienna court is seeking the CJEU’s help with how the information rights pertaining to automated decision-making under Article 15 GDPR should be interpreted, to enable it to resolve the dispute before it.

To help the CJEU answer the questions it has been posed, advocate general Jean Richard de la Tour was appointed to consider the issues at hand and give a non-binding opinion. He issued his opinion earlier this month after examining, among other things, what is meant by “meaningful information” in the context of automated decision-making, as well as the extent to which controllers are obliged to provide full details of the algorithms they use to achieve automated decisions, including where such information constitutes a trade secret.

The advocate general confirmed that ‘meaningful information about the logic involved’ in automated decision-making relates to the method and criteria used by the controller for that purpose. He said the information controllers must provide in this regard must be “concise, easily accessible and easy to understand, and formulated in clear and plain language”.

The advocate general said the information to be disclosed must also be “sufficiently complete and contextualised to enable that person to verify its accuracy and whether there is an objectively verifiable consistency and causal link between, on the one hand, the method and criteria used and, on the other hand, the result arrived at by the automated decision at issue”.

Algorithms can be so complex that most data subjects would not be equipped to understand them. The advocate general said: “the controller is not required to disclose to the data subject information which, by reason of its technical nature, is so complex that it cannot be understood by persons who do not have particular technical expertise, which is such as to preclude disclosure of the algorithms used in automated decision-making”.

However, complexity does not relieve the controller from its obligation to provide the data subject with information that is understandable, and that allows them to exercise their rights, including the right to understand and, if appropriate, challenge the decision. That might require the controller to produce an explanation that includes information concerning the algorithm, such as explicit details of the weighting applied to information, that constitutes a trade secret.

In such cases, where a meaningful explanation would necessitate disclosure of details arguably amounting to a trade secret, controllers would have to disclose those details, and possibly the full algorithms, to regulators or the court so they can “weigh up, in full knowledge of the facts and in accordance with the principle of proportionality and the confidentiality of that information, the interests involved and determine the extent of the right of access that must be granted to [the data subject]”.

The opinions offered by advocates general to the CJEU are non-binding on the court’s judges, but they are often influential and followed by the judges in their rulings. A judgment in this case is not anticipated for several months.

Dowden said: “The idea that trade secrets might need to be disclosed to data protection authorities for evaluation might well be regarded as a highly sensitive and controversial suggestion, as the evaluation of trade secrets is by no means a core activity or competence of data protection authorities.”

Data protection law expert Lees said, though, that the opinion provides helpful clarification for organisations responding to data subject access requests, on their obligations to provide “meaningful information about the logic involved” in automated decision-making.

Lees said: “Data subjects must be given information about the context in which automated decisions have been made using their personal data, so they can understand its accuracy and challenge the decision if they wish. This includes controllers providing details on the method used, the criteria taken into account, and their weighting. Explanations should be accessible, but there is no need to provide a mathematical formula.”

“Organisations should ensure that before undertaking any automated decision-making, they understand how the technologies work and are able to explain the functioning of the mechanism involved in any automated decision-making, so they can explain the results of such decisions on request,” she said.
