Out-Law News 5 min. read
07 Mar 2025, 2:41 pm
Businesses that seek to withhold information they would otherwise have to disclose under data protection laws, on the basis that the information constitutes a trade secret, could be forced to disclose those secrets to a court or regulator so that it can arbitrate on the question of disclosure, according to a new ruling by the EU’s highest court.
The decision of the Court of Justice of the EU (CJEU) concerns the interplay between rules in the EU’s General Data Protection Regulation (GDPR) on automated decision-making and EU trade secrets law.
The outcome of the ruling is that businesses using AI systems to facilitate solely automated decision-making affecting people could be required to share the underlying algorithms for those systems with courts or regulators, which will then decide whether the algorithms need to be disclosed to individuals who request information about the logic behind those decisions, as is their right under the GDPR.
Article 15 of the GDPR gives data subjects the right to access personal data that controllers hold about them, along with “meaningful information about the logic involved where they have been subject to solely automated decision-making and the significance and the envisaged consequences of such processing” for them.
A court in Austria asked the CJEU to clarify what ‘meaningful information about the logic involved’ means in this context, to help it resolve a dispute between an Austrian consumer and the City Council of Vienna. The dispute concerns an automated credit assessment which resulted in the consumer being refused a mobile contract due to an apparent lack of financial creditworthiness on her part.
The consumer’s bid to gain an insight into the logic of the decision made about her was endorsed by Austria’s data protection authority and she subsequently won a court ruling that entitled her to enforce her right to information against the credit assessor. However, the enforcing authority, City Council of Vienna, said the credit assessor had already met its disclosure requirements – a view that the consumer is now challenging before the Administrative Court in Vienna.
To help it resolve the dispute before it, the Vienna court asked the CJEU to rule on how the information rights pertaining to automated decision-making under article 15 GDPR should be interpreted.
Data protection law expert Malcolm Dowden of Pinsent Masons said the CJEU first addressed the question of what constitutes “meaningful information” about the logic involved in cases of automated decision-making, including profiling.
“In that context, the court states that the requirement ‘cannot be satisfied either by the mere communication of a complex mathematical formula, such as an algorithm, or by the detailed description of all the steps in automated decision-making, since none of those would constitute a sufficiently concise and intelligible explanation’,” Dowden said. “The reference to ‘mere communication of the…algorithm’ in that context is to emphasise that the controller cannot simply provide what to most people would be a wholly unintelligible string of mathematical formulae and coded instructions. The controller is required to provide information in a form that the data subject can understand and that makes it possible to determine how the decision that affects them was made. Consequently, the controller must consider what simplified or additional explanatory information must be provided to achieve that result.”
“In practice, therefore, the controller must consider whether providing a copy of the algorithm itself, or providing a detailed description of all the steps in automated decision-making, would allow the data subject to understand the process. If it would not, then the controller must add information to make the response understandable,” he said.
Having addressed that general question, the CJEU turned to specific circumstances that – as a carve-out from the general position – might require or permit the withholding of certain information. One example is where information required to understand the decision-making process would be personal data relating to individuals other than the data subject, for example, showing how the algorithm has produced a decision in relation to other individuals. Another example is where disclosure of the information would involve disclosure of trade secrets.
This issue was considered earlier in the case by an advocate general to the CJEU, who gave a non-binding opinion. The advocate general examined case law relating to the question of personal data relating to individuals other than the data subject and concluded that if the controller thinks that information relating to those other individuals should be withheld then it must disclose that information to the court or regulator so that they can decide, on balance, whether the controller was correct. If the court or regulator agrees with the controller’s assessment, then, according to the advocate general, they may authorise or direct withholding in order to protect third party rights. If the court or regulator considers that disclosure of third-party personal data is necessary, then they can authorise or direct the controller to disclose it – insulating the controller from any claim by the third party.
“The advocate general considered that the case law concerning third-party personal data rights could be ‘fully transposed’ to the situation in which the issue is not third-party rights but trade secrets,” Dowden said. “The advocate general therefore considered that if a controller believes that certain information may be withheld as a trade secret, it must nonetheless be disclosed to the court or supervisory authority so that they can decide, on balance, whether to authorise withholding or to direct disclosure. The CJEU judges reached the same view.”
“What needs to be taken into consideration”, added Nils Rauer, a specialist in AI, data and intellectual property law at Pinsent Masons, “is the fact that article 15(1)(h) of the GDPR is not meant to simply provide a glance into the ‘engine compartment’ of AI systems – the provision is about checking the correctness of the data and the legitimacy of the processing.”
“Also, the article must be read in context, notably with article 22(3) of the GDPR, which concerns the need to suitably safeguard the rights, freedoms and legitimate interests of data subjects. The data subject needs to be enabled to actually express his or her point of view and to contest the decision. Therefore, it is not the algorithm as such that needs to be made of glass, it is – as explicitly stated in article 15(1)(h) GDPR – ‘the logic involved’,” he said.
In recognition of this concept, the CJEU explicitly refers to the requirement of making transparent “all relevant information concerning the procedure and principles” relating to the use, by automated means, of personal data with a view to obtaining a specific result. The procedure and principles are not to be understood as a reference to technical algorithms. In this context, Rauer pointed to the court’s reference to article 12(1) GDPR.
“The judges emphasise that in order to ensure that the data subject is able fully to understand the information provided to him or her by the controller, article 12(1) requires the controller to take appropriate measures, among other things, to provide the data subject with those data and information in a concise, transparent, intelligible and easily accessible form, using plain and clear language. So, it is neither about getting access to algorithms, nor about explaining each and every step of often highly complex decision-taking processes,” he said.