Out-Law Analysis | Reading time: 4 min.

GDPR ruling has commercial implications for credit reference agencies


A recent ruling by the EU’s highest court could force credit reference agencies to make changes to their business models.

Depending on the way credit reference agencies currently process personal data, the Court of Justice of the EU (CJEU) ruling could increase their compliance burdens – not only because it could require them to disclose more information to the people they profile about the way they build credit profiles, but also because, at a more fundamental level, it could force them to rethink how they build those profiles.

Credit assessments deemed to be ‘decisions’

Credit reference agencies play an essential role in enabling businesses of all kinds to make decisions on whether to extend credit to consumers, by assessing those individuals’ creditworthiness and risk of defaulting on payments. Typically, this assessment is provided to the businesses they serve in the form of a credit score. To inform their assessment, credit reference agencies will typically review a wide range of data about the consumer – data sourced directly from the consumer and data gathered from third-party sources. The processing of that data and profiling of consumers is subject to data protection law.

In the context of decisions on whether consumers are provided with credit, credit reference agencies see themselves as intermediaries in the process. The decisions, as they see it, are taken by the businesses to which they provide the consumers’ credit ratings.

The distinction is important because of what EU data protection law provides.

Article 15 of the GDPR gives data subjects the right to access the personal data controllers hold about them, along with – where they have been subject to solely automated decision-making – “meaningful information about the logic involved” and “the significance and the envisaged consequences of such processing” for them.

Article 22 of the GDPR goes further – it provides people with a general right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning them or similarly significantly affects them. That right does not apply if the decision: is necessary for entering into, or performance of, a contract; is authorised by EU or member state law; or is based on the individual’s explicit consent.

What the CJEU has said

In December 2023, the CJEU considered the extent to which article 22 of the GDPR was engaged by the activities of the German credit reference agency SCHUFA.

SCHUFA establishes a prognosis of a person’s likely future behaviour based on that person’s characteristics, using mathematical and statistical processes. It groups people with others who share similar characteristics and have behaved in a similar way in order to predict behaviour. An individual who had been refused a loan based on a credit score SCHUFA had shared with the would-be lender challenged SCHUFA’s decision-making process.
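The ruling describes SCHUFA’s method only at that level of generality. Purely as an illustration of the kind of grouping-based prognosis described – and not as a representation of SCHUFA’s actual model – the short Python sketch below scores a hypothetical applicant against the observed default rate of the most similar past individuals; every characteristic, weighting and the 0–1000 scale is an assumption chosen for the example.

# Illustrative sketch only: a grouping-based credit prognosis of the general
# kind described above. All characteristics, weightings and the 0-1000 scale
# are hypothetical assumptions, not SCHUFA's actual model.
from dataclasses import dataclass
from math import sqrt

@dataclass
class Record:
    age: float               # hypothetical characteristic
    income: float            # hypothetical characteristic
    open_credit_lines: int   # hypothetical characteristic
    defaulted: bool = False  # observed past behaviour

def distance(a: Record, b: Record) -> float:
    # How "similar" two people are, measured over their characteristics.
    return sqrt((a.age - b.age) ** 2
                + ((a.income - b.income) / 1000) ** 2
                + (a.open_credit_lines - b.open_credit_lines) ** 2)

def predicted_default_rate(applicant: Record, history: list[Record], k: int = 5) -> float:
    # Group the applicant with the k most similar past individuals and use
    # that group's observed default rate as the prognosis of future behaviour.
    nearest = sorted(history, key=lambda r: distance(applicant, r))[:k]
    return sum(r.defaulted for r in nearest) / len(nearest)

def credit_score(applicant: Record, history: list[Record]) -> int:
    # Map the predicted default rate onto a 0-1000 score (higher = lower risk).
    return round((1 - predicted_default_rate(applicant, history)) * 1000)

history = [
    Record(30, 28000, 2, False), Record(45, 52000, 1, False),
    Record(23, 18000, 4, True),  Record(51, 61000, 0, False),
    Record(35, 24000, 5, True),  Record(29, 31000, 3, False),
]
print(credit_score(Record(31, 27000, 3), history))  # the lender sees only this number

Whatever the precise model, the relevant point for the legal analysis is that the output is a single number on which the lender’s eventual decision may, in practice, turn.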

The CJEU determined that the refusal of an online credit application does engage article 22 and that SCHUFA’s activity did constitute profiling. It reached that view after determining that the output of SCHUFA’s activity should be regarded as a ‘decision’ and that it had engaged in automated individual decision-making, because it played a “determining role” in the eventual outcome. In essence, a negative credit rating would inevitably mean that the final decision would be a “no”. In reaching that decision, the CJEU was also influenced by the fact that SCHUFA would be best placed to provide meaningful information on how the decision had been reached.

The CJEU built on that case law last month in a case referred to it from Austria involving credit reference agency Dun & Bradstreet.

In that case, a mobile phone provider refused a contract to a consumer. That “decision” was driven by a credit assessment provided by Dun & Bradstreet. Rather than being treated as a factor feeding into the mobile phone provider’s final decision, the outcome of the Dun & Bradstreet credit assessment was regarded as a decision in its own right and, crucially, one made by solely automated processing. As a decision made by solely automated processing, and as one having a legal effect on the data subject – the refusal of a contract – the decision engaged the requirement to provide ‘meaningful information’ as to the logic involved.

In that regard, the CJEU said the information Dun & Bradstreet has to provide to the consumer must explain – in a concise, transparent, intelligible and easily accessible form – the procedure and principles it actually applied to build their profile, except to the extent that the information it would have to disclose constitutes a trade secret, in which case further complexities arise.

Implications of the rulings

It is common for credit reference agencies to seek to draw a distinction, in their contracts for the provision of credit scores or similar assessments, between the information they provide and the decisions taken using that information. Those contracts have commonly included clauses stating that the assessment is intended to serve only as one factor feeding into a decision that is to be made by the lender, insurer or seller who requested it. Some clauses go further and require the final decision to be made with a “human in the loop”.

However, where such clauses are in place but it can be shown that the end “decision” always follows the credit score and that there is, in reality, little or no human involvement in that final decision, there is a risk for credit reference agencies that contract clauses will not, in themselves, guarantee that their service will not be regarded as a “decision”.

One solution might be for credit reference agencies to monitor and audit decision-making by their customers. However, that is highly unlikely to be commercially viable or contractually acceptable. Another approach might be for the credit reference agencies themselves to insert a “human in the loop”, whether reviewing all decisions or a sufficiently meaningful sample. That, however, would introduce additional steps into the process and could undermine the viability of the business model.

The question of whether provision of a credit rating or assessment is a “decision” or a mere “recommendation” or factor feeding into a decision is, therefore, one of considerable commercial significance. In the case of Dun & Bradstreet, the referring Austrian court accepted, and the CJEU proceeded on the basis, that the credit score was a “decision”. The practical question for the sector is whether that will be the default position, or whether legally viable and cost-effective measures can be deployed to ensure that the ratings credit reference agencies provide do not constitute a “decision” or otherwise fall within the rules applicable to automated decision-making under the GDPR.

The issue is as relevant for credit reference agencies in the UK to consider as it is for those in the EU, given that the relevant UK GDPR rules currently mirror those under the EU GDPR.
