CJEU Ruling in SCHUFA Has Implications Beyond Credit Scoring

The European Court of Justice’s opinion that credit scoring constitutes automated decision-making under GDPR Article 22(1) has implications well beyond credit scoring. The court’s reasoning “to fill a legal gap” implies that risk scores produced for business processes such as fraud detection and identity verification are also automated decisions. It suggests that controllers will need to obtain consent before calculating creditworthiness or performing other types of algorithm-based scoring used in a wide variety of business processes.

The court’s opinion is inconsistent with modern data analytics and well-established credit scoring practices, and it may be at odds with the evolving role analytics-driven decision-making plays in many aspects of life. These analytic processes reflect the concepts of “thinking with data” and “acting with data.” Thinking with data is the robust use of data to create new insights; using those insights to affect individuals is acting with data. Although a credit score relates to a particular individual, until that score is used by a lender (acting with data), the score itself has no impact on the individual. GDPR Article 22 concerns only acting with data. The CJEU overlooks the distinction between thinking and acting with data in order to reach a broad interpretation of the term “decision” in GDPR Article 22(1).

Big data was barely understood, and complex analytics were in their infancy, when the GDPR was adopted in 2016. The GDPR is intended to be technology neutral in many respects, but it has some gaps when it comes to regulating advanced analytics. Based on information contained in the order for reference, the court in SCHUFA determines that the score is an automated decision for the purposes of GDPR Article 22(1) in order to fill a legal gap: the data subject cannot obtain access to meaningful information about the logic involved in the score established by a credit information agency from the financial institution to which the data subject applied for a loan, and the credit information agency is not obliged to provide that information. In our view, no such gap exists in the GDPR, but even if it did, the court should not have presumed the nature of the relationship between the credit information agency and the financial institution. In doing so, the CJEU reaches an incorrect decision.

The GDPR does address how to obtain access to the information at issue here. Under GDPR Article 28, controllers and processors enter into agreements that require the processor to assist the controller in responding to such access requests. Data subjects can therefore obtain meaningful information about the logic involved in automated decision-making from the controller, the bank.

The central issue in the case is: what is the relevant decision? The act by which a bank agrees or refuses to grant credit to the applicant? Or the act by which SCHUFA derives the score from a profiling procedure? The court recognizes that the answer depends on the facts of each case. The problem with the opinion is that the court goes on to make a series of incorrect presumptions about how credit scores are applied in order to conclude that the credit score is the decision. Ultimately, because of the fact-driven nature of the inquiry, the court’s decision may not matter much in the financial services industry. However, the broad holding the court reasoned it should reach, because the GDPR contains no legal definition of the term “decision,” has much broader implications for other industries and sectors.

For example, scoring is used in retail transactions to identify fraud. Machine-learning systems score ecommerce transactions in real time by analyzing factors such as device information, IP address, and location in order to flag potential fraud. If a customer usually pays with a credit card but suddenly switches to a different payment method, that may indicate the account has been compromised, and a real-time notification is sent. A minimal sketch of this kind of scoring appears below.
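To make the example concrete, here is a minimal Python sketch of rule-based risk scoring. The signals, weights, and alert threshold are illustrative assumptions only; production systems typically rely on trained machine-learning models rather than fixed rules like these.

    # Illustrative fraud-risk score for an ecommerce transaction.
    # All signals, weights, and the threshold below are hypothetical.

    def fraud_risk_score(txn: dict, profile: dict) -> float:
        """Return a risk score between 0 and 1 for one transaction."""
        score = 0.0
        if txn["device_id"] not in profile["known_devices"]:
            score += 0.3  # unrecognized device
        if txn["ip_country"] != profile["home_country"]:
            score += 0.3  # IP geolocation mismatch
        if txn["payment_method"] != profile["usual_payment_method"]:
            score += 0.2  # sudden change in payment method
        if txn["amount"] > 3 * profile["avg_amount"]:
            score += 0.2  # unusually large purchase
        return min(score, 1.0)

    profile = {
        "known_devices": {"dev-123"},
        "home_country": "DE",
        "usual_payment_method": "credit_card",
        "avg_amount": 80.0,
    }
    txn = {
        "device_id": "dev-999",
        "ip_country": "US",
        "payment_method": "gift_card",
        "amount": 400.0,
    }

    score = fraud_risk_score(txn, profile)
    if score >= 0.5:  # hypothetical alert threshold
        print(f"risk score {score:.2f}: send real-time notification")

Note that computing the score is only thinking with data; it is the merchant’s act of blocking the transaction or alerting the customer that affects the individual.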

Another example is in healthcare. We are all familiar with the scores we receive with our blood test results. Are those decisions? A number determines whether a result indicates diabetes or not. If the doctor relies solely on the score, is the blood test result an automated decision?
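As a concrete illustration, consider the HbA1c blood test, which is read against the standard American Diabetes Association cut-offs (below 5.7% normal, 5.7% to 6.4% prediabetes, 6.5% and above diabetes). The sketch below simply encodes that mechanical mapping.

    # Classify an HbA1c lab value using the standard ADA cut-offs.
    # The mapping from number to label is entirely mechanical.

    def classify_hba1c(percent: float) -> str:
        if percent < 5.7:
            return "normal"
        if percent < 6.5:
            return "prediabetes"
        return "diabetes"

    print(classify_hba1c(6.8))  # -> "diabetes"

Under the court’s logic, the question is whether that mechanical score, rather than the clinician’s use of it, is the “decision.”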

In the SCHUFA case, if the court determined that there is a gap in the GDPR because the data subject cannot obtain access to meaningful information about the logic involved in automated decision-making from the bank, since the credit bureau, not the bank, holds that information, then the court should simply have interpreted the law rather than made new law. This judicial activism is unwarranted, particularly when the EU AI Act, which governs credit scoring, will come into effect soon.

While banks and credit information agencies may be able to distinguish SCHUFA on their facts, the court’s ruling has implications for other businesses providing AI-based or other analytical scoring.

The IAF policy analysis is here. 

Lynn Goldstein