Out-Law / Your Daily Need-To-Know

Out-Law News

Irish proceedings remind firms of data protection obligations when developing AI

Grok. Photo by Jaap Arriens/NurPhoto via Getty Images


Upcoming High Court proceedings between the Irish Data Protection Commission (DPC) and X, formerly Twitter, will be hugely significant to businesses using personal data to train artificial intelligence (AI) systems, according to experts.

The DPC launched urgent High Court proceedings against X’s European subsidiary, Twitter International Unlimited Company, in relation to X’s AI-powered search tool, Grok. The action follows several complaints from consumer organisations citing a potential breach of the General Data Protection Regulation (GDPR), with the DPC seeking interim orders restraining X from processing EU and European Economic Area (EEA) users’ data to train Grok.

The DPC and X have now reached an agreement whereby X undertook to suspend processing the personal data contained in the public posts of EU and EEA users on the platform for the purposes of training its AI model. This undertaking will remain in place until the matter returns to the court in September 2024 for further directions.

Nicola Barden, data protection law specialist at Pinsent Masons, said: “These proceedings serve as a stark reminder to businesses that while AI systems will be subject to increased regulatory obligations following the commencement of the EU AI Act, they must be acutely aware of their existing obligations in respect of the processing of personal data under the GDPR. Irish businesses should review their current policies and procedures to ensure they are compliant with their existing legal obligations under the Data Protection Act 2018 and the GDPR and any new obligations under the EU AI Act.”

The DPC alleged that X had processed public posts of X’s EU and EEA users between 7 May and 1 August 2024 in violation of its Irish and European data processing obligations.

For the first time, the DPC invoked section 134 of the Data Protection Act 2018 (DPA) seeking interlocutory relief. Section 134 of the DPA allows the DPC to make an application to the High Court for an order suspending, restricting or prohibiting the processing of data where the DPC considers there is an urgent need to act to protect the rights and freedoms of data subjects.

Maureen Daly, intellectual property expert at Pinsent Masons, said: “Despite the advantages afforded by AI, it is essential that companies remain cognisant of their existing legal obligations, particularly under data protection law, of which X was reminded after being the subject of a legal challenge by the DPC.”

The evidential threshold required for the application was strongly contested by the parties. X stated that it had received the papers for the application only two days before the hearing and, as such, did not have sufficient time to formulate a statement of opposition to the DPC’s claims, describing these circumstances as a “fundamental concern”.

X submitted that, in order to properly defend the proceedings, it would need to adduce complex technical evidence to demonstrate that personal data had been used to train Grok without breaching the GDPR. It further stated that significant time and resources would be needed to go through the data and implement redactions if required, and that it was likely to need to instruct a technical expert to provide supporting evidence.

These claims were contested by the DPC, which submitted that, notwithstanding the intricacies of AI tools, there was no complexity to the data processing in question, and that it was for X to return to court with an accurate description of its current processing of personal data in order to discharge any court order.

X argued that it had processed only a small amount of data from the relevant users, stating that less than 0.001% of the public posts of EU and EEA users had been used for the purposes of training Grok. In response, the DPC stated that the act of processing, for the purposes of the GDPR, is much wider than, for example, simply storing data.

“The court did not comment on this and so it will be interesting to see how much weight it will afford to the quantity of data processed at the full hearing of the application,” said Barden.

X also argued that it had implemented mitigation measures for users; the DPC, however, argued that these were insufficient and were not fully in place until July. The mitigation measures related to the ability of a user to ‘opt out’ of having their personal data used to train AI models. X explained that these measures were in place before Grok began processing public posts, with some measures rolled out in September 2023 and enhanced mitigation measures, consisting of a more bespoke opt-out feature for users, implemented in July.

X submitted that these measures were adequate for the purposes of its obligations under the GDPR and contended that any delay between the rollout of Grok and the implementation of the enhanced mitigation measures was due to a technical error.

The DPC accepted that X’s measures since July were adequate but argued that these were not relevant for the purposes of the application as the unlawful processing had already taken place. It contended that the measures in place in May, when the DPC reviewed a model of Grok, were inadequate and so this had allowed X to unlawfully process the data for two months.

The court held that the enhanced mitigation measures needed to have been in place for the entire period in question, and that it was not sufficient for a company specialising in technology to attribute their incomplete implementation to a technical error. It stated that the measures should have been rolled out from the start and to all users, and that this, along with the fact that the data collected was still being used by Grok, supported the urgency of the application.

Barden said: “The timing of this case is interesting given the DPC’s guidance on AI and data protection was published in July. This could indicate that the DPC is seeing more complaints from data subjects in relation to AI and dealing with issues similar to those outlined in the proceedings against X.”
