Out-Law News
15 Jan 2025, 10:33 am
Plans to force AI developers to disclose what content they use to train their AI models could support a new wave of mass claims litigation by content creators, an expert in intellectual property law has said.
Gill Dennis of Pinsent Masons was commenting after the High Court in England and Wales ruled to prevent a US company acting on a representative basis for thousands of content creators whose copyrighted works have allegedly been infringed by AI developer Stability AI.
The proposed representative action arose in the context of ongoing litigation between Getty Images and Stability AI, a London-based AI developer behind a range of generative AI systems, including its ‘Stable Diffusion’ system, which automatically generates images from text or image prompts input by users.
Getty Images has claimed that Stability AI is responsible for infringing its intellectual property rights, both in how Stability AI has allegedly used its images as data inputs to train and develop Stable Diffusion, and in respect of the outputs generated by Stable Diffusion, which Getty claims are synthetic images that reproduce in substantial part its copyright works and/or bear Getty brand markings.
Getty has further alleged that Stability AI is responsible for secondary infringement of copyright, on the basis that Stable Diffusion constitutes an “article” that was imported into the UK without its authorisation when it was made available on the GitHub, HuggingFace and DreamStudio platforms, and for infringement of its database rights, trade marks, and the law of passing off. Stability AI rejects the claims. Trial hearings are expected to take place this summer.
Getty has alleged that more than 50,000 photographers and content contributors, who exclusively licensed their content to it over several decades, have had their rights in those works infringed by Stability AI. With support from Getty, one of those rights holders bid to raise a representative action on behalf of that entire group, against Stability AI.
The Civil Procedure Rules in England and Wales provide courts with the power to permit a representative claim to be pursued where it can be shown that all members of the relevant group, or class, to be represented have “the same interest in a claim”.
In this case, Stability AI was successful in applying to the High Court for an order preventing Washington-based content creation and licensing business Thomas M Barwick Inc. from raising a representative action on behalf of the 50,000-plus photographers and content contributors said to be affected by Stability AI’s alleged infringement. Thomas Barwick is a director of Thomas M Barwick Inc and claims to have made thousands of images and videos available through Getty Images websites. He has assigned rights in those works to Thomas M Barwick Inc.
Mrs Justice Joanna Smith, the judge in this case, considered that Thomas M Barwick Inc’s application for permission to raise a representative action could not succeed, because the way it defined the class it sought to represent was premised on the notion that the individuals’ copyright had been infringed by Stability AI – this being something that could only be determined at trial.
The judge also considered the fact that there was no definitive list of copyrighted works used to train Stable Diffusion and therefore “no way at present to identify the members of the class”.
Notwithstanding those issues, the judge retained discretion to grant permission to Thomas M Barwick Inc. to pursue the claims on a representative basis. She considered, however, that the shortcomings in the class definition were such that the exercise of that discretion was not justified.
Mrs Justice Joanna Smith went on to consider an alternative proposal by Getty and Thomas M Barwick Inc. that they be given scope to pursue the claims against Stability AI without other content creators with concurrent rights of action being joined to the case. The judge rejected that proposal too.
“In my judgment … and given the potential ramifications for [Stability AI] of so many exclusive licensors subsequently ‘vexing’ [it] with fresh proceedings, the court is entitled to expect proper evidence addressing this point. Absent such evidence, I agree with [Stability AI] that this is simply not an application that the court can possibly accede to at this stage.”
The judge, however, said the procedural “difficulties” she had identified were “not intractable” and added that “a pragmatic way forward must be found which does not involve the joinder of 50,000 potential claimants”. She suggested that a way might still be found for a mass claim to be considered as part of the trial proceedings due to take place this June, and left it open to Getty and Thomas M Barwick Inc. to re-apply for permission to raise a representative action, adding that the making of such an order “in many ways … would make very good sense”.
"The judge's finding that the class definition did not identify a class with a common interest raises some real practical problems for content creators seeking to prevent the use of their works to train AI without their consent,” said intellectual property law expert Gill Dennis of Pinsent Masons.
“It is widely acknowledged that, at present, it is very difficult for content creators to prove that their works have been used as training data; even AI developers themselves do not always know with any certainty what data they have used for this purpose. This objection will, therefore, always be an obstacle to a representative claim where copyright infringement in the context of generative AI is alleged,” she said.
“Content creators are very unlikely to bring individual infringement claims because of the high litigation costs involved, undermining the value of their IP rights and effectively leaving them without a remedy,” she said.
"One of the proposals in the current UK government consultation on copyright and AI is to impose an obligation on AI developers to be more transparent around the training data that they use. If this proposal is accepted and implemented, it would solve the class definition issue encountered in this case,” Dennis said.