
Out-Law Analysis

How a US AI ruling could influence UK copyright policy


Sir Paul McCartney is a campaigner for content creator rights. Jim Dyson/Getty Images


A recent ruling by a district court in the US state of Delaware could have a bearing on how UK copyright law is modernised to reflect the training and use of AI models.

The US ruling was cited in the UK parliament recently as some MPs expressed concern that the UK government, in contemplating how it can better support the AI industry to grow in the UK, risks doing irreparable damage to the UK’s creative industries.

The question of how best to strike the balance between enabling AI developers to train their models on high-quality data, while ensuring content creators can control how their works are used and extract value from their use in AI training, has been polarising. Talks over the creation of a voluntary AI copyright code in the UK broke down last year after the government at the time could not foster consensus between stakeholders on both sides of the debate. The fundamental disagreement on how copyright law should be interpreted in the context of AI training and use has also been reflected in increasing litigation.

Amidst pressure from content creators to address what they claim is the unfair exploitation of their works by AI developers under the status quo, the new Labour government opened a consultation on copyright reform. However, the proposals it has expressed a preference for have alarmed creative industry representatives and spurred some UK lawmakers to pressurise the government into including AI-related copyright protections within a Bill already before parliament that was designed for an entirely different purpose – the Data (Use and Access) Bill (DUAB). The government is seeking to resist those efforts and legislate on AI and copyright separately.

However, as the DUAB moves to an important next stage of the legislative process, there is no sign of AI and copyright issues going away – with creative industry representatives engaging in a coordinated campaign on the final day of the government’s consultation today, and the Delaware ruling also fuelling discussion. Below, we explore why some MPs believe the judgment moves the dial in the debate over AI-related copyright reform in the UK and outline why a different perspective on the divisive issue is needed.

The Delaware case

Media business Thomson Reuters has sued Ross Intelligence Inc. for alleged copyright infringement. Ross Intelligence built a legal research search engine that uses AI. The product uses training data that contained headnotes published by Westlaw. Those headnotes summarise key points of law and details of court determinations on them. Thomson Reuters, which owns the copyright for Westlaw works, previously refused Ross Intelligence a licence to use its content to train its rival product and claims that Ross Intelligence is responsible for copyright infringement by using the Westlaw notes in the training of its product without permission to do so.

Before the question of the validity of the copyright claimed by Thomson Reuters goes on to be considered by a jury, the Delaware district court judge ruled on whether Ross Intelligence had any defence to Thomson Reuters’ claims of infringement. Among other things, the judge held that Ross Intelligence could not rely on the ‘fair use’ limitation under US copyright law to defend against the claims.

To understand why this US ruling is being seen as relevant to the debate on AI copyright reform in the UK requires consideration of the context in which reform is being pursued and the reasons for it.

AI at the heart of the UK’s growth agenda

The Labour government was elected last year on a mandate to pursue certain ‘missions’ – including a mission to kickstart economic growth and secure the highest sustained growth of any G7 country. It sees harnessing the potential of AI and the AI industry as core to this objective. There is evidence for this across different initiatives in recent times.

For example, the government has been keen to shift the perception that it views matters pertaining to AI use and development through the safety lens only. It wants more emphasis to be on encouraging investment by AI developers and adoption of the technology across the economy. This reflects the pivot taking place globally, as evidenced at the recent Paris AI action summit. The recent renaming of the UK’s AI Safety Institute, to the AI Security Institute, was a further signal to the market on this point by the government.

The rapid development of AI was cited as an opportunity for the UK in the government’s industrial strategy last year – a paper in which it confirmed digital and technologies as one of eight “growth-driving sectors” it plans to prioritise support for. The government followed this up last month by endorsing the recommendations of the AI opportunities action plan it commissioned tech entrepreneur Matt Clifford to draw up – a plan that envisages stronger support for innovation from the UK regulatory system and the creation of ‘AI growth zones’ across Britain where development of data centres is to be enabled.

The copyright question and UK consultation

Also included in the AI opportunities action plan was a recommendation that the government reform the UK’s text and data mining regime. Text and data mining (TDM) is the use of automated computational techniques to analyse and copy large amounts of information to identify patterns, trends, and other useful information. 
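The TDM process described above can be sketched in a few lines of code. The following is a purely illustrative Python example – the toy corpus and function name are our own, not drawn from any real TDM pipeline – showing the essence of the technique: automated analysis of a body of text to surface frequency patterns.

```python
from collections import Counter
import re

# Toy corpus standing in for a set of lawfully accessed documents (illustrative only).
documents = [
    "Copyright law protects original creative works.",
    "AI models are trained on large volumes of creative works.",
    "Text and data mining extracts patterns from large text collections.",
]

def mine_terms(docs):
    """Tokenise each document and tally term frequencies across the corpus."""
    counts = Counter()
    for doc in docs:
        counts.update(re.findall(r"[a-z]+", doc.lower()))
    return counts

# The most frequent terms hint at the kinds of patterns a real TDM
# pipeline would extract at far greater scale.
print(mine_terms(documents).most_common(3))
```

Real-world TDM operates on the same principle but at the scale of millions of works, which is why lawful access to, and copying of, those works is the legally contested step.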

Section 29A of the Copyright, Designs and Patents Act 1988 sets out a text and data mining exception to copyright. In an AI context, this provides that TDM will not infringe the copyright in a copyright protected work provided an AI developer has lawful access to the work – for example, via a subscription or permission in terms and conditions – and the TDM is undertaken for the purposes of non-commercial research.

As we have highlighted previously, the fact the TDM exception is limited to non-commercial purposes is problematic in the context of AI, as most transformational AI is developed with a view to future commercialisation. In addition, the exception only applies to works protected by copyright and does not extend to database rights.

Despite the current limitations with the TDM exception, content belonging to rights holders has been used in the training of AI models, commonly without a licence. Rights holders have objected to this, claiming the activity constitutes infringement, but AI developers argue that the activity is permissible under the law as it stands. Matters are being put to the courts to determine, including in the UK, where a trial is due to take place before the High Court in London this summer in the case of Getty Images v Stability AI. The multi-faceted defence Stability AI has raised to the claims against it provides an insight into how it views different aspects of UK copyright law relating to the way its generative-AI system was trained and the output it produces.

The unresolved issues being explored in litigation in the UK and elsewhere create uncertainty, which Clifford said in his report is hindering innovation and undermining the UK’s AI ambitions as well as the growth of the UK’s creative industries – one of the other “growth-driving sectors” specified in the government’s industrial strategy. A report by the House of Lords Library estimated that the creative industries contributed £124 billion in gross value added (GVA) to the UK economy in 2023; more than 5% of the GVA of the whole UK economy.

Clifford said the situation has “gone on too long and needs to be urgently resolved”, adding that the UK “is falling behind” the EU on the issue. Clifford said: “The EU has moved forward with an approach that is designed to support AI innovation while also enabling rights holders to have control over the use of content they produce.”

In response, the government said it will “act to ensure that we have a competitive copyright regime that supports both our AI sector and the creative industries”.

Tangible action, as far as the Labour government is concerned, started late last year when the government opened a consultation on copyright and AI.

Among other proposals, the government set out a range of potential options for legislating around TDM. The government said it does not favour requiring AI developers to obtain an “express licence” in all cases where they wish to train their models on copyrighted works in the UK. Instead, its preference is to recalibrate the existing TDM exception to facilitate the training of AI models using copyrighted material for commercial use – but only if rights holders do not opt their content out from being used in that way. In tandem with this, the government wants there to be greater transparency from AI developers over their use of copyrighted material to train AI.

The government’s consultation closes today.

The DUAB copyright hijack

At the same time as the UK AI copyright debate has been flaring, the government has been pressing ahead with separate plans to reform UK data protection law and enable data-related innovation and efficiencies across the UK economy. The DUAB was introduced into the UK parliament in October 2024.

Envisaged under the DUAB are reforms to some existing provisions of data protection law that are highly relevant to AI development and use. For example, some provisions envisage a relaxation of some restrictions applicable to automated decision-making, while others seek to expand the concept of ‘scientific research’ to include certain privately funded and commercial research activities, and not just non-commercial research as is the case currently.

The DUAB proposals have sparked significant discussion to do with UK AI policy in the UK’s parliament – including on the copyright question.

Following the launch of the government’s copyright and AI consultation, Baroness Kidron, who sits in the House of Lords, tabled amendments to the DUAB designed to build new AI-related copyright protections into the Bill. The government attempted to defeat those amendments in the Lords but was unsuccessful, and the Bill passed over to the House of Commons for further scrutiny with the amendments intact.

Baroness Kidron’s amendments, if adopted, would require the government to introduce new regulations relevant to operators of “web crawlers” and general-purpose AI models. 

Those rules would require the operators to comply with UK copyright law, where their services have links to the UK, “regardless of the jurisdiction in which the copyright-relevant acts relating to the pre-training, development and operation of those web crawlers and general-purpose AI models take place”. The operators would also face new duties of transparency – including to disclose information regarding text and data used in the pre-training, training and fine-tuning of general-purpose AI models – as well as a requirement to ensure that “the exclusion of a crawler by a copyright owner does not negatively impact the findability of the copyright owner’s content in a search engine”.

The UK’s Information Commissioner’s Office would be tasked with enforcement of the new regime.

In a recent interview with the Guardian, Baroness Kidron described the government’s copyright and AI consultation as “fixed and inadequate”. She said her amendments, if adopted into law, would “mandate that companies have to account for where and when they take the material and make it transparent”.

The second reading of the DUAB in the House of Commons took place earlier this month. The copyright question once again dominated discussion, despite government pushback.

The SNP’s Pete Wishart MP accused the government of seeking to “water down” the UK’s copyright regime. He described Baroness Kidron’s amendments as “the creative industries’ safeguard and guarantee in the face of an almost existential threat to their ability to sustain themselves”.

Victoria Collins, Liberal Democrats MP, said her party opposes the “opt-out system” envisaged in the government’s preferred option for reform. She said: “It is easy for those creatives to opt in, whereas opting out is harder, especially for smaller businesses or creatives in their own right.”

John Whittingdale, a Conservative MP and former culture secretary, explained that the government he was formerly part of had rejected proposals that are now under consideration “precisely because we felt it would drive a coach and horses through copyright law and do real damage to the creative industries”.

Whittingdale said Baroness Kidron’s amendments “put in clear terms what we believe the law is already” and said the opt-out system under consideration by the government “reverses the whole principle of copyright law”.

Fair use versus fair dealing

It was, however, a member of the governing Labour party, backbench MP Alison Hume, who first referred to the Thomson Reuters infringement ruling in Delaware during the debate. She described the ruling as “the first pure AI training case decided in the US”.

Hume went on to express what she described as the “worry” of the creative industries “that the government’s preferred position of creators of original material opting out of having work scraped is not workable because no such model currently exists anywhere in the world”.

Whittingdale also referenced the Delaware judgment, describing it as “important”. He implied that the ruling undermined an apparent purpose of the reforms the government is pursuing – to support AI development at least as effectively as happens in the US. He said: “Previously we had been told that America was ahead in encouraging and promoting the use of this technology. It is reassuring that even in America, they recognise the importance of protecting creative works.”

At the heart of Whittingdale’s point is how differences between UK and US copyright law are sometimes perceived.

Relevant to the Thomson Reuters case, for example, the concept of ‘fair use’ in the US is broadly considered to be more liberal than the equivalent concept of ‘fair dealing’ in the UK.

Under the UK’s Copyright, Designs and Patents Act 1988, there are certain very specific situations where copying may be permitted without permission from the content owner. Fair dealing with a work for the purpose of criticism or review, of that or another work or of a performance of a work, does not infringe any copyright in the work provided that it is accompanied by a sufficient acknowledgement. Another example where copyright infringement may be avoided is where limited extracts of works are copied for non-commercial research or private study and the work that is reproduced is accompanied by a sufficient acknowledgement. Such use is, however, only permitted when it is ‘fair dealing’ – copying the whole work would not generally be considered fair dealing. Courts in the US have tended to consider more extensive uses of copyright works to constitute ‘fair use’ than UK courts have in relation to ‘fair dealing’.

Given the Delaware court’s ruling that ‘fair use’ could not be relied upon by Ross Intelligence, Whittingdale is in essence querying whether the government needs to go as far with its reform to match the more liberal US system in order to produce a level playing field for AI development in the UK.

This is a question that the government is likely to face again as the DUAB moves to the committee stage within the House of Commons – the government has asked for the committee to submit its report by 18 March – and as it considers next steps from its copyright and AI consultation.

Achieving a ‘win-win’ on copyright and AI

Speaking during the second reading debate in the Commons, Peter Kyle, secretary of state for science, innovation and technology, said he has “no intention whatsoever of taking away any rights” from the creative industries “without any consultation”. He said the government is listening to their voices but that the government also wants to ensure that it builds on the fact the UK is the third largest AI market in the world and does not want to see tech talent have to go elsewhere “to seek opportunities to exploit their talent and potential as individuals”.

According to another government minister, Chris Bryant, who is responsible for data protection and telecoms policy, while the government wants a debate about AI and copyright, “it feels odd to be doing a bit of that” in the DUAB. He said that while the government wants to achieve more licensing of content, it also believes “a win-win” for AI developers and the creative industries is possible. However, he said whatever option for reform the government chooses is likely to be pursued in a “full, stand-alone Bill”.

Bryant said: “That may include elements of what Baroness Kidron has put in, elements from elsewhere or, for that matter, bits of the [EU] copyright directive … which the former government helped draft [pre-Brexit] and then did not incorporate into UK law. It might be a whole series of different things, but it needs to be considered in the round.”

The UK government is not the only government around the world grappling with whether and how copyright law should be updated for the AI age. In Canada, the government earlier this month reported on feedback received to an earlier consultation it ran on copyright in the age of gen-AI. It acknowledged the polarised debate, including over how the TDM exception to copyright should operate in the AI age, and said it will “consider options to bring more clarity into the marketplace, and examine how a balanced copyright approach to TDM activities could support the rights of creators while fostering Canadian innovation in an evolving global context”.

As policymakers globally contemplate how to respond to the AI and copyright debate, they should keep in mind the interdependencies between the opposing sides of the debate.

The very AI industry the UK government wishes to support requires access to good quality data to train its AI models – without it, the AI systems put into the public domain will be less accurate, less effective, and not deliver fully on their promise. The economic, social, health and other improvements the government is seeking to facilitate through its support for AI will not materialise if content creators withdraw their material to avoid it being copied. Those content creators in turn risk losing out on valuable additional potential revenue from use of their works in AI training and output.

A system that incentivises rights holders to make their works available to AI developers for a fair price is therefore in the interests of both the AI and creative industries.
