House of Lords blocks AI copyright grab, but is the fight over?
Michael Leidig
- Artificial Intelligence, Opinion & Analysis

Ongoing attempts to expand copyright exemptions for AI training will cripple original content creators and grassroots media organisations, warns Mike Leidig
In 2022, the UK government proposed expanding the text and data mining (TDM) exception to copyright law, allowing artificial intelligence (AI) developers to harvest editorial content and other original works without permission or payment. The aim was to enhance large language model (LLM) training and, with it, the UK’s AI competitiveness.
Text and data mining involves the automated analysis of large volumes of digital content to identify patterns, trends, and insights. AI companies rely on this process to train machine learning models, which improve their ability to generate text, images, and other forms of media. By broadening the TDM exception, the government intended to remove barriers that restricted access to vast datasets, theoretically accelerating AI research and development.
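For readers unfamiliar with what "mining" means in practice, the sketch below is a deliberately simplified illustration rather than a description of how any particular AI lab operates: it scans a tiny, made-up corpus of article snippets and counts recurring terms, the most basic form of the pattern extraction that, scaled up to billions of harvested documents, feeds the training of large language models.

```python
# A deliberately simplified illustration of text and data mining:
# normalise a small corpus of article text and count recurring terms.
# Real LLM training pipelines do this across billions of documents
# scraped from the web, including copyrighted editorial content.

import re
from collections import Counter

# Hypothetical corpus standing in for harvested article text.
corpus = [
    "The House of Lords voted to amend the Data (Use and Access) Bill.",
    "AI developers mine editorial content to train large language models.",
    "Publishers argue that mining copyrighted work without payment is unfair.",
]

def tokenise(text: str) -> list[str]:
    """Lower-case the text and split it into word tokens."""
    return re.findall(r"[a-z]+", text.lower())

# Aggregate term frequencies across the whole corpus: the most basic
# form of the 'pattern extraction' that TDM refers to.
term_counts = Counter(token for doc in corpus for token in tokenise(doc))

print(term_counts.most_common(5))
```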
Understandably, the proposal faced immediate backlash from the creative industries, which argued it would primarily benefit large tech firms at the expense of creators and media organisations. Writers, journalists, photographers, and artists feared that their copyrighted work could be freely used to train AI systems without any form of credit or compensation. This raised concerns about fairness and the sustainability of creative professions, as AI-generated content, built on unpaid labour, could eventually compete with human creators in the marketplace.
Comparisons were drawn to Section 230 of the Communications Decency Act in the United States, which originally aimed to foster social media growth but ultimately enabled platforms to profit from journalistic content without paying for it. Lord Black of Brentwood warned that such changes could devastate an already struggling media sector, where advertising revenue has shifted from publishers to tech giants like Google.
In response to widespread criticism, the government withdrew the proposal in early 2023. However, the issue resurfaced in January 2025 with the publication of the ‘AI Opportunities Action Plan’, which recommended reforms to make the UK’s TDM regime more competitive. This reignited concerns among authors, artists, and journalists, who feared that AI firms could exploit their work without compensation.
The House of Lords intervened, voting 145 to 126 in favour of amendments to the Data (Use and Access) Bill, requiring AI companies operating in the UK to respect existing copyright laws. These amendments, led by Baroness Kidron, underscored the need to protect creative works from unauthorised AI training. Matt Rogerson, head of policy at The Financial Times, called the government’s earlier proposal a “huge mistake”, reinforcing the view that copyright laws must adapt to protect content creators.
Against this backdrop, NewsX Community Interest Company (CIC), an independent journalism organisation that I founded, has launched legal action against Associated Newspapers, the publisher of Mail Online, over the misuse of journalistic content. NewsX alleges that Mail Online used high-quality images related to the death of Iranian woman Mahsa Amini—originally provided by a Farsi-speaking correspondent—without payment or attribution. Despite the images containing NewsX’s secret proof-of-work mark, Mail Online rejected claims for compensation.
This case highlights a broader issue: large media organisations increasingly rely on syndication agencies that neither employ journalists nor verify content, threatening the integrity and economics of news gathering. NewsX’s exclusive images, including uncropped versions that played a crucial role in global coverage of the Mahsa Amini protests, illustrate why proper credit and compensation matter.
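NewsX’s proof-of-work mark is proprietary and is not described in this piece, so the sketch below should be read purely as an illustration of the general idea of provenance marking, under invented names and assumptions: a newsroom registers a cryptographic fingerprint of each original frame together with a record of who supplied it, so a later claim of ownership can point to something verifiable.

```python
# Purely illustrative provenance registry, NOT NewsX's actual method.
# The idea: fingerprint the original image bytes at publication time and
# record who supplied them, so ownership can later be demonstrated.

import hashlib
from datetime import datetime, timezone

def register(image_bytes: bytes, correspondent: str, registry: dict) -> str:
    """Store a SHA-256 fingerprint of the image alongside who supplied it."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    registry[digest] = {
        "correspondent": correspondent,
        "registered_at": datetime.now(timezone.utc).isoformat(),
    }
    return digest

def look_up(image_bytes: bytes, registry: dict) -> dict | None:
    """Return the provenance record if this exact file was registered earlier."""
    return registry.get(hashlib.sha256(image_bytes).hexdigest())

# Hypothetical usage with placeholder bytes standing in for a photo file.
registry: dict = {}
original = b"...raw bytes of the uncropped frame..."
register(original, "NewsX correspondent", registry)
print(look_up(original, registry))            # provenance record found
print(look_up(b"different image", registry))  # None: no match
```

A byte-exact fingerprint like this would not survive cropping or recompression, which is precisely why a mark embedded in the image itself, of the kind NewsX describes, is more useful for tracing editorial work once it has been republished.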
While copyright law establishes ownership, proof-of-work ensures fair payment for editorial labour. If outlets only publish content when licences are readily available, many critical stories might never be told. The Lords’ amendments to the Data Bill send a strong message that innovation must not come at the cost of human creativity. However, the fight is far from over: the bill will return to the House of Commons for further debate, leaving the future of copyright protections uncertain.
One pressing issue remains: enforcement. Many AI firms train their models in jurisdictions with weaker copyright protections, making it difficult for creators to track unauthorised use. Without swift action, AI companies will have already absorbed vast amounts of copyrighted content, leaving little recourse for journalists and artists. Policymakers must act now to balance technological progress with protecting intellectual property rights, ensuring the UK remains a leader in both innovation and creative excellence.

Michael Leidig is a British journalist based in Austria. He was the editor of Austria Today, and the founder or co-founder of Central European News (CEN), Journalism Without Borders, the media regulator QC, and the freelance journalism initiative the Fourth Estate Alliance. He is vice chairman of the National Association of Press Agencies and the owner of NewsX. He also contributed a series of investigations that won the Paul Foot Award in 2006.
Main image: Courtesy ThisIsEngineering/Pexels