The great intellectual property paradox

When pirates meet robots, things get complicated

KARACHI:

Here's a modern moral riddle for the post-bookstore, post-truth, post-everything age: If a machine trained on pirated novels writes a bestselling story, does anyone own it? And if it makes money, who gets paid?

As the cultural world grapples with the ever-expanding footprint of artificial intelligence (AI), two camps have emerged, often with overlapping memberships: those who decry the use of artists' and writers' work to train AI without consent, and those who passionately defend access to pirated academic and literary materials in the name of equity and anti-capitalist resistance. What do we make of the contradiction?

Z-Library, LibGen, Anna's Archive and the likes of them have long been the darlings of underfunded students, overworked researchers, and book-starved readers in the Global South. They offer vast repositories of paywalled knowledge and commercial literature for free. In late 2022, when the FBI took down Z-Library and arrested its alleged founders, a wave of online outrage followed, not because people supported copyright infringement per se, but because it laid bare a deeper issue: the inaccessibility of knowledge in a commodified system.

Yet fast forward to 2025, and the outrage has flipped. AI companies are scraping the same pirated datasets (books, journals, news articles) to train large language models. This includes OpenAI, Meta, Google, and likely any company whose model seems suspiciously well-read. The reaction has been widespread indignation: protests, copyright lawsuits, authors furious that their work was ingested without permission. And understandably so.

But here's the twist: some of the same people who once romanticised the digital Robin Hoods of Anna's Archive are now drawing sharp lines in the sand against AI. It's worth asking what changed.

Meet your new librarian

AI tools like ChatGPT, Claude, and Perplexity are often trained on datasets that include pirated or publicly scraped material. That means your out-of-print poetry book, your obscure research paper, or your lovingly crafted fan fiction might now be part of a model's brain. The moral offence for many writers isn't just that they weren't paid; it's that they weren't asked.

In April, The Guardian reported on the UK's plans for a collective licence to ensure authors are compensated for their work used in AI training. It's a step toward recognising that writing is labour. But here's where things get messy: if AI must pay to train on data, we have to reckon with what that means for the legitimacy of free-access archives like Z-Library. We can't champion equitable access while condemning models trained on that same material.

This dilemma isn't just for writers. Last month, an AI-generated video mimicking the art style of Studio Ghibli went viral, prompting backlash from fans and the studio itself. Hayao Miyazaki, famous for his staunch opposition to AI-generated art, once called machine-generated imagery "an insult to life itself."

But let's not forget that much of Ghibli's global fandom was born on pirated DVDs, torrents, and unauthorised fan subs. The same applies to early anime culture, punk zines, indie comics, and underground music. Culture has always moved through shadow economies before it becomes mainstream.

Anti-capitalism with caveats

What this all reveals is a hierarchy of theft. It feels noble to pirate for the sake of access. It feels extractive to pirate for the sake of automation. The difference, many would argue, is power. AI companies are not poor students or struggling researchers. They're multibillion-dollar entities capitalising on collective cultural labour.

This power imbalance is real, but it doesn't resolve the paradox. If we believe information should be free, what limits do we place on that freedom? If we believe artists deserve consent and compensation, how do we justify the informal economies that brought many of us to art in the first place?

In a particularly surreal twist, the BBC announced last month that it had used AI to simulate writing classes from Agatha Christie and other long-dead authors. While the broadcaster insists it used only public domain material, the uncanny effect of conjuring an author's voice without their consent sparked a minor literary panic.

Here, it's important to ask what it means to preserve legacy in the age of simulation. Since AI can remix a dead author's style into a monetisable product, we have to be clear on whether it counts as homage or exploitation. And again, what if those same Christie novels were your gateway to literature, discovered on a secondhand hard drive or a dodgy file-sharing site?

AI has made this contradiction harder to ignore: we want art and knowledge to be free and sacred. We want machines to be ethical, but we're fine when people break the rules for the right reasons. We bristle at corporate extraction, but applaud individual resistance. We believe in fair pay, except when we can't afford it.

The truth is, intellectual property has always been a slippery concept. Shakespeare was a remixer. Folk tales were anonymous. Hip hop was built on sampling. Piracy made Hollywood nervous but also made it global. And now, AI is here, remixing everything with the cold neutrality of math.

So what do we do?

That depends on what you believe culture is for. If you believe art is a commodity, then AI training must come with licensing fees and permissions. If you believe art is a commons, then access matters more than ownership. If you believe art is both, well, welcome to the mess.

The call for a collective licence in the UK is promising, but it's not a solution for everyone. Especially not for writers outside the Anglosphere, whose work may be scraped but who may never see a dime. And it's not a solution for what happens to the culture downstream: the classroom relying on pirated PDFs, the rural library using a chatbot trained on pirated data, the next great writer raised on illegal ebooks.

We are in a moment of reckoning where two long-standing ideas collide: that creativity is sacred and that knowledge should be free. AI didn't create this contradiction. It just forced us to face it.

We cheer when piracy democratises access, jeer when AI exploits the same, and now we're stuck in an intellectual property paradox. Whichever side you lean toward, the question is no longer who owns culture, but who gets to use it, and why.