IP Week @ SG: Experts debate AI authorship, copyright and creative ownership
01 September 2025
As artificial intelligence continues to redefine the boundaries of creativity, traditional notions of authorship and ownership are being tested like never before. The 2023 Sony World Photography Awards, in which an AI-generated image controversially won a category, served as a flashpoint – raising urgent questions about what it means to create in a world where machines can mimic, remix and even originate.
From Getty Images’ US$1.7 billion lawsuit against Stability AI over alleged copyright violations to the U.S. Copyright Office’s evolving stance on AI-generated works, jurisdictions around the world are grappling with how to assign credit and control in a machine-driven creative economy.
Who truly owns the output of an algorithm trained on millions of human-made works? And when AI becomes the artist, inventor or author, what happens to the human claim to creative ownership?
Legal scholars are actively debating how intellectual property law should evolve to address AI-generated content. Speaking at a plenary session titled “Beyond Boundaries: Code to Create” during IP Week @ SG 2025, Ryan Abbott, author and professor of law and health science at the University of Surrey and a mediator and arbitrator at JAMS Inc., addressed the complexities of assigning authorship in the age of generative AI.
Abbott referenced the United Kingdom’s decades-old provision for computer-generated works, which assigns authorship to the human who initiates the creation process. He pointed out that while the rule has existed since the 1980s, it remained largely unused until recent advances in generative AI brought it back into focus. “Thirty-five years ago in England, people thought computers were getting so good so fast… but it turned out, no one cared for 30 years,” Abbott said.
Panelists also addressed concerns over the surging volume of AI-generated content. Abbott noted that Amazon now caps self-published books at three per day per user, following a surge in AI-authored titles. Lin Wu, head of regional legal and ecommerce legal teams at TikTok, also shared insights on how the platform manages AI-generated content.
TikTok requires creators to label realistic AI-generated content, including videos, images, and audio that have been significantly altered or entirely created by AI. The platform uses both creator-applied labels and automated detection systems to tag such content, helping viewers distinguish between human-made and AI-generated media. TikTok also prohibits AI content that impersonates public figures, fabricates crisis events, or misleads viewers, and may remove content that violates these guidelines.
Despite efforts to regulate, global approaches remain fragmented. “It gets a bit metaphysical… the short answer is it’s all a bit confused right now,” Abbott said.
As platforms and policymakers race to adapt, experts agree that clearer, harmonized regulations will be essential to balance innovation with the protection of human creativity.
- Cathy Li