The rise of generative AI has pushed copyright law into the centre of global policy debate. AI models are built on vast datasets drawn from books, articles, images, music, and countless other creative works, many of them protected by copyright. This pits a data-hungry technology against the economic rights of millions of creators. Around the world, governments are grappling with the same questions: How should creators be compensated? And how do we balance the urgency of innovation with the rights of those whose work powers these systems? India, too, faces this dilemma, and the government has convened a Committee to chart a way forward.
The task before the Committee was formidable, and it deserves credit for attempting it. Yet the model it proposes to balance these competing interests is almost certain to encounter serious practical hurdles.
Mandatory Blanket Licences and Statutory Royalties
The Committee proposes a ‘hybrid model’: a mandatory blanket licence that grants AI developers the right to use lawfully obtained copyrighted works for model training. In exchange, creators receive a statutory right to remuneration, paid by developers as a percentage of the gross global revenue they generate from commercialisation. These payments flow into a central pool managed by a government-designated body and are distributed to rights holders. The hybrid model is intended to eliminate the need for individual negotiations or permissions, preventing holdouts and ensuring broad availability of training data.
On paper, this mechanism aims to balance access and fairness. In practice, however, it is far from straightforward. How should ‘revenue’ be defined for global companies where only a fraction of the training data is sourced from Indian content? And how should revenue be computed for companies where AI is not a standalone product, but integrated into their other offerings such as search, cloud and operating systems? Not only does this proposal create jurisdictional overreach by attempting to impose an Indian statutory levy on economic activities occurring outside the country’s borders, it also risks producing arbitrary outcomes by tying royalties to a company’s overall commercial success rather than to its actual use of copyrighted works.
Moreover, it places the government at the centre of a rate-setting exercise, which carries several risks: prolonged negotiations, market distortions, and regulatory uncertainty driven by political pressures. The Committee defends this model by drawing parallels to, among other examples, the Indian Railways’ fare-setting process. But one could argue that although fare-setting in the Railways has kept prices affordable, it has also produced long-term underinvestment and quality shortfalls. A similar dynamic has played out in electricity pricing, where government-determined tariffs have been misaligned with market realities.
Operational Complexities: Lawful Access and Fine-Tuning
Another challenge is the ‘lawful access’ requirement, which restricts AI developers from using content that is not legally accessible (such as paywalled material or pirated copies). While well-intentioned, this means that even with statutory royalties in place, large publishers or aggregators can continue to lock their content behind paywalls, effectively forcing developers to negotiate separate access deals. In practice, royalty payments become mandatory, but access to the underlying data does not, leaving substantial bargaining power with data-rich entities.
Complications also arise for fine-tuned models, which are built on foundational models for specific tasks. The Committee suggests addressing payments for such models through verifiable self-declarations from AI developers on the categories of content used for training. But while this appears light-touch, it creates real risks: weak enforceability and a high likelihood of missed or misallocated royalties.
Royalty Distribution and Inherent Challenges
To distribute royalties, the Committee proposes a new, not-for-profit collective – CRCAT – tasked with the impossible job of turning a single annual revenue share from AI developers into fair payouts across India’s creative economy. On paper, the Committee offers two pathways: a simple pro-rata model, where royalties are split based on the number of registered works, and a more complex value-based model, where works are graded using indicators such as website traffic, licensing history, citations, awards and even social engagement.
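The arithmetic behind the two pathways can be sketched as follows. This is a purely illustrative comparison, not part of the Committee's proposal: the work names, scores and pool size are invented, and the actual indicators and weightings remain undefined.

```python
# Illustrative sketch of the two royalty-distribution pathways.
# All identifiers, scores and amounts below are hypothetical.

def pro_rata(pool, works):
    """Split the pool equally across registered works."""
    share = pool / len(works)
    return {w["id"]: share for w in works}

def value_based(pool, works):
    """Split the pool in proportion to a composite 'value' score,
    notionally built from indicators such as traffic, citations,
    awards or social engagement."""
    total = sum(w["score"] for w in works)
    return {w["id"]: pool * w["score"] / total for w in works}

works = [
    {"id": "news_article", "score": 5.0},
    {"id": "novel",        "score": 2.0},
    {"id": "song",         "score": 3.0},
]

print(pro_rata(100.0, works))     # each work gets an equal ~33.33 share
print(value_based(100.0, works))  # shares of 50, 20 and 30 respectively
```

The contrast is the point: the pro-rata split ignores differences in value entirely, while the value-based split is only as defensible as the scores fed into it, and it is precisely those scores that the sectors discussed below would contest.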
But this seemingly neat architecture conceals serious operational tensions. Rights holders in music, news, books, visual arts, film and academic publishing do not merely operate in different markets; they inhabit different universes. Even within a single class of works, a weighting formula will inevitably privilege some kinds of visibility over others. In practice, therefore, the promise of “neutrality” in distribution will be continually contested. The moment money begins to flow, sectors will lobby for more favourable payouts, creators will challenge weightages, and the collective’s management will struggle to adjudicate between them. The result: disputes, delays, and administrative friction that could choke the very compensation pipeline the model seeks to build.
Striking the Right Balance
Across the world, countries are struggling to find a solution. In many jurisdictions, the law is silent; others have introduced blanket or conditional text-and-data-mining exemptions; and some allow copyright holders to opt out and withhold consent for the use of their works. Each approach has triggered its own challenges and resistance from different stakeholders. It is against this backdrop that the DPIIT Committee has undertaken a herculean task. Yet while its approach is well-intentioned, the hybrid model it proposes risks merely swapping courtroom battles for bureaucratic ones, even as the core challenge of measuring and meaningfully rewarding human creativity remains unresolved. India now faces the urgent task of shaping a framework that is not only principled but practical, one that truly balances the promise of AI with the rights of creators.
Rohit is Founding Partner and Mahwash is Manager at the public policy consulting firm The Quantum Hub (TQH).