
A man types on a computer keyboard in Toronto on Monday, October 9, 2023, in this photo illustration. THE CANADIAN PRESS/Graeme Roy
As the media landscape evolves, researchers in Canada suggest future laws aimed at balancing the power between tech giants and news outlets should also account for artificial intelligence (AI). A new report, published by the Centre for Media, Technology, and Democracy at McGill University, emphasizes that the growing use of AI chatbots in accessing news needs to be considered in the conversation about fair compensation for journalism.
Sophia Crabbe-Field, the report’s lead author, points out that more and more people are turning to AI-driven chatbots for information. These chatbots, however, rarely attribute the information they deliver to the news organizations that produced it, which raises concerns for media companies. Crabbe-Field notes that AI companies are profiting from news content without paying for it, and argues that these issues must be addressed in upcoming policies.
The report argues that AI companies need large volumes of quality content to train their models, and news outlets are often the source of that content. The challenge, however, is that many publishers haven’t given explicit permission for their work to be used in this way. As a result, many want compensation for their contributions to the training of AI models.
While determining fair compensation is difficult—given the unpredictability of how much content is used and how much AI companies are profiting—it remains a pressing concern for publishers. News Corp. and The Associated Press have already signed deals with OpenAI to allow the company to use their content to train its models. These deals reportedly involve large sums of money, but the report warns that such individual agreements might not be the best way forward for the industry.
Crabbe-Field suggests that instead of signing individual deals, news outlets should consider negotiating collectively to secure better terms and ensure long-term sustainability. She stresses that while these deals provide immediate financial benefit, they risk leaving smaller, independent outlets behind, further exacerbating the imbalance in the media landscape.
Paul Deegan, president of News Media Canada, agrees that the current situation leaves many smaller publishers out of the equation. He acknowledges the importance of the licensing agreements and lawsuits filed by major publishers against AI companies, which could eventually set the standard for fair compensation.
The report also touches on ongoing legal battles. For instance, a coalition of media companies, including The Canadian Press and CBC, is suing OpenAI, alleging copyright infringement. The lawsuit seeks damages of $20,000 per work, or whatever amount the court deems fair. Similar lawsuits have been filed by other publishers claiming their articles were scraped by AI companies without permission.
Building on the Online News Act, which aims to secure compensation from search engines and social media platforms, Crabbe-Field suggests that future legislation could go further to ensure fairness. One possibility is to introduce “must-carry” provisions, which would require platforms like Meta to continue hosting content from news outlets covered by the law. This would make it harder for platforms to avoid paying for news content, as Meta did when it blocked Canadian news from its platforms.
As AI continues to reshape how people access news, the report concludes, future legislation must address fair compensation for content used to train AI models. By supporting collective negotiation and ensuring that smaller publishers are included, lawmakers can help preserve the long-term health of journalism.