Culture minister says ‘serious conversation’ needed about AI systems and news media
· Toronto Sun

OTTAWA — Culture Minister Marc Miller says the government must have a serious conversation about AI systems’ use of news.
“Having the news cannibalized and regurgitated undermines the spirit of the use of that news in the first place and the purpose for which it’s used and we have to have a serious conversation with the platforms that purport to use it including AI shops,” Miller said.
Miller was asked whether the government is open to extending its Online News Act to AI companies. The Online News Act requires Meta and Google to compensate media outlets for displaying their content. Meta pulled news off its platforms in response, but Google has been making payments under the act.
He said it’s not a question about opening up the legislation but of making sure companies are acting responsibly.
Miller was speaking at a national summit on AI and culture, a day after a new report said AI systems depend on Canadian journalism for the information they provide users but don’t offer compensation or proper attribution in return.
Researchers at McGill University’s Centre for Media, Technology and Democracy tested 2,267 Canadian news stories on ChatGPT, Gemini, Claude and Grok.
They found when the platforms were asked about Canadian news events from their training data, they did not provide source attribution about 82% of the time.
The report said AI companies now extract value from journalism “at every stage: ingesting news archives as training data, producing derivative content without naming the sources, and delivering answers to consumers that could reduce the need and incentive to visit the original source.”
The system “accelerates the economic decline of the journalism it relies on,” the researchers said.
Miller said Tuesday he had seen the report. He said he wants the government’s legislation to work, and that “this is about people paying their fair share.”
Asked whether that principle extends to AI companies, Miller said “the principle of proper compensation for use of proprietary material doesn’t change.”
Miller reiterated that the government is open to a deal to bring news back to Meta’s platforms.
The McGill researchers said in a policy brief the problems posed for journalism by social media and AI systems are distinct.
While social media platforms “captured advertising revenue by aggregating attention around news content,” the brief reads, “AI companies are doing something different: they are absorbing the substance of journalism, and delivering it directly to consumers as their own product.”
That means the “consumer’s need to visit the source is not just reduced by algorithmic demotion, as it was with social media. It is rendered unnecessary by the AI’s response itself.”
A coalition of Canadian news outlets, which includes Postmedia, The Canadian Press, Torstar, The Globe and Mail and CBC/Radio-Canada, is suing OpenAI in an Ontario court. The coalition argues OpenAI is using its news content to train ChatGPT, breaching copyright and profiting from that content without permission or compensation.
When he was asked Tuesday about the government’s position on whether the use of copyrighted materials for AI training violates copyright law, Miller said he doesn’t believe there is a need to open up the law.
“Intellectual property reform is a complex issue that goes over and above artificial intelligence, and it is a multi-year process. So it’d be irresponsible in any context to stand here and say nothing’s going to happen,” he said.
“But the current copyright law does and should protect those that have created material and people need to be compensated properly.”
In a 2024 consultation on copyright and artificial intelligence, AI companies maintained that using the material to train their systems doesn’t violate copyright.
The news publishers’ lawsuit was launched in late 2024. It’s unclear how long the court will take to rule on the case.
The House of Commons heritage committee heard last year from groups and unions representing creative industries that take issue with AI’s use of copyright-protected works without permission and want to establish a licensing system covering such use.