Bill C-25’s criminal ban on deepfakes won’t restore credible news to the platforms where many Canadians look for it.
A bill before Parliament, C-25, would amend the Canada Elections Act to target political deepfakes and misinformation in an effort to better protect the integrity of our elections.
Was fake content much of an issue in the last election? Is it still a concern in online debate over things like Alberta or Quebec separation, or support for party leaders?
In the last federal election, deepfakes did play a role. Researchers have traced numerous cross-platform misinformation campaigns to groups that include the extreme-right Canada Proud and to actors linked to China and Russia. Some 24 percent of Canadians saw a social media post in which Pierre Poilievre or Mark Carney appeared in fictitious interviews with CBC or CTV News. Many others saw images portraying party leaders as wounded or under arrest, or, in Carney’s case, as tied to Jeffrey Epstein and Ghislaine Maxwell.
Political deepfakes are now part of our election landscape. Bill C-25 responds by making it an offence to create or distribute an audio or visual deepfake of a candidate or party leader, or material purporting to be from them, with intent to mislead the public. It would also make it an offence to make knowingly false statements about the process or outcome of an election.
There’s certainly a case for making these changes. But Bill C-25 targets only the most visible symptom of a larger problem — not only for our elections but for our democracy as a whole: the dearth of credible news on the platforms where most Canadians look for it.
A large part of this problem was an unintended consequence of the Online News Act of 2023, which required Google and Meta to compensate Canadian news publishers for the use of news content on their platforms. Google reached a deal, but Meta chose instead to stop hosting Canadian news on Facebook and Instagram.
Surveying the damage a year later, McGill’s Media Ecosystem Observatory found that Canadian news outlets lost 85 percent of their engagement on Meta’s platforms, while roughly a third of Canada’s local media outlets became inactive on social media altogether. Roughly three quarters of Canadians surveyed in 2024 were unaware of the ban. The MEO estimated that online views of Canadian news had dropped by 11 million per day, without Canadians realizing it.
What happens when legacy media gets buried
To be clear, Meta didn’t create Canada’s misinformation problem. Mis- and disinformation, foreign interference, influencer politics, and algorithmic amplification all predate the Online News Act. But Meta’s news ban made the problem far worse.
When credible news disappears from major platforms, it’s now clear that the vacuum gets filled by influencers, low-quality commentary, scams, foreign-linked campaigns, and deepfakes. Reuters reported in 2024 that right-wing meme pages and unreliable sources gained engagement after Meta blocked news in Canada, while Canadian and Australian officials warned of the risks to political discourse in election years and during emergencies.
The desert has only grown since. Meta announced in January of 2025 an end to its third-party fact-checking program in the United States, touting the change as a way to “allow more speech.” Elon Musk’s takeover of Twitter, now X, led to a similar relaxation of restrictions, opening the floodgates to false content. And on YouTube and TikTok, legacy media is often drowned out by partisan influencers boosted by the algorithm.
The Media Ecosystem Observatory’s look back at the federal election in the spring of 2025 confirms the larger pattern. It found influencers to be the “loudest voices in the online political information environment,” while traditional news outlets, politicians, and parties were often less visible on Meta’s platforms and X. The report also found that automated bot activity and false content were widespread, “distorting political debate and confusing voters.”
Making credible media great again
Seen in this larger context, I doubt Bill C-25’s criminal ban on deepfakes will have much of an impact. It won’t restore local news to Facebook or Instagram. It won’t make credible journalism more visible on X, YouTube or TikTok. It won’t give researchers better access to platform data. And it won’t tell the public, in real time, why certain political claims are being amplified while others are buried.
Parliament needs to fix the vacuum it helped create. It should revisit the Online News Act and find a way to restore traditional media’s visibility on Meta’s platforms. But it shouldn’t stop there. We need stronger platform accountability during elections. This doesn’t require giving governments the power to decide which opinions are true or false, or to compel platforms to remove speech that is merely unpopular or inflammatory.
We can craft more restrained but effective law here. We can require large platforms to disclose viral synthetic media quickly during election periods. We can oblige them to give independent researchers meaningful access to data about political content, coordinated campaigns, bot activity, and foreign-linked influence operations. And we can impose duties to act on clearly false claims about election results, or on impersonated candidates and parties.
Many of us are forming opinions about the most pressing issues in our democracy — including whether to keep the country together — in online spaces increasingly shaped by fake news and images. Targeting bad actors after the fact will not restore credible news in the volume we need to hear it.
We need to do more than punish the worst lies after they spread. We need to restore balance to the platforms where serious conversation is so easily tuned out.
