The first week of Canada’s federal election campaign saw the release of attack ads and a number of candidates disparaging each other on Twitter.
While experts say this is typical when it comes to an election, they warn that these divisive actions could polarize the electorate.
What’s more, since the 2016 U.S. presidential election, officials have been on high alert, ready to combat the spread of disinformation online during Canada’s election.
What is being done in Canada to stop the spread of false information online, and do political attack ads make the problem worse?
Here’s a look at what is happening.
Memes, fake news and deepfakes: How is disinformation spread?
According to Gordon Pennycook, an assistant professor of behavioural science at the University of Regina who studies fake news, disinformation spreads online in several different ways.
“One thing that’s common is memes,” he said. “They’re just images that have false information on them. There’s no link or anything.”
A Conservative party riding association was recently forced to delete a tweet containing a meme, which falsely said comedian Rick Mercer was urging young people to “vote Conservative.”
Pennycook says these memes are then shared on social media, often gaining traction in Facebook groups where they are spread to mass audiences.
Another way disinformation is spread is through “fake news” articles, Pennycook says.
“They are completely made up like headlines and they actually link to a website that looks ostensibly like a news website but is, of course, just somebody’s random website that they made,” he said. “And so the claims are just completely fabricated.”
On Tuesday, Conservative party Leader Andrew Scheer was forced to delete a tweet that falsely stated the RCMP had confirmed Liberal Leader Justin Trudeau was under investigation for the “SNC-Lavalin corruption scandal.”
Pennycook says another, albeit less-used, means of spreading disinformation is through deceptive videos, including “deepfakes.”
Deepfakes are fake video or audio recordings that look just like the real thing.
Pennycook says that more often, a real video clip will be cut off or altered to present a different interpretation than what actually took place.
“You see that kind of stuff even in political advertising campaigns, misleading cuts and things like that,” he said. “So there’s deception all around.”
The ‘rough and tumble of politics’
Gabrielle Lim, a researcher at Data & Society Research Institute who focuses on the implications of disinformation, propaganda and “fake news” in society, says that while it is difficult to know for sure whether attack ads have a direct correlation to the spread of disinformation, they can polarize the electorate.
“I don’t know, necessarily, that it leads to the spread of disinformation but I think it certainly can impact polarization,” she said. “I don’t know if they’re necessarily using straight-up false information.”
However, she says parties should be careful in how they choose to react when false information is shared about them online.
“I think there’s a bit of calculation. On one hand, you can bring it to light and say this piece of information is false,” she said. “But oftentimes, that piece of information might be really obscure.”
“So I think there is a trade-off that they have to make in terms of how much attention do they want to bring to it or would they rather just let it slide.”
Additionally, Lim says federal parties should be very mindful of what they choose to share online.
“They should be very wary of the type of information they’re sharing, the information their candidates are sharing, because we often find that comes back to the leader as backlash,” she said.
According to Michael Pal, an associate professor at the University of Ottawa who researches election law, attack ads and targeted tweets are legal under Canada’s election law.
“It depends on the content of the specific ad,” he said. “But tweets criticizing other candidates’ platforms and their personalities and things like that — that is legal under the Elections Act.”
“It’s also protected political expression under the charter,” he said. “It’s sort of the rough and tumble of politics.”
Pal said the use of attack ads doesn’t worry him that much.
“What worries me more is if there are ads that might mislead voters about how they exercise their rights — where their polling station is, when to vote, how to vote, what ID is permitted — things that you might call voter suppression if people were given the wrong information,” he explained.
Pal says this is something that could happen during the campaign, though it is more likely to happen closer to election day.
Amending election law
When it comes to amending Canada’s election laws to prevent the spread of disinformation online, Pal says there is room for improvement.
“I think there are some groups that try to get around some of the campaign finance rules that relate to spending, and so they say they’re not third parties and that it’s more organic content — especially the ones operating online and advertising on Facebook — and sometimes that’s correct,” he said. “But other times, they are exploiting loopholes in the law.”
“There are ways the Elections Act could be updated to deal with some of the groups that are advertising online to make sure they are subject to the rules that ensure transparency and a level playing field through spending audits.”
He says, however, amending Canadian law will have little impact if foreign adversaries are the ones spreading the disinformation.
“You can have the best law in the world, but if someone in a state-affiliated entity in Moscow is the one spreading the misinformation and they’re not concerned about the impact of being charged, they know it’s unlikely they will ever have to face Canadian justice.”
Not an intractable problem
According to Ahmed Al-Rawi, an assistant professor at Simon Fraser University’s School of Communication, to prevent the spread of disinformation, Canadians should be “vigilant” and more “critical” of the information they choose to consume.
“Don’t take what you read for granted,” he said. “Try to check the sources.”
Al-Rawi acknowledges, though, that this can be a difficult task.
“Sometimes, there are half-truths and half-false information,” he said. “That’s the difficulty because they might, for instance, use some facts but then they would embellish them or add false details.
“That’s where we need to be more vigilant and critical.”
Lim says self-awareness is key to identifying disinformation online.
“I think it’s self-awareness in knowing where your emotional biases lie, what political parties you follow and what sort of biases they might be promoting,” she said. “But I think that’s often difficult for a lot of Canadians or voters in general.”
Pennycook says Canadians should question things with “strong emotional valence” or that seem too good or too bad to be true.
“Before you turn into a keyboard warrior or share something, you should probably make sure that it’s true,” he said.
While this sounds like obvious advice, Pennycook says evidence shows that a large part of the reason people share false content is that they simply don’t think about it.
“It’s not an intractable problem,” he said. “People do care about accuracy, and most of the time, they could figure out if something is true or false. But they don’t bother to do so.”
What are social media platforms doing to stop the spread of disinformation?
In a statement emailed to Global News, a Facebook spokesperson said the social media platform takes the protection of elections “extremely seriously” and is “committed to being a force for good in Canadian democracy.”
“Which is why we devote significant time, energy and resources to these issues as demonstrated by the launch of the Canadian Election Integrity Initiative,” the statement reads.
In a press release issued in January, Facebook said the company has made “massive investments” to help “protect the integrity of elections — not only addressing the threats we’ve seen on our platforms in the past but also anticipating new challenges and responding to new risks.”
Facebook says the “multifaceted” approach includes third-party fact-checking, the detection and removal of fake accounts using artificial intelligence, providing more transparency about advertisers and providing additional context to content posted to its platform.
“When a fact-checker rates a post as false, we show it lower in News Feed to significantly reduce the number of people who see it and the likelihood it will spread further,” the release reads. “Pages and domains that repeatedly share false news will be penalized with reduced distribution, and they will not be able to monetize or advertise on Facebook. This helps curb the spread of financially motivated false news.”
Additionally, according to Facebook, a dedicated, cross-functional Canada elections team has been in place since the 2018 Ontario provincial election to ensure integrity on the platform.
Similarly, Twitter’s election integrity policy says users may not use the platform to manipulate or interfere in elections.
Twitter says misleading information about how to participate in an election, voter intimidation and suppression, and false or misleading affiliations all violate its policy.
If an account is found to have violated the policy, the platform says it could result in the deletion of the tweet, profile modifications and/or permanent suspension.
However, several things do not violate the policy: inaccurate statements about an elected official, candidate or political party; organic content that is polarizing, biased, hyper-partisan or contains controversial viewpoints about elections or politics; the discussion of public polling information; and using Twitter pseudonymously, or as a parody or fan account, to discuss elections or politics.
What are federal parties doing to stop the spread of disinformation?
In a statement emailed to Global News, the Liberal party says it takes cyber threats “very seriously” and is “committed to protecting Canada’s democracy.”
“That’s why we’ve put in place a plan to prepare and respond to the threat and have ensured the government of Canada mobilizes government-wide expertise to anticipate, recognize and respond to these threats,” the statement reads.
Additionally, the party says Canadians “expect social media platforms to address issues like cybersecurity and the spread of disinformation by demonstrating greater transparency” and that a Liberal government would “continue to hold them to account.”
The Liberal party says it has been providing candidates, campaign teams, staff and campaign officials with “comprehensive resources and guides” on best practices for information security on social media, on email and on the web.
The party says it has also launched a Twitter account dedicated to countering false claims and disinformation online.
The NDP said it is “actively communicating with Twitter and Facebook” to monitor for online disinformation and bot accounts.
“While there seems to be little we can do about the flood of (particularly conservative) bot accounts, we report any accounts that spread misinformation or abusive, racist or sexist language,” reads a statement emailed to Global News.
The party says government and big tech “need to do more to stop this from happening in the first place.”
“Canadians deserve to have access to accurate information as they evaluate parties and candidates in the run-up to election day, and no one should be subjected to abusive, racist or sexist language,” the statement says.
In an emailed statement, a spokesman for the Conservatives said the party has “repeatedly asked the Trudeau Liberal government” to “clearly outline how they plan to monitor cyber threats and regulate social media platforms to prevent foreign influence while respecting Canadians’ fundamental right of free speech.”
Global News reached out to the Green party for comment but did not hear back by the time of publication.
Speaking to reporters on Tuesday, Canada’s chief electoral officer, Stéphane Perrault, said Canada has a “robust electoral process” and that he is confident the election will be “solid.”
“We are monitoring social media. Our mandate is to make sure Canadians have the correct information on the voting process, and if there is incorrect information, wherever the source, whether it’s foreign or domestic, … we are positioned to rectify that information,” he said.
“Canadians are aware of what’s going on around the world, and they’re learning to check their sources in terms of the information,” he said. “And so I’m very confident that we will have a solid election again this year.”
Canadians will head to the polls on Oct. 21.