Canada, the United States and democracies around the world have lessons to share and plenty more to learn in what federal cabinet minister Dominic LeBlanc said Monday must be a collective, global effort to fight the scourge of online disinformation.
The federal Liberal government learned a lot from last year’s defensive measures against so-called “fake news” in advance of the October 2019 election, said LeBlanc, president of the Privy Council and the minister charged with keeping tabs on Canada’s democratic health.
The experience, he said, would be of help to the U.S., a country that’s no stranger to election interference from bad actors both domestic and foreign — and is hurtling headlong towards a presidential election in November widely seen as the most consequential in its history.
“Canada’s experience is that it takes a whole-of-government and a whole-of-society effort,” LeBlanc told a panel discussion hosted by the Institute for Data, Democracy and Politics at George Washington University.
“The strongest defence against threats to democracy is obviously an engaged and informed citizenry, and therefore a resilient public. Indeed, citizens in our view can be — with the right tools and awareness — the best bulwark against disinformation.”
The federal Digital Citizen Initiative mobilized academic and grassroots efforts to help Canadians better understand, recognize and root out disinformation, while intelligence agencies made public assessments of potential threats — a world first, LeBlanc said, predicated on the notion that “to be forewarned is to be forearmed.”
He said the government also streamlined its systems for identifying threats, tactics and vulnerabilities and for sharing information with political rivals and G7 partners, and established a national security task force for recognizing and responding to foreign incursions.
“With the threat that is constantly evolving, no country can stand still,” LeBlanc said. “We’re hard at work evaluating these initiatives — what worked and what didn’t work as well as we would have liked in 2019 — in order to ensure that we’re ready for the next national election.”
Canada is also teaming up with Microsoft and the Alliance for Securing Democracy, a U.S.-based civil society group, on countering election meddling as part of the Paris Call for Trust and Security in Cyberspace.
Sasha Havlicek, chief executive of the Institute for Strategic Dialogue, a global think tank aimed at combating hatred, extremism and disinformation, cheered Canada’s strategy as an effective approach that mitigated the potential spread of bad information.
But the effort will be largely for naught, she added, unless and until social media platforms themselves grant overseers the chance to explore and evaluate the internal systems that Havlicek said are designed to favour titillating, sensationalist, extremist messaging.
“We desperately need third-party review of the systems and real robust transparency and oversight of the protocols that govern the information flows,” she said.
“The focus has been on content and content removal, which is not the right space. The focus needs to be on the recommendation, curation and moderation systems used by the platform — both algorithmic and human — so that we can genuinely understand what the impact is on our societies.”
As it happens, the online forum host Reddit shut down 2,000 of its controversial “subreddits” Monday, including one popular with supporters of President Donald Trump, under a newly imposed hate speech policy aimed at cleaning up the platform’s reputation as a safe haven for extremist views.
It’s just the latest social media move towards cleaning up the industry’s act. Twitter in recent weeks has shown new resolve in flagging or even censoring Trump’s own activity when it violates the service’s policies governing disinformation or inciting violence.
Facebook took a month to catch up, waiting until Friday to announce it would remove posts deemed to be inciting violence or designed to suppress voting, and would flag posts containing hate speech.
“I do get the sense that there’s something going on at Twitter, that maybe we’ve reached the last straw for what the management of Twitter can take,” said Rep. Adam Schiff, the Democratic chair of the House Intelligence Committee.
“I still get the sense that Facebook will need to be pulled and dragged into this era of corporate responsibility.”
Congress is fast approaching the point where it may need to take legislative action, he added.
“If the platforms aren’t going to step up and meet their social responsibility, then Congress needs to think about how we can either incentivize or insist on a different and more societally beneficial form of regulation,” Schiff said.
“The kind of Wild West environment we have online right now is just breeding rampant disinformation and division within our own society, and I think poses a real threat to democracies around the world.”