Justice Minister Arif Virani is keeping tight-lipped about whether the federal government is considering a new regulator that would hold online platforms accountable for protecting users from harmful content.
Asked about the prospect of a new agency, Virani said Friday the government is looking at consultations that have been done to date and how other countries have confronted the issue.
“We’re studying what’s worked in foreign jurisdictions,” he said.
“We’re definitely working with different online entities, including online companies,” the minister added, pointing to the recent deal the federal government struck with Google that will see the tech giant pay up to $100 million a year to media companies.
That total is about $72 million less than what the government’s draft regulations originally indicated would be owed under the Online News Act, which requires tech giants to compensate media outlets for news that is republished on their platforms.
Faced with the requirement to comply with that controversial law, Google had threatened to remove Canadian news from its search engine altogether.
Virani’s response on Friday came after a group of experts, assembled by the government last year to advise on a prospective new law, penned an open letter urging the government to hurry up and table a long-promised bill to address online harms.
Prime Minister Justin Trudeau had promised to introduce legislation to tackle harmful content online within 100 days of his successful re-election in September 2021.
Two years later, nothing has been tabled.
His initial promise to take action came even earlier. In 2019, Trudeau instructed his then-heritage minister to introduce regulations that would compel social-media platforms to remove all illegal content, such as hate speech or child abuse images, within 24 hours.
Canadian Heritage was occupied over the past couple of years with highly contentious bills, including the Online News Act and the Online Streaming Act, which aims to bring streaming companies into line with broadcasting rules.
It’s in that context that responsibility for tabling an online harms bill was transferred to the justice minister after a cabinet shuffle in July.
Those seeking an online harms bill acknowledge that it could be even more controversial.
Experts and others said in this week’s open letter that the legislation must create a regulator that has the power to investigate and audit platforms, order corrective actions and impose fines.
In 2021, the government published a blueprint for how it would compel social media companies to deal with harmful materials on their platforms.
It included the hiring of a digital safety commissioner who would enforce rules requiring companies to remove content ranging from child sex abuse images to terrorist material. It also proposed requiring companies to remove flagged content within 24 hours of a complaint.
The framework was roundly criticized for brushing up against freedom-of-expression protections in the Charter of Rights and Freedoms.
Academics and other advocates who weighed in on the measures also questioned what powers a digital safety commissioner would have, and how such a regulator would operate and enforce rules.
Virani has said in recent weeks that he hopes to table legislation soon, but emphasized it is difficult to come up with regulations when it comes to social-media giants.
On Friday, he said it is critical that the government get it right when it comes to enhancing online safety for children and other vulnerable groups.
“We’re working on it very, very diligently in terms of aspects that relate to the Criminal Code, the Canada Human Rights Act and how we address issues that relate to what we’re seeing online.”
“Geopolitical conflicts” are leading to a spike in hatred online which can then manifest as physical violence, he added.
Virani did not specifically cite the Israel-Hamas war, though he referred to recent messages from police and leaders from Jewish and Muslim communities in Canada.
Lianna McDonald, executive director of the Canadian Centre for Child Protection, said she is very concerned about the absence of legislation to address online harms.
She said online platforms must show more accountability and transparency when it comes to the material on their sites, along with the steps they are taking to increase safety for children.
“Those companies, just as we see in the offline world, ought to be responsible for making sure that those are safe environments.”
She added that she believes reforms will have to be accomplished through regulations and under the threat of fines.
“It’s sort of lawless,” she said in an interview Friday. “They need to be responsible when children are using platforms.”
McDonald added that Australia has an online safety commissioner, saying she would support seeing a similar model developed in Canada.