Why it’s so hard to stop online hate before it becomes real-life violence

A person pauses in front of Stars of David with the names of those killed in a deadly shooting at the Tree of Life Synagogue, in Pittsburgh, Monday, Oct. 29, 2018. (AP Photo/Matt Rourke)

A man opens fire in the midst of Sabbath prayer services at a Pittsburgh synagogue, killing 11. Six months before that, a man drives a van into a crowd of people in Toronto going about their day. He kills 10. And before him, a man walks into a mosque in Quebec City, killing six. The list is long — and growing.

In the aftermath, there is always a scramble to understand why. Unnerving details emerge, often about what the suspect did online.

Robert Bowers, accused in the Oct. 27 Pittsburgh synagogue attack, shared his anti-Semitic views on the fringe social media network Gab.com. Alek Minassian, accused in the Toronto van attack on April 23, posted to Facebook praising the “incel rebellion.” “Incel,” a combination of the words “involuntary” and “celibate,” refers to misogynist men who believe women are inferior and owe them sex.

Alexandre Bissonnette, who pleaded guilty to the January 2017 Quebec City mosque attack, spent hours on anti-immigrant, far-right websites and told investigators he “wanted to save people” from terrorist attacks.

“Radicalization happens online,” says Oren Segal, director of the Anti-Defamation League’s Center on Extremism.

Despite this, society still struggles to find effective ways to take preemptive action and cut off online hate and extremism before vitriol turns deadly. Experts say the struggle is complicated by everything from an often disingenuous fight over free speech, to law enforcement agencies without enough resources, to the nature of the internet itself.

The online world is vast, and it has dark corners. So how do you stop online hate before it moves offline?

WATCH: Hate speech or freedom of speech? University of Calgary professor clarifies why it’s hard to prove

“Murder capital of the internet”

In July 2011, Anders Breivik set off a bomb in front of a government building in Oslo, Norway, killing eight people. He then took a ferry to Utoya Island, where he killed 69 more people. The majority of his victims were teenagers.

He told investigators the government was responsible for what he saw as an invasion of Muslims. In the three years before his attack, Breivik was active on a white supremacy web forum called Stormfront, where he railed against the media, feminism, politicians, “pro-immigration Jewry” and Muslims.

In 2014, the Southern Poverty Law Center, which monitors hate groups and other extremists, tied Stormfront to nearly 100 murders over a five-year period. The SPLC, which spent two years identifying extremists who had murdered someone and analyzing their online activities, dubbed the website the “murder capital of the internet.”

The report goes on to describe the usual trajectory of hate-motivated killers like Breivik. Frustrated with their lives, they project their grievances onto others and search the internet for answers. A typical killer starts on right-wing and anti-government websites peddling conspiracy theories, then segues to “militant hate sites that blame society’s ills on ethnicity and shifting demographics.”

Then he’s in, the report explains, bingeing “online for hours every day, self-medicating, slowly sipping a cocktail of rage.”

While there’s no “100 per cent” way to prevent attacks, Evan Balgord says the most effective way to mitigate the risk of offline violence is to go after the people producing hate propaganda online.

WATCH: Mélanie Joly calls on ‘web giants’ to counter hate speech online

“Try to turn off that steady drip, drip, drip of hate at its source,” says Balgord, executive director of the Canadian Anti-Hate Network.

“When we take down these public, semi-public places that they spread their message, that means less people successfully connecting with them, less people finding propaganda that will further radicalize them.”

Gab.com, where Bowers’ anti-Semitic comments went unchecked, went offline shortly after the synagogue shooting, when it was revealed he had a history of sharing hateful messages on the platform. The site’s domain registrar, GoDaddy, gave Gab 24 hours to move to another provider, saying the site violated its terms of service by promoting violence against people.

Given that the site will likely find another home, Barbara Perry says GoDaddy giving Gab the boot is more symbolic than practical, although some users might give up when their access is cut off.

“It really does send a powerful message,” says Perry, director of the Centre on Hate, Bias and Extremism at the University of Ontario Institute of Technology.

“There’s something symbolic about saying, ’This is not something we can tolerate as an inclusive and respectful nation.’”

Still, Perry notes that Gab’s promises to return highlight just how tough it is to deal with hate online. Ban something on one corner of the web and it’ll pop up on another.

“We all just sort of throw up our hands and say, ‘I don’t know what to do about it.’ It’s really hard to regulate.”

The law vs. online hate

Before Minassian allegedly killed 10 people in a van attack last spring, he published a Facebook post, since taken down, that read: “The incel rebellion has already begun! We will overthrow all the Chads and Stacys! All hail the Supreme Gentleman Elliot Rodger!”

In incel terminology, “Chads” are attractive men and “Stacys” are the women who want to be with them rather than with incels. Rodger killed six people and injured 14 before killing himself in Isla Vista, Calif., in 2014. He is hailed as an incel hero and, in notes he left behind, said he wanted to “punish all females for the crime of depriving me of sex” and referred to himself as a “supreme gentleman.”

After the Toronto attack, incels praised Minassian.

But Canadian law doesn’t exactly give police the power to crack down on every dark, misogynistic corner of the web. Nor are there enough case studies to know how many people who are hateful online and violent offline would still have committed physical violence if they hadn’t connected with an online hate group.

“It’s just ubiquitous,” Perry says. “Online hate makes the circulation of these ideas that much easier and that much quicker for those who want to share vile kinds of sentiments.”

Plenty of vile sentiments, she notes, don’t quite qualify as hate speech. Even if they did, says Richard Moon, the law is reluctant to crack down.

WATCH: CSIS’s extremist probe ended months before shooting

Moon, a law professor at the University of Windsor, wrote a report on the regulation of online hate speech for the Canadian Human Rights Commission. Unlike most other charges, hate crime charges can’t be laid by police without the approval of an attorney general.

“There are certain times where attorneys general have been pretty wary about these prosecutions,” Moon says. “They’re not always successful.”

It’s hard to imagine police actually having enough resources to go after everyone online spewing hate speech and extremism, Balgord says.

“People feel really emboldened by the fact that they’re doing it online. They’re not following proper social norms of not being horrible to other people.”

The source of the hate

The scale of the problem is why Moon doesn’t think the law is the most practical option when it comes to curbing online hate. Certainly, he says, the law should handle the most extreme cases, but society needs to deal with the proliferation of less extreme views that feed the more extreme ones.

“We have to find ways to talk about it, address it, not give it a platform.”

In the case of Bissonnette, who carried out the Quebec City mosque attack, investigators found he spent hours on far-right, anti-immigrant websites. He told investigators that the day before he attacked the mosque, he saw a tweet from Prime Minister Justin Trudeau declaring Canada open to refugees. It enraged him. It made him think he had to do something before refugees came to kill his family, he said.

WATCH: Bissonnette spent hours on far-right websites

That logic makes sense when you think about the anti-Muslim, anti-refugee rhetoric he was consuming, Moon says.

“If you believe what people are saying about Muslim people then you take extreme action.”

Every person who fed him those beliefs has a hand in what Bissonnette did, Balgord says.

“All of them are responsible for the dead men at the mosque but they’re never going to be held accountable for those actions because [they’re] one step removed.”

That’s why the Canadian Anti-Hate Network spends time hunting down people anonymously making threats and using dehumanizing language online.

It’s a cumbersome process that involves matching usernames of different social media accounts and compiling little breadcrumbs in each account that together can reveal someone’s true identity, Balgord says.

So far, he says, they’ve helped expose the hosts of a Canadian neo-Nazi podcast, which resulted in the shutdown of one of the largest international neo-Nazi forums. Balgord is frustrated that more people aren’t tackling the issue.

“This is our greatest threat right now and there is a distressing lack of attention, lack of action.”

How to turn people away from hate

Elizabeth Moore is a former white supremacist who was sucked in as a teenager by a slick flyer that appealed to her struggles. Moore grew up in Scarborough, Ont. During her high school years in the early 1990s, immigration was changing the culture of her neighbourhood, and she didn’t feel she had anyone to help her cope with that.

It was a friend who introduced her to the Heritage Front, at the time one of Canada’s leading white supremacist groups. Moore’s friend gave her a flyer and Moore, intrigued, requested more information.

“The more information they sent me, the more it resonated with me in terms of giving me support, giving me an outlet, appealing to my ego.”

The way Moore describes her steady indoctrination then is similar to what the Southern Poverty Law Center described in its 2014 report as the transformation that people undergo online.

Then, as now, she says, many people didn’t seem to take it seriously.

“I’m seeing this again,” she says, “the ‘You can’t curtail people’s freedom of expression, they’re just blowing off steam, that’s an extreme view, nobody will fall for that.’”

The big difference, she says, seems to be the internet, and that’s what scares her about the rise of extremism now.

“I think about how powerful it was reading [the Heritage Front’s] literature and then I think about the young people today who have access to information that’s packaged in such a way that’s much more slick, so much more easily consumed.”

Education is key, says Oren Segal of the Anti-Defamation League’s Center on Extremism.

“We need to show that there are consequences for hate and we need to encourage people that they have a role to play in mitigating how hate is amplified and spread.”

It’s especially important that this happens at a young age, he says.

“We need to train people at a young age to be critical thinkers, to understand that whether it’s advertisements or governments or extremists, people are trying to brainwash you for ideological reasons or to buy something.”
