Faith Goldy banned from Facebook after site enforces extremism, hate policy — now what?

WATCH: Pressure builds on social media companies to crack down on hateful content – Mar 15, 2019

On Monday, former Toronto mayoral candidate Faith Goldy and several others were banned from Facebook as the social media site said it was removing extremist groups and users who promote hate in an attempt to curb dangerous rhetoric on its platform.

Goldy was kicked off the social media site and Instagram, which is owned by Facebook, along with white nationalist Kevin Goudreau, far-right group Soldiers of Odin and one of its offshoots, Canadian Infidels.

“Individuals and organizations who spread hate, attack or call for the exclusion of others on the basis of who they are have no place (on) our services,” a Facebook spokesperson said in a statement.

READ MORE: 6 steps Canadians can take when they spot hate speech online

“The individuals and organizations we have banned today violate this policy, and they will no longer be allowed a presence on our services.”

But what does kicking people off Facebook actually accomplish? Does banning certain people and groups address the larger issue of hate speech?

According to experts, a social media ban won’t fix the problem, but it’s a good first step.

Sending a message

Veronica Kitchen, an associate professor of political science at the University of Waterloo, told Global News that “de-platforming” a person or a group can have positive effects.

“Facebook is probably the social media network that most people are on so if people are unable to access those groups… it’s certainly reducing the amount of people who hear their hateful ideas,” Kitchen said.

WATCH: Why are governments slow to regulate social media?

“I think there’s less evidence that (a ban) is going to be successful for those who are already inclined to support those views and support someone like Goldy.”

Megan Boler, a professor in the Department of Social Justice Education at the Ontario Institute for Studies in Education, told Global News that Facebook’s decision to ban hateful users is sending a clear message.

While Boler acknowledged that the corporation likely faced public pressure to remove certain users and cares about its reputation, she said the ban was ultimately the right thing to do.

READ MORE: N.Z. privacy watchdog calls Facebook ‘morally bankrupt’ following mosque shootings

“I think, most significantly, (the Facebook ban) sends a message to multiple audiences and to the public… that this kind of incitement and this kind of hate speech is not acceptable and not condoned,” Boler said.

“Certainly to take no action is tantamount to saying that this kind of hate speech or… white supremacy is acceptable.”

Will hateful people go elsewhere?

Goldy, who was fired from her job at Rebel Media in 2017 after appearing on a podcast produced by a neo-Nazi website, may be off Facebook, but she is still very active on Twitter.

Since being removed from Facebook, she has tweeted advice about how her fans can support her, including donating money through her website and subscribing to her content.

WATCH: Facebook, Google defend efforts to remove hate speech and white nationalism before Congress

Goldy is also still posting content on YouTube.

In a statement to Global News, a spokesperson from YouTube said that “hate speech and content that promotes violence have no place” on the platform.

“We also know that there will always be content that comes right up to that line but doesn’t cross it,” the spokesperson added. “We’ve been working to reduce recommendations of borderline content and apply a set of restrictions that strips those videos of key features such as comments, suggested videos and likes.”

READ MORE: Right-wing platform Gab taken down after Pittsburgh shooting, says it’s been ‘smeared’ by media

A spokesperson from Twitter declined to comment on individual users but pointed to the network’s rules and policies around hateful conduct. Twitter says that it does not tolerate hateful conduct or violence on the platform, including “symbols historically associated with hate groups, e.g., the Nazi swastika.”

These responses do not surprise experts.

“The nature of the internet and web-based communications today is that there’s always workarounds,” Boler said. “The challenge of figuring out how to deal with other monopolies like Google and YouTube is very considerable.”

Fans may still follow

People who subscribe to a hateful ideology such as white nationalism are likely to congregate elsewhere online if they’re booted from one platform.

Kitchen says that since Goldy has her own follower base, getting kicked off Facebook doesn’t mean she’s not reaching an audience.

WATCH: Facebook says it’s considering restricting live video after New Zealand shooter streamed attack live

“There are other mechanisms for those people who are already inclined to seek her out to do so without very much difficulty,” she explained. “There’s also lots of other internet forums where it’s easier for these people to share their hateful views.”

Kitchen points to places like Reddit and 4chan where racist, sexist and hateful views are commonly expressed.

Robert Bowers, the 46-year-old man charged with killing 11 people at a Pittsburgh synagogue in October, was reportedly an active user of the social networking site Gab, which has been described as a right-wing platform where anti-Semitic and anti-black content is often shared. (The site bills itself as a “free speech” platform.)

READ MORE: Facebook, Instagram ban white nationalism as part of expanded definition of hate speech

“There are corners of the web where people who are inclined to these sort of ideas can easily find an echo chamber where their views — or even more extreme views — are going to be reflected back to them,” Kitchen said.

How can Canadians curb hate in a meaningful way?

Boler said that hate speech and violent rhetoric on social media platforms target people’s emotions, which is why they are effective.

To combat this, fostering in-person conversations is vital.

“One of the things that’s most important now is to sort of revitalize some of the face-to-face public gatherings and public pedagogies in educational spaces where we could have these kinds of large-scale conversations about difficult topics,” Boler explained.

WATCH: Growing threat of white nationalism in Canada

Kitchen said it’s also important for people in positions of power to stand up against hate and speak out against misinformation and violent rhetoric. She points to journalists, teachers and politicians as important figures in combating dangerous groups and ideas.

“Arguments about freedom of expression are used in places where they don’t belong,” Kitchen said.

“A private institution is not required to give a platform to anybody.”

Laura.Hensley@globalnews.ca