April 25, 2019 2:13 pm
Updated: April 25, 2019 6:37 pm

How social media is helping white supremacist movements

WATCH ABOVE (March 20, 2019): After the New Zealand terror attacks, new scrutiny is on how social media enables hate to spread at the speed of light. Emboldened, disturbed and isolated figures join together in like-minded online communities.


A new report out of Alberta finds social media is aiding the recruitment of new members into extremist groups, including white supremacist organizations.

The Organization for the Prevention of Violence (OPV) suggests the availability of white supremacist material online, the ability to connect with like-minded people and potential recruits, and the general social interaction afforded by the internet have been “a game changer” for the white supremacist movement.



“These spaces allow established veterans of the movement, and new and potential recruits, a greater degree of freedom where they can express views and opinions that would be seen as offensive in much of society,” the report reads.

Don Black is the founder of Stormfront, at one time the largest internet white supremacist forum. The study quotes him describing how technology has benefited the movement.

“The net has provided the movement with the opportunity to bring our point of view to hundreds of thousands of people…. Websites which are interactive provide those people who are attracted to our ideas with a forum to talk to each other and form a virtual community.”

Engagement happens on counter-culture forums such as 4chan, 8chan and Reddit, through memes distributed via “highly diffused but inter-connected networks,” according to OPV.

“The increased accessibility of white supremacist texts, cultural artifacts (songs, posters, memes etc.) and discussion groups has made it easier for individuals to ideologically affiliate with the movement without making physical contact with other members,” the report reads.

OPV said the online movement has no formal leaders; instead, bloggers, writers, social media influencers and supporters help spread the ideas and narratives that drive lone-actor and small-network terrorism.


The researchers referred to the terrorist attacks in Christchurch, New Zealand in March as an example of how ideas and messages online can drive an attack by one person.

Brenton Harrison Tarrant faces 50 murder charges and 39 attempted murder charges in the March 15 attacks on two Christchurch mosques, in which 50 people were killed.

The alleged shooter is believed to have promoted the massacre ahead of time on 8chan, an online messaging board where violence, racism and misogyny are encouraged. The author of the promotional post remained anonymous, but he or she linked to a Facebook page belonging to user brenton.tarrant.9, where the attack was live-streamed.

The post also included links to a 74-page manifesto claiming the attack was motivated by fears of “white genocide.”

“What is emerging among lone actors is a trend where mobilization to violence occurs largely online, outside of the purview of law enforcement and national security agencies,” the report reads.

“Never truly ‘alone,’ these individuals interact with an online community and perceive themselves as part of a broader movement.”



According to the study, online interactions can motivate individuals to plan and carry out attacks without ever physically meeting other members of an extremist group, producing cases that are difficult for authorities to detect and prevent.

However, OPV said one way to help stop such attacks is to better identify the language and statements used by members of online communities that signal potential future attacks.

“According to one study, roughly 70 per cent of lone actor terrorists leak their intention to act and 90 per cent leak their ideological convictions,” the report reads.

The organization said there is a growing climate of fear and hatred as a result of divisive ideas being spread online by way of conspiracy theories, hateful and derogatory memes, videos and other forms of media, as well as politicians around the western world propagating and reinforcing bigotry.


Following the Christchurch attacks, the Canadian government said it would consider forcing social media companies to remove hateful and extremist content online.

“We will look at that very, very carefully,” Public Safety Minister Ralph Goodale said in March.

“This has been a subject of discussion among ministers at the Five Eyes meetings and at the G7 meetings where ample discussion has been held on how we encourage the social media platforms to move quickly and efficiently to deal with toxic communications like this that incite violence and hatred and obviously do great damage to social cohesion.”

OPV said solving these issues will require an effort from society as a whole, with a focus on empowering youth-led and inter-cultural initiatives that create positive messages and reach as many people as possible.


The Organization for The Prevention of Violence was founded in 2016 through the support of Public Safety Canada to deal with the issue of extremism in Alberta.

— With files from Josh Elliott 

© 2019 Global News, a division of Corus Entertainment Inc.
