It works by using a targeting tool to flag keywords in search queries and instead serving a list of videos “debunking” the narratives extremists use to recruit followers online.
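The flagging-and-redirect flow described above can be sketched as a simple lookup: if a search query matches a flagged keyword, counter-narrative videos are returned in place of the normal results. This is a minimal illustration, not YouTube's actual implementation; the keyword list, video IDs, and `search` function are all invented for the example.

```python
# Hypothetical sketch of the "redirect method": queries matching flagged
# recruitment-related keywords are answered with counter-narrative videos
# instead of the normal search results. All names and IDs are illustrative.

FLAGGED_KEYWORDS = {"join the caliphate", "martyrdom operations"}  # invented examples

COUNTER_NARRATIVE_PLAYLIST = [
    "debunk_video_01",  # e.g. testimony from former extremists (invented ID)
    "debunk_video_02",  # e.g. religious leaders refuting recruitment claims
]

def search(query, normal_results):
    """Return the counter-narrative playlist when the query hits a flagged keyword."""
    q = query.lower()
    if any(keyword in q for keyword in FLAGGED_KEYWORDS):
        return COUNTER_NARRATIVE_PLAYLIST
    return normal_results

print(search("how to join the caliphate", ["video_a"]))  # redirected playlist
print(search("cat videos", ["video_a"]))  # unchanged results
```

In practice the matching would be far more sophisticated (phrase variants, multiple languages, ranking), but the core idea is this substitution at query time.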
In an article posted to its official blog on Thursday, YouTube said it wants to “provide more resources and more content that can help change minds of people at risk of being radicalized.”
Social media companies have been facing increasing pressure from governments after the U.K. was hit with back-to-back terrorist attacks.
The first of these incidents, which took place outside an Ariana Grande concert in Manchester on May 22, took the lives of 22 people. The second took place on London Bridge on June 3, when three men drove a van into pedestrians, killing eight people.
Following these events, British Prime Minister Theresa May attended the G7 Summit with a solution in mind: that internet companies be more strictly regulated so as to curb the spread of extremist content online. The summit’s other attendees agreed.
That call only intensified after an individual connected to the London attack was revealed to have been motivated by videos he watched on YouTube.
Approximately two weeks after the events in London, Google released its four-step plan, which included increasing the use of technology to identify extremist and terrorism-related videos; bringing 50 expert NGOs into YouTube’s Trusted Flagger program; making videos that may be inflammatory but do not clearly violate policies harder to find (and barring them from monetization); and lastly, countering radicalization through the redirect method.
“Collectively, these changes will make a difference. And we’ll keep working on the problem until we get the balance right. Extremists and terrorists seek to attack and erode not just our security, but also our values; the very things that make our societies open and free. We must not let them,” Google said in its announcement.
While the feature hasn’t been implemented yet, tech news site The Verge reports that YouTube will measure the project’s success by how much viewers engage with the videos they are redirected to.
YouTube claims an earlier pilot of the redirect method that ran between August and September 2015 reached over 300,000 people, who in turn watched over 500,000 minutes of video.
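Taken at face value, those pilot figures imply a fairly modest average watch time per person reached, which a quick calculation makes concrete:

```python
# Back-of-envelope check of the pilot figures reported above.
viewers = 300_000          # people reached in the Aug-Sep 2015 pilot
minutes_watched = 500_000  # total minutes of counter-narrative video watched

avg_minutes = minutes_watched / viewers
print(f"{avg_minutes:.2f} minutes per viewer on average")  # prints "1.67 minutes per viewer on average"
```

Since both figures are given as "over" their stated values, roughly a minute and a half per viewer is a floor, not a precise measurement.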
Several other social media companies also responded with their own efforts to curb the spread of terror online, including forming a working group to fight extremist content on their platforms.