
Facebook implementing new tools to prevent livestream of suicides

WATCH: The new measures include Facebook connecting someone in crisis with a mental health organization or even to a friend they feel comfortable talking to – Mar 1, 2017

Social-media giant Facebook has unveiled new strategies to help prevent the increasing number of suicides live-streamed on its service, including two recent incidents that made international headlines.

According to Statistics Canada, suicide remains one of the leading causes of death in the country, especially among males and those aged 15 to 34. In the United States, suicide has been reported as the tenth leading cause of death.

In January, the stories of aspiring actor Frederick Bowdy, 33, and teenager Naika Venant, 14, caught the attention of millions when they took their own lives. What made their deaths especially alarming was that both were broadcast on Facebook Live.

On Wednesday, Facebook released new suicide prevention tools to help those in distress and to build a safer community, both online and off.


The networking platform has had suicide-prevention measures in place for the past decade, but has now updated those tools and integrated them into its live video service.


These include options for a viewer to reach out to the person directly and offer support, or to report the video to Facebook. As well, the person live-streaming will see “a set of resources on their screen,” Facebook said in a press release.

Onscreen, the person in distress will see a message saying Facebook is “reaching out to offer help” because someone thinks they “might need extra support right now.”

The person will then have an option to contact a helpline, view tips to get through their difficult moment, or talk to a friend.

A viewer concerned for the person’s safety can report the video and specify what they think is happening – in this case, suicide or self-injury – and will then be prompted with resources on how to help their friend.

In both instances, Facebook has people working 24/7 who view anything reported to them and “prioritize the most serious reports like suicide.”

Facebook has also partnered with organizations such as the Crisis Text Line, the National Suicide Prevention Lifeline and the National Eating Disorders Association so that people in distress can talk to someone over Facebook’s Messenger app.


These new measures, which are being tested only in the U.S. for the time being, will hopefully prevent future livestreams of suicides.

Facebook is also trying to stop these real-time videos through an artificial intelligence approach, using pattern recognition to identify posts similar to those previously reported for suicide or self-injury.

Please note that if you or someone you know is in need of mental health assistance, the Centre for Addiction and Mental Health in Toronto has resources available.
