Facebook’s suicide prevention tool sparks concern

WATCH: Facebook is admitting to using an algorithm to identify users it believes are at risk of suicide. While the company says it's intended to save lives, some are criticizing the tech giant for conducting medical research on users without consent. Heather Yourex-West reports – Feb 11, 2019

A Harvard Medical School researcher is criticizing Facebook’s new suicide prevention program, saying it amounts to conducting medical research on users without their consent.

In a commentary published in the Feb. 11, 2019, issue of the Annals of Internal Medicine, John Torous raises concerns about Facebook’s suicide prevention efforts.

“Facebook is now monitoring how you use Facebook and somehow they’re running an algorithm to determine your risk of committing suicide,” Torous told Global News in an interview.

According to the company’s website, when something a user types or posts is flagged by a computer algorithm, trained employees assess that person’s risk of suicide. A decision is then made about whether to call local first responders for emergency assistance.
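Facebook has not published the details of this system. As a rough illustration of the triage flow the company describes, the sketch below pairs a toy automated scorer with a human-review step; every name, phrase list, and threshold here (score_post, RISK_PHRASES, REVIEW_THRESHOLD) is invented for illustration and is not Facebook’s actual model or criteria.

```python
# Hypothetical sketch of the described triage flow: an automated scorer
# flags posts, flagged posts go to a trained human reviewer, and the
# reviewer decides whether to escalate to emergency services.
# All keywords, thresholds, and function names are assumptions.

REVIEW_THRESHOLD = 0.7  # assumed cut-off for routing a post to human review

# Toy phrase list standing in for a real machine-learned classifier.
RISK_PHRASES = {
    "want to die": 0.9,
    "end it all": 0.8,
    "goodbye forever": 0.6,
}

def score_post(text: str) -> float:
    """Toy stand-in for the real classifier: returns a 0-1 risk score."""
    text = text.lower()
    return max(
        (weight for phrase, weight in RISK_PHRASES.items() if phrase in text),
        default=0.0,
    )

def human_reviewer_confirms_risk(post: str) -> bool:
    """Placeholder: a real system would queue the post for a trained reviewer."""
    return True

def triage(post: str) -> str:
    """Route a post: no action, or human review followed by a decision."""
    if score_post(post) < REVIEW_THRESHOLD:
        return "no action"
    # In the described system, a trained employee assesses the flagged post.
    if human_reviewer_confirms_risk(post):
        return "contact local first responders"
    return "offer support resources"

if __name__ == "__main__":
    print(triage("had a great day at the lake"))  # -> no action
    print(triage("I just want to die"))           # -> contact local first responders
```

The two-stage design reflects the article’s account: the algorithm only decides what a human sees, and the escalation decision, calling first responders or not, stays with a person.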

“In the last year, we’ve helped first responders quickly reach around 3,500 people globally who needed help,” Facebook CEO Mark Zuckerberg wrote in a November blog post. 

Whether those people actually needed help, and whether they received it in time, are questions Torous said he would like to see answered.

“We don’t know, as the public, is this working? Is it not working?” he said. “If we’re being experimented on, we have a right to know what is the outcome? Is it benefiting or hurting us?”

According to Tom Keenan, a technology expert at the University of Calgary, being flagged as suicidal could impact a person’s life and make it difficult for people to get insurance, find employment or travel.

“Certain places put a suicide attempt on your police record and the border services people have access,” Keenan said.

Still, mental health advocates say the potential to save lives could outweigh privacy concerns and ethical risks.

“People who consider suicide don’t actually want to die,” said Mara Grunau, the executive director of the Calgary-based Centre for Suicide Prevention. “What happens is, as they’re approaching suicidal crisis, they’ll put out invitations and it might be through messaging.”

Grunau said computer algorithms might be able to help detect these cries for help, but friends and family members should be vigilant as well. She urged anyone who is concerned about someone to ask them directly whether suicide is something they’re thinking about.

“Ask the person directly [and] use the word suicide so they know very clearly what you’re saying.”

In a statement emailed to Global News, Facebook’s global head of safety said the company was committed to being more transparent about its suicide prevention efforts.

“Suicide prevention experts say that one of the best ways to prevent suicide is for people in distress to hear from friends and family who care about them,” Antigone Davis said. “Facebook is in a unique position to help because of the friendships people have on our platform — we can connect those in distress with friends and organizations who can offer support.

“Experts also agree that getting people help as fast as possible is crucial — that is why we are using technology to proactively detect content where someone might be expressing thoughts of suicide.”

Anyone thinking about suicide, or concerned about someone who may be considering suicide, can access help from Crisis Services Canada, either online or by calling 1-833-456-4566.