
Facebook to review ‘reporting logic’ after New Zealand shootings were streamed live

March 17: The horrific attacks on two mosques in New Zealand were streamed live on Facebook. Social media platforms have come under heavy criticism for not reacting fast enough to stop the spread of violent and hateful material. Robin Gill discusses this pressure with Joan Donovan, director of the Technology and Social Change Research Project at Harvard University – Mar 17, 2019

Facebook has pledged to review its “reporting logic and experiences for live and recently live videos” after video depicting the shootings in Christchurch, N.Z., was streamed live on the social network.


In a Wednesday statement issued by Guy Rosen, Facebook’s VP of integrity, the company outlined numerous actions it is undertaking after video of the shootings that killed 50 people showed up on the platform.

WATCH: March 15 — Role of social media in Christchurch shootings

The social media giant explained — again — that it never received any reports about the video while it was broadcasting, and that the first one came in 12 minutes after it ended.


When a video is reported while it’s broadcasting, it’s “prioritized for accelerated review,” Facebook explained.

“We do this because when a video is still live, if there is real-world harm we have a better chance to alert first responders and try to get help on the ground,” the statement said.

READ MORE: No one reported New Zealand mosque shooting video while it was live, Facebook says

Facebook said that it expanded this “acceleration logic” to cover videos that had recently gone live.

Consistent with the company’s focus on suicide prevention, Facebook said that “to date we applied this acceleration when a recently live video is reported for suicide.”

The New Zealand broadcast, Facebook said, was reported for reasons “other than suicide and as such it was handled according to different procedures.”

Nevertheless, it is looking at what it can do to “expand the categories that would get to accelerated review.”


WATCH: March 16 – New Zealand shooting – PM says they’ve attempted to remove video of mosque shootings

Facebook went on to say that it had difficulty removing the video entirely because numerous versions of it circulated on the platform.

Facebook said it blocked more than 800 different versions of the video, and that it is “learning to better understand techniques which would work for cases like this with many variants of an original video.”


Among the actions aimed at preventing the spread of such content in the future, Facebook pledged to improve its matching technology so that it can stop videos like this from spreading on the social network.

It also said it would work to respond more quickly to live video of this nature.

“This includes exploring whether and how AI can be used for these cases, and how to get to user reports faster,” it said.

The social media company pledged to offer continual updates as it learns more.
