
Facebook should hide suicide live stream videos, mental health advocates say.

Facebook’s policy to allow the live broadcast of self-harm on its platform has attracted concern among mental health advocates.

The power to go live and unfiltered to the internet from a phone has opened up enormous possibilities to transform the way we communicate.

But it has created a wave of ethical questions that the world’s biggest social network, Facebook, is being forced to navigate very publicly.

Users have live streamed the aftermath of fatal shootings in the US, drive-by attacks and racially motivated abuse.

Police around the world now fear there could be a disturbing trend of suicides being live streamed.

In March, the social media giant expanded its suicide prevention tools to Facebook Live, which gives Australian support groups the opportunity to target young people in the moment of their distress.

The scale of the task at hand is growing: according to documents recently leaked to The Guardian, moderators were escalating thousands of reports each fortnight.

According to The Guardian, a recent policy update shared with moderators highlighted they were “now seeing more video content — including suicides — shared on Facebook” and that “[Facebook doesn’t] want to censor or punish people in distress who are attempting suicide”.

“Experts have told us what’s best for these people’s safety is to let them live stream as long as they are engaging with viewers,” The Guardian reported.

“However, because of the contagion risk [that some people who see suicide are more likely to consider suicide], what’s best for the safety of people watching these videos is for us to remove them once there’s no longer an opportunity to help the person.


“We also need to consider newsworthiness, and there may be particular moments or public events that are part of a broader public conversation that warrant leaving up.”

It is new and complex ground for mental health advocates dealing with emerging platforms like Facebook Live, according to SANE Australia, a national charity that helps those affected by mental illness.

“We would prefer that these events weren’t live streamed, but I think as soon as it’s identified that this content is being streamed if it can be responded to as quickly as possible and that means being able to report the content, escalating to emergency services and support services where possible,” Dr Michelle Blanchard from SANE Australia said.

“I would probably prefer that video being hidden from public view while it’s possible to still get in touch with that person and intervene,” she said.

Jono Nicholas, CEO of ReachOut Australia — a youth mental health initiative — agreed.

“We have concerns that live streaming suicide or self-harm can place a lot of pressure on community members who may not have the skills to intervene. It could also put vulnerable young people at greater risk,” he said.

The Guardian’s article referred to documents which show “how Facebook will try to contact agencies to trigger a ‘welfare check’ when it seems someone is attempting or about to attempt suicide”.

In a statement to the ABC, Facebook's head of global policy management, Monika Bickert, said the company worked "hard to make Facebook as safe as possible while enabling free speech".

“This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously,” she said.


“Over the next year, we’ll be adding 3,000 people to our community operations team around the world — on top of the 4,500 we have today — to review the millions of reports we get every week, and improve the process for doing it quickly.”

Keeping up with community expectations ‘tricky’

David Hunter, Associate Professor of medical ethics at Flinders University, said it was tricky territory to navigate because community expectations could shift so quickly.

“There’s no straightforward pointing the finger or saying, ‘Facebook is bad because of this’ or ‘Facebook is good because of this’,” he said.

“There is this question about responsibility and part of the question is about whether Facebook is a platform or a publisher.

“You could say the same things about the internet — there’s a lot of vile stuff on the internet. Do the creators of the internet have a responsibility to remove that? General opinion is probably not.”

Whether Facebook is truly a platform that merely hosts the infrastructure facilitating content, or a publisher of that content itself, remains an ongoing debate.

“Insofar as Facebook profits from it, it makes it much harder to make the ‘infrastructure — therefore it’s not our responsibility’ argument,” Dr Hunter said.

He suggested a Reddit-style approach to content moderation, where users could vote items up or down.

“People can set their threshold for what they’re willing to tolerate, they can define what’s offensive and if it gets to a level where people might say, ‘I don’t want offensive content’, then it doesn’t appear on their feed,” he said.

This post originally appeared on ABC News.


© 2017 Australian Broadcasting Corporation. All rights reserved.