Facebook’s policy to allow the live broadcast of self-harm on its platform has attracted concern among mental health advocates.
The power to broadcast live and unfiltered to the internet from a phone has opened up enormous possibilities for transforming the way we communicate.
But it has created a wave of ethical questions that the world’s biggest social network, Facebook, is being forced to navigate very publicly.
Users have live streamed the aftermath of fatal shootings in the US, as well as drive-by attacks and racially motivated abuse.
Police around the world now fear there could be a disturbing trend of suicides being live streamed.
In March, the social media giant expanded its suicide prevention tools to Facebook Live, which gives Australian support groups the opportunity to target young people in the moment of their distress.
The scale of the task at hand is growing — a recent leak of documents published in The Guardian reported that moderators were escalating thousands of reports each fortnight.
According to The Guardian, a recent policy update shared with moderators highlighted they were “now seeing more video content — including suicides — shared on Facebook” and that “[Facebook doesn’t] want to censor or punish people in distress who are attempting suicide”.
“Experts have told us what’s best for these people’s safety is to let them live stream as long as they are engaging with viewers,” The Guardian reported.
“However, because of the contagion risk [that some people who see suicide are more likely to consider suicide], what’s best for the safety of people watching these videos is for us to remove them once there’s no longer an opportunity to help the person.
“We also need to consider newsworthiness, and there may be particular moments or public events that are part of a broader public conversation that warrant leaving up.”
It is new and complex ground for mental health advocates dealing with emerging platforms like Facebook Live, according to SANE Australia, a national charity that helps those affected by mental illness.
“We would prefer that these events weren’t live streamed, but I think as soon as it’s identified that this content is being streamed if it can be responded to as quickly as possible and that means being able to report the content, escalating to emergency services and support services where possible,” Dr Michelle Blanchard from SANE Australia said.
“I would probably prefer that video being hidden from public view while it’s possible to still get in touch with that person and intervene,” she said.
Jono Nicholas, CEO of ReachOut Australia — a youth mental health initiative — agreed.
“We have concerns that live streaming suicide or self-harm can place a lot of pressure on community members who may not have the skills to intervene. It could also put vulnerable young people at greater risk,” he said.
The Guardian’s article referred to documents which show “how Facebook will try to contact agencies to trigger a ‘welfare check’ when it seems someone is attempting or about to attempt suicide”.
In a statement to the ABC, Facebook’s Head of Global Policy Management Monika Bickert said the company worked “hard to make Facebook as safe as possible while enabling free speech”.