Facebook announces a new tool to prevent suicide.

Social media is often perceived as having a negative impact on the mental health of those who use it.

A growing body of research suggests Facebook can make people feel worse about themselves, while concerns have been raised about the impact of certain online content on people who are susceptible to eating disorders.

However, Facebook’s latest announcement is being lauded as a positive step forward in suicide prevention.

Yesterday, the social media giant outlined a new tool, to be rolled out first in the US, that will “provide more resources, advice and support” to people potentially grappling with suicidal thoughts. The new feature has been developed with guidance from a number of US mental health organisations.

Users will be able to notify Facebook if they see troubling updates and content from their friends appearing in their Newsfeed. These reports will then be reviewed by a trained team, active around the clock, to confirm whether that person is at risk.

If so, that person will receive a notification from Facebook letting them know a friend is concerned for their wellbeing, and providing links to support services.

The person who flagged the post will also receive access to support resources, including options for them to call or message their distressed friend or reach out for further support.

Georgie Harman, CEO of Beyond Blue, views this as a positive example of a social media platform being proactive about the wellbeing of its users. "Anything that intervenes when someone is in crisis; anything that points people to help and support has to be a good thing. In principle, I think it sounds very promising — at the end of the day, [social media] is not going away, it's where people are and we have to find ways to make it work," she says.

"I think social media platforms absolutely have a responsibility in this space ... when an individual is in a digital environment and is expressing suicidal thoughts or feelings or they're in high distress, there has to be an intervention."

To some, Facebook's reporting feature might sound like an intrusive approach. The thought of anonymously flagging a friend's post as troubling is confronting, and yet it could be the trigger that person needs to seek crucial support.

Ms Harman says that whenever she speaks to people who have contemplated, or attempted, suicide in the past, they always point to a critical moment when an intervention saved their life — even though it might have angered or upset them at the time.

"Someone told me once that sometimes, you have to risk a friendship to save a life," Ms Harman says. "People are very afraid of saying the wrong thing and crossing boundaries in terms of respecting people's privacy ... this is another strategy we can put in the tool kit that helps people get people connected to help."

In recent years, a number of digital platforms have taken action to protect or support their users' mental wellbeing. For example, Instagram's current guidelines prohibit accounts, hashtags and images that promote or glorify self-harm, and the company encourages users to flag content of this nature.

Beyond Blue's digital platforms use technology that can identify 'trigger words' in forum posts — these are phrases that are shown to indicate distress or high risk of suicide. From there, the team can quickly contact the organisation's support service to reach out to the person who wrote the post.

"We opened those [forums] knowing we have a responsibility to ... intervene if necessary, and move quickly to remove inappropriate content. We have clear guidelines and rules and we're diligent about speaking to them," Ms Harman explains.

"We need to take this stuff seriously — it can be a matter of life or death. It is happening, we need to get major players and major platforms like Facebook, Twitter, Instagram and others to get on board with this and be another way in which we can connect people with help and break down stigma."

Although it's important for social media players to take action on behalf of their users, Ms Harman also points out that members of digital communities are often quite protective of one another, and can be trusted to police inappropriate behaviour.

"Society has a great way of regulating itself. If you look at social media conversations sometimes, there's often quite negative and disrespectful behaviour, but quite quickly you see other community members will come in and ... tell people to pull their heads in. Social media can be very protective of people and it can be a lifeline."

What do you think of Facebook's new tools? How do you think social media can better look after its users?

There are a number of organisations in Australia that offer crisis support services for suicide prevention, information and resources, both for people who are suicidal and their concerned loved ones. These include:

beyondblue: call 1300 224 636, visit www.beyondblue.org.au

Lifeline: call 13 11 14, visit www.lifeline.org.au

Suicide Prevention Australia: visit www.suicidepreventionaust.org

The Black Dog Institute: visit www.blackdoginstitute.org.au