Parents have been warned about a terrifying online "game", Momo, that encourages self-harm.

Police have warned parents about a dangerous online “game” named Momo, which appears to be targeting kids through YouTube and WhatsApp.

The “game” involves an avatar known as “Momo” contacting children and asking them to complete a series of tasks, which include self-harm. It is not yet clear if this is an actual person or an automated bot.

In the UK, there have been several reports of children interacting with the character online. And in South America, the game has been linked to the deaths of three teens.

Ireland’s State Police Force issued a warning this week on Facebook about the “game”, urging parents to be vigilant.

“The Momo challenge is a form of cyberbullying where momo asks to be contacted through a social media site and then asks the person to perform a series of dangerous tasks including self-harm,” the statement read.

“Please, please, please always supervise your children or those that are vulnerable while online.”

In September 2018, a 16-year-old boy and a 12-year-old girl from the town of Barbosa, Colombia, were believed to have taken their own lives within 48 hours of each other after taking part in the sinister challenge, local news outlets reported.

Police in Argentina were investigating whether a 12-year-old girl’s death in July was due to playing Momo.

What is Momo?

Momo is an online “game” operated through the messaging service WhatsApp and YouTube. It has been likened to the infamous online game Blue Whale, which was reportedly linked to the deaths of at least 130 Russian teenagers in 2016.

Like Blue Whale, the game is based on the player completing escalating challenges.

A warning released by Mexican authorities.

The account then threatens users with consequences if they disobey the game's orders. It is believed the operator hacks the person's phone or otherwise obtains videos, pictures or personal information, which they then threaten to release if the victim does not comply.

The avatar for 'Momo' is a horrifying image of a woman with a distorted face and bird legs for arms. It was created by Japanese artist Midori Hayashi, who is not associated with the game.

It is believed the game originated in Japan, but is now a growing trend in Europe and South America, even prompting Mexican authorities to release a warning to parents and produce posters about it.


Momo likened to Blue Whale and The Slender Man myth

The set-up of Momo sounds eerily similar to the Blue Whale game that reportedly claimed the lives of at least 130 Russian teens in 2016.

The social media game involved children being set escalating tasks such as self-harming and watching horror movies. On the 50th day of the challenge, the victims were told to take their own lives.

Similarities to the online legend of Slender Man have also been suggested.

In May 2014, two 12-year-old girls in Wisconsin, US, stabbed their 12-year-old classmate, claiming it was because the Slender Man told them to.

The attackers, Anissa Weier and Morgan Geyser, were later found not guilty by reason of insanity.

Anxious parents were told at the time of the Blue Whale phenomenon to protect their children by talking to them, and the same advice applies to Momo.

UK children's charity the National Society for the Prevention of Cruelty to Children told The Mirror that parents should reinforce to their children that they should not feel pressured into doing anything that makes them feel unsafe or scared.

"Parents should talk with their children and emphasise that they can make their own choices and discuss ways of how to say no.

"Reassuring a child that they can still be accepted even if they don’t go along with the crowd will help stop them doing something that could hurt them or make them uncomfortable."

Suicide tips hidden in YouTube videos

Meanwhile, in the US, parents are warning others to watch out for suicide tips hidden within videos on YouTube Kids.

A Florida doctor and mum found footage of a man offering instructions on how to commit suicide spliced a few minutes into gameplay videos and other innocent child-aimed clips, The Washington Post reports.

A spokesperson for YouTube said the platform has rules against videos that encourage harm and constantly removes clips that break its strict policies.

However, the spokesperson said the platform relies on user flagging, as well as smart detection technology, to find these videos.

If you or anyone you know is experiencing suicidal thoughts, you are urged to phone Lifeline on 13 11 14.