Video-sharing site TikTok is struggling to take down clips showing a man killing himself.
The footage, which has been all over the platform for several days, originated on Facebook and has also been shared on Twitter and Instagram.
TikTok is hugely popular with children – and many have reported seeing the video and being traumatized by the content.
The app said it would ban accounts that repeatedly upload the clips.
“Our systems are automatically detecting and flagging these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide,” a representative said.
The spokesperson added that TikTok appreciates community members who have reported the content and warned others against watching, engaging with, or sharing such videos on any platform, out of respect for the person and their family.
Facebook said: “We removed the original video from Facebook last month, on the day it was streamed, and have used automation technology to remove copies and uploads since that time.”
‘My daughter could have post-traumatic stress disorder’
On Monday, Brenda’s 14-year-old daughter came running down the stairs. She was covering her mouth, crying, and saying she was going to vomit.
“She was in such a state, shaking like a leaf and properly sobbing,” Brenda, who lives near Edinburgh, said.
“I have never seen her that distressed. It was horrific and took ages to get the words out of her.”
Brenda explained that her daughter had seen the suicide video after it appeared in the recommended clips of TikTok’s For You section.
“She was scrolling through songs and funny videos when a bearded man in a white shirt appeared behind a desk,” the mother recalled, adding that shortly afterwards he was seen killing himself.
“I have heard about trolling and nasty things but this tops it all. I phoned the police but they reminded me that it’s not their job to police the web.
“My daughter was in a state of shock, still is in a state of shock, and this might stay with her for months.”
Since the incident, she added, her daughter had slept with the light on and kept reliving the images she had seen. She added that her daughter felt scared to leave the house and was missing days of school as a result.
Some users are reportedly sharing the video disguised behind images of kittens or other content. Others have put together their own videos warning about the clip and urging people to delete it.
TikTok’s algorithms often recommend content from people a user does not directly follow.
Several people have streamed their suicides on Facebook Live since its 2015 launch.
Facebook, which owns Instagram, has also faced criticism for hosting content that sensationalizes self-harm and suicide.
After the death of Molly Russell in 2017, her father said Instagram had “helped kill my daughter”.