Jonathan Bailey, from Newcastle-under-Lyme, England, initially told his followers on Saturday that he intended to take his own life before broadcasting the livestream.
The broadcast attracted more than 400 viewers, several of whom begged him to stop what he was doing. Some users also reportedly attempted to alert Facebook, but the platform failed to stop the livestream, according to The Mirror.
Bailey—a popular member of the local community who was called Baz or Bazza by those who knew him—was pronounced dead at the scene by paramedics who had responded to reports of a “medical emergency.” The details of any post-mortem have not been revealed.
“Our thoughts go out to Mr Bailey’s family at this difficult time,” the Facebook spokesperson said. “We take the responsibility of keeping people safe on our platforms seriously, and we will continue to work closely with experts like The Samaritans to ensure our policies continue to support those in need.”
Facebook said that as soon as the company became aware of Bailey’s original post regarding his intent to take his own life, it provided him with “support resources,” which have been created in partnership with experts such as The Samaritans.
The company said mental health issues were “incredibly complex,” affecting people in different ways. “We want Facebook to be a space where people can share their experiences, raise awareness about these issues and seek support from one another, which is why we allow people to discuss suicide and self-injury,” the company’s Community Standards state.
Facebook has been working with mental health experts in order to help guide its policies on suicide prevention and safety. In September last year, it announced it was tightening its policies and expanding resources in relation to suicide and self-harm.
The company said it removes content encouraging, coordinating or providing instructions for suicide or self-harm, including certain graphic images of cutting, which experts say have the potential to unintentionally promote these acts.
A Facebook spokesperson told Newsweek that the company removes livestream content involving suicide attempts or self-harm as soon as it is made aware of it. However, if the content involves someone threatening suicide, the platform will leave it up, giving followers the chance to intervene. As was the case with Bailey’s death, Facebook will also send the poster information about local organizations that can provide support.
In the U.S. and other parts of the world, the company uses artificial intelligence to monitor posts and identify people who may be at risk of suicide, based on certain phrases or concerned comments from others. Worrying posts from an individual in distress can also be reported directly to the platform by loved ones.
When cases like these are flagged, a trained member of Facebook’s Community Operations team determines whether the person may genuinely be at risk. If they are, the individual is provided with support resources, including helpline numbers. In cases where the person may be in imminent danger, Facebook can also contact relevant local authorities.
According to the company’s Community Standards, Facebook also removes any content that “identifies and negatively targets victims or survivors of self-injury or suicide seriously, humorously or rhetorically.”
However, photos or videos depicting a person’s death by suicide that are determined to be newsworthy are allowed on the platform, although they are restricted to adults over the age of 18 and carry a sensitivity screen so that it is clear to viewers that the content may be distressing.
Suicide is among the leading causes of death in the United States, claiming the lives of more than 47,000 people in 2017, according to the CDC. There is evidence to suggest that the use of the internet and social media can influence pro-suicide behavior, while also providing new opportunities for assistance and prevention, according to a study published in the American Journal of Public Health.
If you have thoughts of suicide, confidential help is available for free at the National Suicide Prevention Lifeline. Call 1-800-273-8255. The line is available 24 hours, every day.
This article was updated to clarify and add further information about Facebook’s policy on removing livestream content of suicide and self-harm.