Facebook will allow users to livestream attempts to self-harm because it “doesn’t want to censor or punish people in distress who are attempting suicide”, according to leaked documents.

However, the footage will be removed “once there’s no longer an opportunity to help the person” – unless the incident is particularly newsworthy.

The policy was formulated on the advice of experts, the files say, and it reflects how the social media company is trying to deal with some of the most disturbing content on the site.

The Guardian has been told that concern within Facebook about how people are using the site has increased in the past six months.

For instance, moderators were recently told to “escalate” to senior managers any content related to 13 Reasons Why – a Netflix drama about the suicide of a high school student – because of fears it could inspire copycat behaviour.

Figures circulated to Facebook moderators appear to show that reports of potential self-harm on the site are rising. One document drafted last summer says moderators escalated 4,531 reports of self-harm in two weeks.

Sixty-three of these had to be dealt with by Facebook’s law enforcement response team – which liaises with police and other relevant authorities.

Figures for this year show 5,016 reports in one two-week period and 5,431 in another.

The documents show how Facebook will try to contact agencies to trigger a “welfare check” when it seems someone is attempting, or about to attempt, suicide.

A policy update shared with moderators in recent months explained the shift in thinking.

It says: “We’re now seeing more video content – including suicides – shared on Facebook. We don’t want to censor or punish people in distress who are attempting suicide. Experts have told us what’s best for these people’s safety is to let them livestream as long as they are engaging with viewers.

“However, because of the contagion risk [ie some people who see suicide are more likely to consider suicide], what’s best for the safety of people watching these videos is for us to remove them once there’s no longer an opportunity to help the person. We also need to consider newsworthiness, and there may be particular moments or public events that are part of a broader public conversation that warrant leaving up.”

– Nick Hopkins

