
Social media safety tools to take note of on World Suicide Prevention Day

By Kara Santos Published Sep 09, 2021 4:36 pm

Sept. 10, 2021 marks World Suicide Prevention Day. 

The World Health Organization estimates that every year, almost 800,000 people die by suicide. 

Experts have also noted that suicide rates can increase in prolonged periods of crisis. With the COVID-19 global pandemic dragging on for more than a year and a half, many people are left in distress and vulnerable to mental health problems and suicidal behavior, which can manifest online. 

Amid the pandemic, some people have turned to social media to express their thoughts and talk openly about mental health concerns brought about by the ongoing public health crisis.

Here’s what some major social media platforms are doing to address suicide prevention globally.

Twitter

According to Twitter’s help center, users who see someone on the platform expressing thoughts of self-harm or suicide can alert the dedicated Twitter team that handles these reports.

“After receiving a report of someone who may be thinking about self-harm or suicide, Twitter will contact the affected individual to let them know that someone who cares about them identified that they might be at risk of harm. We will also encourage them to seek support, and provide information about dedicated online and hotline resources that can help,” according to the microblogging site.

Twitter also advises concerned parties to offer support to the person directly.

“If you are concerned and know the person involved, it can be helpful to contact them personally and encourage them to seek advice from dedicated services who may be able to help.”

Clicking the hashtag #WorldSuicideDay in the Philippines reveals the following message at the top of the Twitter feed:

“If you are struggling, or know someone going through a severe crisis, you are not alone. Our partners, #MentalHealthPH and the National Center for Mental Health (NCMH), are here and ready to listen. Just call NCMH at 1553, (+63) 917-899-8727, or (02) 7-989-8727, or the Crisis Line by In-Touch at (02) 8893 7603.”

Directly from the platform, the site also points users to the NCMH Crisis Hotline and to Mental Health PH, an organization that promotes and protects mental health through social media and digital technology.

This year, the organization is hosting an event via Twitter Spaces on Sept. 10 at 6:00 p.m. to discuss suicide and suicide prevention.

Facebook

In the past few years, Facebook has tightened its policies on difficult topics, including content about self-harm, suicide, and eating disorders, after consulting experts on these issues.

The social network generally allows people to discuss their mental health struggles openly with family, friends, and online support groups, as the company believes doing so can be beneficial. However, it has updated its policies to prevent the spread of more harmful imagery and content.

It changed its policy around self-harm images to no longer allow graphic cutting images, which can unintentionally promote or trigger self-harm. These types of images will not be allowed even if someone is seeking support or expressing themselves to aid their recovery, according to Facebook.

The social network also displays a sensitivity screen over images of healed self-harm cuts to avoid unintentionally promoting self-harm.

Facebook’s Help Center urges those who know someone in immediate danger of self-harm to contact local emergency services immediately for help.

“After you've called emergency services, connect with your friend or call someone who can. Showing that you care matters. Make sure they know that you're there for them and that they aren't alone.”

The social media platform also shares several helpful resources for those who are not under immediate threat of physical harm.

It advises Facebook users to:

  • Provide support to this person or contact a family member, friend, counselor or teacher who may be able to provide support. 
  • Find a local helpline in your country and share resources and contact information that this person may find helpful.
  • Report the content to Facebook so they can reach out to this person with information that may be helpful to them.

Instagram

Similar to Facebook's policies, Instagram makes it more difficult to find harmful content through its search and explore features. Instagram’s Community Guidelines support a safe and open environment for everyone, and the platform may flag or hide graphic or descriptive content that can be harmful or distressing to others, including content on suicide and self-harm.

According to Instagram's Help Center, users who notice someone they know exhibiting dangerous behavior online should contact local law enforcement or help them get to the next level of care. 

“Encouraging your friend to talk about what they're going through can be one of the most helpful things you can do for them. Being a good listener, and giving them the space they need to talk is important, as is following up with them regularly. You also can help by getting them to someone else they can trust, like a health care professional or another friend.”

Among its tools for support, Instagram allows Anonymous Reporting for Self-Injury Posts and Anonymous Reporting for Live Video. Users can report at-risk behavior during a live broadcast, and the person will receive a message offering help, support, and resources.

“If you see self-injury in a post or a post that makes you think the person who posted it is at-risk, report it and we will connect them to organizations that offer help,” according to the photo sharing platform.

YouTube

According to YouTube's help center, those who come across videos in which someone expresses suicidal thoughts or engages in self-harm should contact local emergency services immediately and flag the video to bring it to YouTube's attention.

"At YouTube, we take the health and well-being of all our creators and viewers seriously. Awareness and understanding of mental health is important and we support creators sharing their stories, such as posting content discussing their experiences with depression, self-harm, or other mental health issues. We do not, however, allow content on YouTube that promotes suicide, self-harm, or is intended to shock or disgust users."

YouTube's community guidelines prohibit creators from posting content that fits any of the descriptions below:

  • Promoting or glorifying suicide
  • Providing instructions on how to self-harm or die by suicide
  • Graphic images of self-harm posted to shock or disgust viewers

The guidelines apply to videos, video descriptions, comments, live streams, and any other YouTube product or feature. Content that violates these policies will be removed, and the creators responsible will receive strikes.

YouTube may also terminate a channel or account after a single case of severe abuse, or when the channel is dedicated to violating its policies.

(Banner image via Shutterstock)