Suicide rates have risen sharply in recent years. Many people suffer from depression; some face job-related stress, while others struggle with relationship issues. One thing common to nearly all of them is social media use. Today, a majority of people use social media, mostly Facebook, to express their feelings.
Facebook, the most popular social media platform, has decided to take action to help these people. So how exactly will Facebook assist in preventing suicide? It plans to integrate suicide prevention tools for its users, analysing posts, comments, Facebook Live streams, and Messenger conversations.
Facebook's life-saving algorithm
The social networking site has developed a pattern-recognition algorithm that spots warning signs in users' posts, as well as in the comments friends leave in response. Flagged posts and comments are then escalated to a human review team.
The feature will soon be available in the Facebook mobile app. After confirmation, the company will contact anyone presumed to be at risk of self-harm and suggest ways they can seek help. According to one suicide helpline chief, this move was not only necessary but critical. The tool is currently being tested in the US.
It is the first initiative to use AI technology to review messages in this way. Mark Zuckerberg also hopes to use similar algorithms to identify posts from terrorists and other harmful content. The platform announced a few further measures to tackle suicide risk: for Facebook Live broadcasts, it has partnered with several US mental health organisations so that vulnerable users can get help directly through Messenger.
Pattern recognition feature
Facebook has developed a pattern-recognition algorithm to detect when someone is struggling. It tracks signals such as talk of sadness, pain, or other strongly depressive language in posts. Once a post is identified, it is sent for rapid review by the network's community operations team. Speed is of the essence here; the challenge is to intervene in a way that does not feel invasive to the person.
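Facebook has not published the details of its algorithm, but the idea described above can be illustrated with a minimal sketch: score a post for distress signals, add weight when friends leave concerned replies, and flag anything over a threshold for human review. The phrase lists, scoring scheme, and threshold here are entirely hypothetical; a real system would rely on a trained classifier over far richer features, not a keyword list.

```python
import re

# Hypothetical distress signals in the user's own post (illustrative only).
DISTRESS_PATTERNS = [
    r"\bso sad\b",
    r"\bcan'?t go on\b",
    r"\bin (so much )?pain\b",
    r"\bend it all\b",
]

# Hypothetical signals in replies, e.g. friends expressing concern.
CONCERNED_REPLY_PATTERNS = [
    r"\bare you ok(ay)?\b",
    r"\bplease talk to (me|someone)\b",
    r"\bi'?m worried about you\b",
]

def score_post(post_text: str, comments: list) -> int:
    """Count distress signals in the post plus concerned replies in comments."""
    score = sum(bool(re.search(p, post_text.lower())) for p in DISTRESS_PATTERNS)
    for comment in comments:
        score += sum(bool(re.search(p, comment.lower()))
                     for p in CONCERNED_REPLY_PATTERNS)
    return score

def flag_for_review(post_text: str, comments: list, threshold: int = 2) -> bool:
    """Escalate to a human review queue once combined signals cross a threshold."""
    return score_post(post_text, comments) >= threshold
```

Combining the post's own language with friends' reactions mirrors the article's point that the algorithm checks both the post and the comments; the threshold keeps the (expensive) human review step for the strongest combined signals.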
In such circumstances, contact from a friend or family member is often more effective than a message from Facebook, and sometimes even that would not be appropriate. The objective is to help at-risk users without waiting until a video is complete and reviewed. When someone watching a live stream clicks on the menu, the app displays advice on how the viewer can help the broadcaster, and the stream is flagged for immediate review.
Some argue the video stream should be cut the moment there is any hint of suicidal talk, while experts counter that cutting the stream early reduces the chance for people to reach out and offer help. The new option to contact a crisis counsellor through the Facebook Messenger helpline tool is still limited to the US, but the system is set to roll out more widely soon.