Facebook to Use Artificial Intelligence to Prevent Suicides

Facebook launched live-video streaming last year as a new way for users to share their experiences in real time. But as Facebook Live has gone mainstream, users have also captured some of reality's more painful moments, including fights, murders, racism, and even suicides. Earlier this year, a woman was arrested for abusing her child after taping the child to a wall and sharing the footage on Facebook Live.

But the social network announced on Wednesday that they are adding suicide-prevention tools to Facebook Live that let viewers of a live video report it to Facebook and reach out to the person directly. Facebook will also begin providing resources that guide viewers through reporting the video and put the person in immediate contact with a support line.

Facebook also announced they are testing live chat support from crisis support organizations through Facebook Messenger, partnering with Crisis Text Line, the National Eating Disorder Association, and the National Suicide Prevention Lifeline.

[Image: Live reporter support tool, via Facebook.com]

The company also stated that they are creating reporting tools using artificial intelligence. Facebook indicated that they would draw on input from mental-health experts to test a reporting process that applies computer pattern recognition to posts that have been reported for suicide risk. When the AI identifies a potential risk in a post, the option to report that post for suicide or self-harm will be made more prominent. Facebook has also started using artificial intelligence to identify posts that express thoughts of suicide; those posts are then reviewed by the company's Community Operations team so that the user can be offered help.
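Facebook has not published the details of this pattern recognition, but at a high level such systems typically score a post's text against examples that were previously reported and route high-scoring posts to human reviewers. The sketch below is a minimal, hypothetical illustration of that idea using off-the-shelf text classification; the training phrases, the threshold, and the `flag_for_review` helper are invented for demonstration and do not reflect Facebook's actual implementation.

```python
# Illustrative sketch only: a tiny text classifier that flags posts whose
# language resembles previously reported examples, so they can be surfaced
# to human reviewers. This is NOT Facebook's system; the example posts,
# labels, and threshold are made up for demonstration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples (1 = previously reported for suicide risk).
posts = [
    "I can't take this anymore, I want to end it all",
    "nobody would miss me if I were gone",
    "great game last night, what a comeback",
    "anyone have a good pasta recipe?",
]
labels = [1, 1, 0, 0]

# TF-IDF features plus logistic regression: one basic form of the
# "computer pattern recognition" the article describes.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(posts, labels)

def flag_for_review(post: str, threshold: float = 0.5) -> bool:
    """Return True if the post should be prioritized for human review."""
    risk_score = model.predict_proba([post])[0][1]
    return risk_score >= threshold

print(flag_for_review("I feel like giving up on everything"))
```

In practice, a flagged post would not be acted on automatically; as the article notes, it would make the reporting option more prominent or queue the post for a human review team.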

Facebook will also be launching a video advertising campaign to raise awareness about how to help people considering suicide.

“We have teams working around the world, 24/7, who review reports that come in and prioritize the most serious reports like suicide,” Facebook explained in a blog post. “We work to address posts expressing thoughts of suicide as quickly and accurately as possible.”

It is important to note that at least three Facebook users have live-streamed their suicides since the start of 2017.