
Facebook's new AI helps spot suicidal users

By James Walker     Mar 1, 2017 in Technology
Facebook has announced a new machine learning system it is using to power advanced suicide prevention tools on its social network. The company noted that someone dies by suicide every 40 seconds, and that many of those deaths could be prevented with early intervention.
In a blog post, Facebook said it has developed a new AI-driven system that can programmatically detect people who could be feeling suicidal. It looks for potential indicators of suicidal feelings in posts that users make.
Once triggered, the system flags the case for manual review by Facebook's human monitoring teams. Unlike the social network's current suicide detection systems, it can single out people who haven't been reported by other users. There's currently no indication of when it will roll out across the site.
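Facebook hasn't published details of its model, but the flow described here, a classifier scores each post and anything above a threshold is queued for human review, can be sketched in a toy form. The phrase list, scoring function and threshold below are purely illustrative assumptions, not Facebook's actual system:

```python
# Illustrative sketch only: Facebook has not disclosed its model.
# A toy classify-then-escalate pipeline: score each post against a
# (hypothetical) list of concern phrases, and queue anything above a
# threshold for manual review by a human team.

CONCERN_PHRASES = ["can't go on", "no reason to live", "want to end it"]

def score_post(text: str) -> float:
    """Return the fraction of concern phrases found in the post (0.0 to 1.0)."""
    text = text.lower()
    hits = sum(1 for phrase in CONCERN_PHRASES if phrase in text)
    return hits / len(CONCERN_PHRASES)

def flag_for_review(posts: list[str], threshold: float = 0.3) -> list[str]:
    """Return the posts whose score exceeds the threshold, in original order."""
    return [p for p in posts if score_post(p) > threshold]
```

In practice a system like this would use a trained text classifier rather than keyword matching, but the escalation structure, automated scoring feeding a human review queue, is the same.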
Facebook already helps people who have signalled suicidal thoughts by offering support options within its app. It connects users with close friends who may be able to help, suggesting a new conversation with pre-populated text that makes it easier to start talking about the topic. The prompt also links to suicide prevention help lines and offers immediate advice to the user.
The suicide prevention tools already available for written posts are now being extended to livestreamed videos on Facebook Live. Viewers will be able to report concern for the streamer's welfare using a new menu option. Reports will be flagged to Facebook, and the streamer will be presented with options to connect with a friend or contact an emergency help line.
Facebook recognised that it's in the unique position of being able to identify people who could be feeling suicidal and offer preventative advice before it's too late. Since its site is built on friendship, it has the capacity to directly connect people who need help with other users capable of supporting them.
Facebook suicide prevention tools
"Facebook is in a unique position – through friendships on the site – to help connect a person in distress with people who can support them," the company said. "It's part of our ongoing effort to help build a safe community on and off Facebook."
Facebook has partnered with suicide help organisations including the National Suicide Prevention Lifeline, Crisis Text Line and National Eating Disorder Association to create new ways for people to connect with support groups. These help lines are now available as business pages in Messenger, making it possible to start chatting from within the app. Facebook described the current system as a "test" that will expand to include more organisations in the next few months.
Facebook's announcement has been welcomed by experts. For a company often criticised for monitoring its users, this technology has been praised as something which has the potential to help people in need without feeling invasive. However, calls for Facebook to directly alert friends of potentially suicidal users have proved divisive, with many considering this to be beyond the social network's remit.
Dr John Draper, director of the U.S. National Suicide Prevention Lifeline, told the BBC he'd like Facebook to add this functionality, saying his organisation is already "discussing" the idea with the company. Facebook product manager Vanessa Callison-Burch downplayed the comments though, noting that Facebook doesn't always understand the "personal dynamics between people and their friends." The company is continuing to investigate new functionality to expand its suicide prevention tools, while treading the fine line between personal support and privacy invasion.
More about Facebook, AI, Artificial intelligence, machine learning, Social media