Facebook adopts Artificial Intelligence to prevent suicide instances

In an interesting development, Facebook has turned to Artificial Intelligence to help prevent suicides among its users, and intends to update its tools and services towards that end. With this, Facebook has once again demonstrated its efforts to build a safe community on and off the platform.

Citing on its blog that a death by suicide occurs somewhere in the world every 40 seconds, Facebook says the best way to prevent suicide is to help those in distress be heard by the people who care about them. Its live-streaming feature Facebook Live and its Messenger service are among the places where it intends to bring the suicide prevention tools.

Facebook has given users the option of reaching out to the company directly, or of reporting a post to it if they find the post concerning. “We have teams working around the world, 24/7, who review reports that come in and prioritize the most serious reports like suicide”, Facebook said in the blog post.

Though suicide prevention tools have been available on Facebook for more than 10 years, this is the first time the company has used AI for them. It is not, however, the company’s first use of AI: Facebook already uses artificial intelligence to monitor offensive material in live video streams.

How would it use AI?

Pattern recognition algorithms are what Facebook is working on. Though it has long offered help to those thought to be at risk of suicide, it has relied on other users to report such cases to Facebook. To reduce this dependence on human reporting and make the process more autonomous, algorithms are being developed and trained on previous posts that have been reported or flagged.
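
Facebook has not published its model, so purely as an illustrative sketch, a classifier of this kind could be trained on the text of posts that were previously reported or flagged. Everything in the example below – the library choice (scikit-learn), the sample posts and the labels – is a hypothetical stand-in for whatever Facebook actually uses internally:

    # Illustrative only: a tiny text classifier trained on posts that were
    # previously reported or flagged (examples and labels are hypothetical).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # 1 = post was reported for suicide/self-injury concern, 0 = not reported
    posts = [
        "I can't take this pain anymore, nothing matters",
        "I feel so alone and I don't want to be here",
        "Great hike with friends this weekend!",
        "Anyone know a good pizza place nearby?",
    ]
    labels = [1, 1, 0, 0]

    # TF-IDF features plus logistic regression stand in for whatever
    # pattern recognition model Facebook uses in production.
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(posts, labels)

    new_post = "I'm in so much pain and I don't know how to go on"
    score = model.predict_proba([new_post])[0][1]
    # A higher score could make the option to report the post more prominent.
    print(f"concern score: {score:.2f}")

In practice such a model would be trained on a far larger, carefully reviewed dataset and, as Facebook stresses, would only surface posts for human review rather than act on its own.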

“This artificial intelligence approach will make the option to report a post about ‘suicide or self injury’ more prominent for potentially concerning posts like these”, the company said in its blog post.

The pattern recognition would also identify posts that are likely to include thoughts of suicide. For instance, mentions of pain or sadness, or phrases such as “are you okay” and “I am worried”, would be picked up, reviewed by the network’s community operations team, and acted on immediately.
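
As another illustrative-only sketch, such phrase signals could be used to decide when a post is escalated to the human review queue. The phrase list and the threshold below are assumptions made up for the example, not Facebook’s actual rules:

    # Illustrative only: escalate a post for human review when enough comments
    # contain phrases suggesting that friends are worried (phrases and
    # threshold are hypothetical, not Facebook's actual rules).
    CONCERN_PHRASES = ["are you okay", "i am worried", "i'm worried about you", "please call me"]

    def comment_concern_score(comments):
        """Count comments containing at least one concerned-sounding phrase."""
        return sum(
            1 for comment in comments
            if any(phrase in comment.lower() for phrase in CONCERN_PHRASES)
        )

    def should_escalate(comments, threshold=2):
        """Route the post to the community operations review queue if enough
        comments express concern."""
        return comment_concern_score(comments) >= threshold

    comments = ["Are you okay? Message me anytime", "I am worried about you", "Nice photo!"]
    print(should_escalate(comments))  # True -> hand off to human reviewers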

Facebook has partnered with organizations like Crisis Text Line, the National Eating Disorder Association and the National Suicide Prevention Lifeline to reduce suicides, which, according to a National Center for Health Statistics study, jumped 24% in the US alone between 1999 and 2014.

In a nutshell, this is what FB intends to do:

  • Integrate suicide prevention tools to help people in real time on Facebook Live
  • Offer live chat support from crisis support organizations through Messenger
  • Streamline the reporting of posts about suicide with the help of artificial intelligence

Srishti Deoras
Srishti currently works as Associate Editor at Analytics India Magazine. When not covering the analytics news, editing and writing articles, she could be found reading or capturing thoughts into pictures.
