
Facebook’s AI will scan ‘cry for help’ posts to help prevent suicides

Facebook is looking to make its contribution to tackling the world's mental health issues by using artificial intelligence. The social media platform's AI will scan users' posts for signs they're having suicidal thoughts.

This wouldn't be the first time we've seen Facebook try to reach out to those who might be crying out for help.

Back in 2015, Facebook paired up with mental health organizations and rolled out tools that allowed users to report statuses that contained suicidal thoughts or that indicated self-harm.

Once a post was reported, the suicide prevention tools, which were created in conjunction with Facebook's "clinical and academic partners," encouraged those looking to hurt themselves to speak with a mental health expert at the National Suicide Prevention Lifeline.

Now, with the use of AI, statuses and live feeds won't have to be reported by users in order to raise a red flag. The AI, trained on previously reported cases, will be able to scan content and detect the telltale signs of danger on its own.

So, how will it work? According to a Facebook blog post, the social media empire will use pattern recognition to detect posts or live videos where someone might be expressing thoughts of suicide.

Plus, the AI program will use “signals like the text used in the post and comments (for example, comments like “Are you ok?” and “Can I help?”).”
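Facebook hasn't published the details of its model, but for the curious, here's a minimal sketch of what that kind of pattern recognition could look like in Python, using scikit-learn's off-the-shelf tools. Everything here is hypothetical, including the toy training examples, the flag_post helper, and the 0.5 threshold; the real system is proprietary and far more sophisticated.

```python
# Hypothetical sketch of pattern-recognition flagging -- NOT Facebook's actual system.
# It combines the post text with comment text (e.g. "Are you ok?") as classifier input,
# mirroring the "signals" described in Facebook's blog post.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: invented examples standing in for labeled posts.
posts = [
    "I can't take this anymore, nothing matters",  # concerning
    "Had a great day at the beach with friends",   # benign
    "I just want it all to end",                   # concerning
    "Excited for the new season of my show",       # benign
]
labels = [1, 0, 1, 0]  # 1 = flag for review, 0 = no action

# TF-IDF text features feeding a logistic-regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(posts, labels)

def flag_post(post_text, comments, threshold=0.5):
    """Score a post together with its comments; concerned comments
    like 'Are you ok?' add signal to the post's own text."""
    combined = post_text + " " + " ".join(comments)
    score = model.predict_proba([combined])[0][1]
    return score >= threshold  # flagged posts would go to human reviewers

# Example: the comments themselves push the score up.
print(flag_post("goodbye everyone", ["Are you ok?", "Can I help?"]))
```

In a real deployment, a flagged post wouldn't trigger any automatic action on its own; per Facebook's description, it would be routed to trained human reviewers who decide whether to reach out or contact first responders.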

In a Facebook post, the company's CEO, Mark Zuckerberg, said the use of AI could be a big help in identifying those who are looking to harm themselves before it's too late.

“Starting today we’re upgrading our AI tools to identify when someone is expressing thoughts about suicide on Facebook so we can help get them the support they need quickly. In the last month alone, these AI tools have helped us connect with first responders quickly more than 100 times.”

Of course, the issue of privacy does raise questions, and the use of AI and its data is scary AF. Even Facebook's chief security officer, Alex Stamos, addressed these concerns on Twitter:

https://twitter.com/alexstamos/status/935184558797889536

The data could be used to profile users, and even though Facebook has been testing the AI program in the US, countries in the European Union aren't having it.

According to the blog post, the tool won’t be active in any EU nations, because data protection laws prevent companies from profiling users in this way.

Globally, this could help a whole bunch of people by getting them the professional help they need in record time. Just to put it in perspective, the World Health Organization estimates that there is approximately one suicide death every 40 seconds, which works out to roughly 800,000 deaths a year.

Suicide is also the tenth leading cause of death in the US, accounting for more than one percent of all deaths, according to statistics we found on Mental Health America. On top of that, nine out of ten people who attempt suicide and survive do not go on to complete suicide at a later date.

We can only hope Facebook is not hiding behind suicide prevention and using the AI program’s collected data to harm us even more.

Anyway, stay safe and remember there is always someone you can call. National Suicide Prevention Lifeline: 1-800-273-8255