Can AI Help Counsellors To Prevent Teen Suicides

The conversation around mental health has intensified in recent years. However, dealing with troubled teenagers is a different ball game altogether, and one that can challenge even the best mental health experts.

Most would agree that teen counsellors need more than a helping hand. Technology, specifically AI, may be the perfect contender for this arduous task. Still, AI for teenage counselling can sound like a stretch, since trusting technology with something this fragile is risky.


Case in point: The Trevor Project

The Trevor Project is a suicide hotline for LGBTQ youth in America. It is a not-for-profit organisation that leverages AI tools to help suicidal LGBTQ teens reach a safe place before they are counselled into feeling accepted and validated.

The Trevor Project is one of many crisis services with a very large caller base to cater to. Given the critical nature of its mission, a crisis service like this cannot afford to fail or fall short. Recognising these constraints, the group turned to artificial intelligence to help it handle every sensitive situation for every young person whose life is on the brink.

The AI has been used in counsellor training to prepare suicide responders to connect with people in distress. It powers a roleplay that mimics a distressed teen on the verge of suicide. The AI also acts as the first point of contact between a caller and the hotline, stalling disastrous outcomes until a human counsellor can take over the conversation.

It is a tool that helps the organisation respond to the large volume of people reaching out for help, keeping volunteers available and able to provide timely assistance to those in need. The biggest challenge helplines face is identifying high-risk callers and deciding who should receive aid first. The machine learning algorithm performs suicide risk evaluations and predictions so that the responding team can act in time to save the most vulnerable people first.

How AI Works in Suicide Prevention

The AI works as a training tool that prepares counsellors for the challenge of saving teenagers from self-harm. A GPT-2-based chatbot has been deployed on the suicide hotline to work alongside volunteers and help prevent as many suicides as possible.

The AI builds a virtual persona of a potential victim from characteristics it 'learns' from the many roleplay transcripts fed to it. This roleplay enactment tool, called the 'crisis contact simulator', runs on AI and machine learning and is used both for training and for risk assessment.

To facilitate this, the Trevor Project deployed OpenAI's GPT-2 model, which uses natural language processing to produce human-like conversation, so that a person interacting with GPT-2 could believe they are speaking to a human. This gives counsellors the window they need to step in and take charge of the exchange. The GPT-2 chatbot was trained on 45 million pages from the web, which equips it with the basic structure and grammar of the English language, and then on transcripts of past conversations.

Deconstructing a suicide call scenario:

When a suicidal individual first calls the helpline, they are asked a basic question or two. These initial responses are crucial, since they are fed into a clinical risk-assessment model.

NLP then analyses the responses and maps them to a risk level. High-risk words in the responses are identified, and the system classifies the caller on a suicide risk scale. Calls are then queued and prioritised according to their position on that scale, so that a counsellor can take the best action on each case based on the priority assigned to it.
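The triage flow above can be sketched in a few lines of Python. This is a deliberately simplified illustration, not The Trevor Project's actual model: the risk terms, their weights, and the `risk_score` and `triage` functions are all hypothetical, and a real clinical system would use a trained classifier rather than a hand-written keyword list.

```python
import heapq
from dataclasses import dataclass, field

# Hypothetical keyword weights -- purely illustrative. A real clinical
# assessment model would be trained on transcripts, not hand-coded.
RISK_TERMS = {
    "plan": 3,
    "tonight": 3,
    "pills": 2,
    "hopeless": 2,
    "alone": 1,
    "sad": 1,
}

def risk_score(intake_text: str) -> int:
    """Sum the weights of risk terms found in a caller's intake answers."""
    words = intake_text.lower().split()
    return sum(weight for term, weight in RISK_TERMS.items() if term in words)

@dataclass(order=True)
class QueuedCall:
    priority: int                       # negated score: heapq pops lowest first
    caller_id: str = field(compare=False)

def triage(calls: dict[str, str]) -> list[str]:
    """Order caller ids so the highest-risk callers are answered first."""
    heap = [QueuedCall(-risk_score(text), cid) for cid, text in calls.items()]
    heapq.heapify(heap)
    return [heapq.heappop(heap).caller_id for _ in range(len(heap))]

# Example: caller "b" mentions a plan for tonight and jumps the queue.
order = triage({
    "a": "i feel sad today",
    "b": "i have a plan for tonight",
    "c": "so alone and hopeless",
})
# order == ["b", "c", "a"]
```

The priority queue mirrors the article's "risk meter": the score decides position, and counsellors pop the most urgent call first.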


The grit with which suicide prevention volunteers work tirelessly to engage with such individuals is truly remarkable. AI's intervention to bring a sense of ease and insight into this process is commendable and very much welcome.

Technology, especially AI, may never truly (and certainly not any time soon) replace the human touch in counselling. However, if even a fraction of LGBTQ youth suicides can be prevented, more mental health organisations will follow The Trevor Project's lead and take the hand of technology.

Anju Nambiar
Anju is a writer from Mangalore with a particular taste for creating compelling business stories. She has been devoutly immersed in the tech world right from her engineering days and can't seem to shake off her curiosity for the latest enterprise technology.
