Can AI Help Counsellors To Prevent Teen Suicides

  • The AI works like a training tool for counsellors to be prepared to take on the challenges of saving teenagers from self-harm.

The conversation around mental health has intensified in recent years. Dealing with troubled teenagers, however, is a different ball game altogether, one that can challenge even the best mental health experts. 

Most would agree that teen counsellors need more than a helping hand. Technology, specifically AI, may be the right contender for this arduous task. Still, AI for teenage counselling can sound like a stretch, as trusting technology with something this fragile is risky. 


Case in point: The Trevor Project

The Trevor Project is a not-for-profit suicide hotline for LGBTQ youth in America. It is leveraging AI tools to help suicidal LGBTQ teens reach a safe place before a counsellor steps in to help them feel accepted and validated. 

The Trevor Project is one of many crisis services with a very large caller base. Because its mission is critical, a crisis service like this cannot afford to fail or fall short. Recognising this, the group turned to artificial intelligence to help it handle every sensitive situation involving a young person whose life hangs in the balance. 

The AI has been used in counsellor training to prepare suicide responders to connect with people in distress: in roleplay, it mimics a distressed teen on the verge of suicide. The AI also acts as the first point of contact between a caller and the hotline, stalling disastrous outcomes until a human counsellor can take over the conversation.

It is a tool to help the organisation respond to the large volume of people reaching out for help, assisting volunteers in being available and providing timely support to those in need. The biggest challenge helplines face is identifying high-risk callers and prioritising who should receive aid first. The machine learning algorithm can perform suicide risk evaluations and predictions so that the responding team can act in time to reach the most vulnerable people first. 
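The prioritisation described above amounts to a max-priority queue: each caller arrives with a model-assigned risk score, and responders always take the highest-risk caller next. The sketch below is illustrative only, not the Trevor Project's actual system; the caller IDs and scores are invented.

```python
import heapq

class TriageQueue:
    """Serve callers in descending order of assessed suicide risk."""

    def __init__(self):
        self._heap = []    # min-heap of (-risk, arrival order, caller id)
        self._count = 0    # ties on risk are broken by arrival order (FIFO)

    def add_caller(self, caller_id, risk_score):
        # Negate the score so Python's min-heap behaves as a max-heap.
        heapq.heappush(self._heap, (-risk_score, self._count, caller_id))
        self._count += 1

    def next_caller(self):
        # Pop the caller with the highest risk score.
        return heapq.heappop(self._heap)[2]

queue = TriageQueue()
queue.add_caller("caller-A", 0.35)
queue.add_caller("caller-B", 0.92)
queue.add_caller("caller-C", 0.92)
print(queue.next_caller())  # prints "caller-B": highest risk, earliest arrival
```

A heap keeps both insertion and retrieval at O(log n), which matters when call volume spikes and the queue grows quickly.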

How AI Works in Suicide Prevention

The AI works as a training tool that prepares counsellors for the challenge of keeping teenagers from self-harm. A GPT-2-based chatbot has been deployed on the suicide hotline to work alongside volunteers and help prevent as many suicides as possible.

The AI builds a virtual persona of a potential victim from characteristics it has 'learned' from roleplay transcripts fed to it. This roleplay tool, called the 'crisis contact simulator', runs on AI and machine learning and is used both for training and for risk assessment. 

To facilitate this, the Trevor Project deployed OpenAI's GPT-2 algorithm, which uses natural language processing to produce human-like conversation, so that a person interacting with the chatbot feels as if they are speaking to a human. This gives counsellors the window they need to step in and take charge of the exchange. The GPT-2 chatbot has been trained on various sources, including 45 million pages from the web, which equip it with the basic structure and grammar of English, along with transcripts of past conversations.


Deconstructing a suicide call scenario:

When a suicidal individual first calls the helpline, they are asked to answer a basic question or two. These initial responses are crucial, since they are fed into a clinical assessment model.

NLP then kicks in to analyse the responses and match them to a risk level. High-risk words in the responses are identified, and the system classifies the caller on a suicide risk scale. Calls are then queued according to their position on that scale and assigned priorities, so that a counsellor can take the best action on each case based on the priority assigned to it. 
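The keyword-based screening step can be illustrated with a minimal sketch. The phrase list, weights, and thresholds below are invented for the example; a real system would use a trained clinical assessment model, not a hand-written word list.

```python
# Hypothetical high-risk phrases and weights -- illustrative only,
# not clinical guidance or the Trevor Project's actual model.
HIGH_RISK_PHRASES = {"end it all": 3, "hurt myself": 3, "no way out": 2,
                     "hopeless": 2, "alone": 1, "can't sleep": 1}

def risk_score(response: str) -> int:
    """Sum the weights of the high-risk phrases found in a response."""
    text = response.lower()
    return sum(w for phrase, w in HIGH_RISK_PHRASES.items() if phrase in text)

def risk_level(score: int) -> str:
    """Bucket a raw score onto a three-point risk scale."""
    if score >= 4:
        return "high"
    if score >= 2:
        return "medium"
    return "low"

answer = "I feel hopeless and alone, like there's no way out."
print(risk_level(risk_score(answer)))  # prints "high"
```

The resulting level would then determine the caller's position in the queue described earlier, with "high" cases routed to a counsellor first.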


The grit with which suicide prevention volunteers work tirelessly to engage with such individuals is truly remarkable. AI's ability to bring a measure of ease and insight to this process is commendable and very much welcome. 

Technology, especially AI, may never truly replace the human touch in counselling, certainly not any time soon. But if even a fraction of LGBTQ youth suicides can be prevented, more mental health organisations like the Trevor Project will follow suit and take the hand of technology. 


Copyright Analytics India Magazine Pvt Ltd
