Chatbots In Mental Health. Friendly But Not Too Friendly. 


Kunal Kislay

Mental health is the proverbial elephant in the room that no one wants to address. India is on the verge of a mental health epidemic, yet there is hardly any public discourse on ways to prevent or treat it, and few steps are being taken at the scale required to support the growing number of people with mental health issues. The COVID-19 pandemic is largely to blame for the massive spike in cases. The isolation of lockdown, the fear and uncertainty of job cuts, and the general discomfort of being unable to control several aspects of life have triggered severe mental trauma in people across the country. The issue is further complicated by the fact that no one knows when things will return to normal, if at all. 

There is a huge gap between the treatment that should be available and the help actually at hand, easily and cost-effectively. Even in developed countries, the ratio of psychiatrists, psychologists, psychiatric social workers, and mental health nurses to patients is 1:10,000. This lacuna in the system means that most people suffering from mental health issues never get the help they need.  

Many digital interfaces are emerging as viable complementary services to fill some of this gap. Artificial-intelligence-based solutions, developed in close collaboration with healthcare professionals, provide a person with assistance and, often, a measure of companionship. They can also bring down the cost of psychiatric diagnosis and treatment. Many people face the stigma our society attaches to psychiatric disorders, which often hampers effective treatment. 



Chatbots are Natural Language Processing (NLP) based frameworks that interact with human users through spoken, written, and visual language. Built specifically to communicate with people struggling with mental health problems, chatbots have the potential to be useful tools. According to experts, “suicide prediction and prevention, identification of predictors for a response, and identifying which particular drug is best suited for a particular patient are some of the areas where AI has been found to be useful in psychiatry.”

NLP-based Personalisation

A chatbot can simulate a conversation on various platforms using text or audio, and sometimes video too. Some chatbots are fully automated, while others keep a human in the loop. The underlying AI frameworks need to be trained on large amounts of data so that these bots can cope with the complexity of human communication and understand cultural nuances.

Woebot is a fully automated conversational agent that treats depression and anxiety using a digital version of cognitive behaviour therapy (CBT), which is primarily used for behaviour modification in patients. Other chatbots, like Ellie, can detect subtle changes in facial expressions, rate of speech, or length of pauses and build a diagnosis accordingly. If a deep-seated problem is detected, the patient is offered the option of meeting an actual therapist, and relevant helpline numbers are shared.
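That hand-off step can be sketched in a few lines of Python. This is a toy illustration under assumed names (`needs_escalation`, `respond`, a placeholder `HELPLINE`), not the logic of Woebot, Ellie, or any real product; production systems use trained risk classifiers rather than a keyword lookup.

```python
# Toy sketch of the escalation step: if a message suggests a deep-seated
# problem, the bot abandons the scripted flow and points the user to a
# human therapist and a helpline. Keywords and names are illustrative.

HELPLINE = "000-0000"  # placeholder: substitute a real local helpline number

CRISIS_KEYWORDS = {"hopeless", "self-harm", "suicide", "end it all"}

def needs_escalation(message: str) -> bool:
    """Crude lexical check for crisis language."""
    text = message.lower()
    return any(keyword in text for keyword in CRISIS_KEYWORDS)

def respond(message: str) -> str:
    if needs_escalation(message):
        return ("Talking to a person may help more than I can. "
                "Would you like to be connected to a therapist? "
                f"You can also call the helpline at {HELPLINE}.")
    return "Tell me more about how you are feeling."
```

The point of the sketch is the control flow, not the detection: once risk is flagged, the bot stops playing therapist and routes the user to humans.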

A person trying to get over the death of a loved one can benefit from a chatbot that speaks in the voice of the deceased. Over time, its messages can help the person process the trauma of sudden loss and the absence of closure. This concept was explored beautifully in Samsung’s #VoiceForever ad for the Bixby Voice Assistant, in which a mother’s voice is integrated with the assistant to help a little girl come to terms with her loss. 

Chatbots addressing mental health are built on the premise of cognitive behavioural therapy (CBT). CBT uses structured exercises to encourage a person to question and change their habits of thought, a format well suited to a step-by-step software guide or chatbot. Timely interventions can help manage mental health conditions: by combining natural language processing with clinical expertise, the bot encourages patients to reframe their negative thoughts into positive ones, curating a cathartic and therapeutic experience for the end-user. Karim, designed by the Silicon Valley startup X2AI, is an Arabic-speaking chatbot that helps Syrian refugees deal with their mental health issues. 
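The scripted reframing step can be pictured as a small rule table that matches a message against patterns for common cognitive distortions and replies with a Socratic prompt. The patterns and prompts below are illustrative assumptions, not the actual scripts of Woebot or Karim, which pair NLP models with clinician-written content:

```python
# Toy sketch of scripted CBT-style reframing: detect a cognitive
# distortion with a regex and answer with a prompt that nudges the
# user to question the thought. Patterns and prompts are illustrative.
import re

DISTORTION_PROMPTS = [
    # (pattern suggesting a distortion, reframing prompt)
    (re.compile(r"\b(always|never)\b", re.I),
     "That sounds like all-or-nothing thinking. Can you recall one exception?"),
    (re.compile(r"\b(everyone|nobody)\b", re.I),
     "Is it really everyone, or a few people? Who might see it differently?"),
    (re.compile(r"\bworst\b", re.I),
     "What is the most likely outcome, rather than the worst one?"),
]

def reframe(message: str) -> str:
    """Return the first matching reframing prompt, or an open question."""
    for pattern, prompt in DISTORTION_PROMPTS:
        if pattern.search(message):
            return prompt
    return "Thanks for sharing. What was going through your mind just then?"
```

Even this trivial version shows why the format suits software: each exercise is a fixed stimulus-response rule, which is exactly what a script can deliver step by step.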

Bridging Or Widening The Gap?

In response to the unprecedented COVID-19 disruption, most of the world was coerced into lockdown. Staying cooped up at home for extended periods became extremely uncomfortable and stressful, and there were few channels to vent or relax. The isolation at home is compounded by the fact that when one does step out, it is onto a sea of masked people: no familiar faces, no smiles, no handshakes, no hugs.

This onslaught of loneliness will have long-term effects on people’s psyche, some of which are already manifesting in depression and rising suicide rates. People have had emotional breakdowns; they feel nervous, tense, stressed, and lonely, and suffer panic attacks, anxiety, depression, and sleep disturbances. Compounding this are the fear of losing one’s livelihood and freedom to go out, and worry about one’s own health and that of loved ones. According to Statista, as of May 31, approximately 28% of respondents in the United States said their mental health was among their main concerns about COVID-19.

A mental illness crisis has burgeoned in parallel with the disruption caused by the pandemic, and experts are now turning to AI solutions to help people maintain their mental equilibrium. These AI chatbots are especially helpful in areas where physical access to care is an ordeal. 

Therapy bots have been providing much-needed assistance to overworked mental healthcare professionals. In reality, what a chatbot can currently handle is still very limited and largely within the purview of research. Humans are complicated by nature, even those not struggling with mental health issues. People suffering from depression, anxiety, schizophrenia, and suicidal thoughts have multiple layers of thinking and multiple faces that they put on for their human interactions. A chatbot often helps bring down that wall, because its biggest weakness is also its biggest strength: it will never be judgemental, but it will never have empathy either. Most questions a chatbot asks come from a script based on the CBT technique.

There is a lack of emotional bonding between the two parties, which is what a person suffering from mental health issues is often looking for: someone who ‘cares’. Some of the more primitive mental health chatbots don’t use sophisticated neural nets to learn; instead, they rely on rather simplistic answer trees.  
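An answer tree of that kind can be represented as a nested dictionary: each node holds the bot’s line and a child node for every canned user choice. The node texts below are made up for illustration; the deliberate simplicity, no learning and no NLP, is the point the paragraph above makes:

```python
# Minimal answer-tree chatbot: a fixed tree walked one canned choice
# at a time, with no learning and no language understanding.
ANSWER_TREE = {
    "text": "How are you feeling today?",
    "choices": {
        "anxious": {
            "text": "Would you like a breathing exercise or to talk it through?",
            "choices": {
                "breathing": {"text": "Breathe in for 4, hold for 4, out for 4.",
                              "choices": {}},
                "talk": {"text": "What is making you anxious right now?",
                         "choices": {}},
            },
        },
        "okay": {"text": "Glad to hear it. Check in again any time.",
                 "choices": {}},
    },
}

def walk(tree: dict, answers: list) -> str:
    """Follow the user's choices down the tree and return the bot's reply."""
    node = tree
    for answer in answers:
        node = node["choices"].get(answer, node)  # stay put on unknown input
    return node["text"]
```

Anything outside the pre-scripted choices simply bounces off the tree, which is exactly why such bots feel mechanical compared with a human listener.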


Chatbots can effectively deliver early interventions and a primary diagnosis of mental health disorders. But it’s best not to let bots engage in full-blown therapy armed with half-baked psychology theories and no human supervision: any escalation or wrong turn can result in a greater crisis for the patient.

The Way Forward

People tend to make connections, always. The movie ‘Her’ explores this phenomenon in detail: an AI-based programme is like a friend in your pocket, and the protagonist grows possessive of, and even falls in love with, ‘Her’, knowing full well it is just a software program. There is a real risk of attachment to, and dependency on, a chatbot because, unlike its human counterpart, a chatbot is never tired and is available 24×7, all year round. 

Chatbots can provide companionship, support, and therapy, which can drastically reduce the load on therapists. They also emerge as a viable option for people who face challenges of accessibility and affordability.  

However, scepticism has been rising over the use of chatbots for sensitive and subjective issues like mental health. Confidentiality is the foremost concern, followed by the universality of application, the lack of standardisation and monitoring, patients developing escapist tendencies through over-dependence on artificial bots, and inaccurate diagnosis of severe mental disorders. 

Going forward, the tech crusaders need to develop chatbots that strike the fine balance of being empathetic without being judgemental. People using them should be made aware of the chatbot’s limitations and the areas in which it can help, making for a transparent system that’s better than none at all.

If you tell Siri, ‘I am feeling depressed’, it replies, ‘I am always here if you want to talk’. Sometimes that’s all a chatbot needs to do: talk to you. That will help a person self-heal far better than googling ways to commit painless suicide. 


Copyright Analytics India Magazine Pvt Ltd
