Chatbots have become an integral part of modern enterprises. For example, Mahindra has been working with Amazon Alexa for four years now, and the Mahindra XUV 700 was the first vehicle in India to integrate Alexa. Customers can use the Mahindra Skill for Alexa to check vehicle information such as fuel level or tyre pressure and remotely issue commands for locking or unlocking the car doors, pre-setting the cabin temperature, etc.
Earlier, ICICI Prudential Life launched a voice chatbot called ‘LiGo’ on Google Assistant to handle insurance-related queries.
Amazon’s Alexa, Google Assistant and Apple’s Siri are leading the way in conversational AI for general, customer-facing (B2C) use cases. Despite advances in research and language modelling, industry- and business-specific (B2B) chatbots have remained largely rule-based or menu-driven.
However, the conversational AI landscape has been undergoing a shift in the last few years.
At the Google I/O conference last year, Sundar Pichai said the company is exploring ways to cater to developer and enterprise customers. Further, he said the Language Model for Dialogue Applications (LaMDA) would be a huge step forward in natural conversation.
Rasa is an open-source conversational AI platform that helps developers adapt generalised language pipelines layered with injected vocabularies, albeit in a convoluted fashion; developers are left to do the heavy lifting.
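The "heavy lifting" in Rasa largely means hand-authoring intents and example utterances as training data. A minimal `nlu.yml` fragment in Rasa 3.x format might look like the sketch below; the intent names and example phrases are illustrative assumptions, not from any production bot.

```yaml
# Illustrative Rasa NLU training data (hypothetical intents and examples)
version: "3.1"
nlu:
- intent: check_balance
  examples: |
    - what is my account balance
    - show me my balance
- intent: loan_status
  examples: |
    - what is the status of my loan
    - has my loan application been approved
```

Developers must curate such examples, per intent and per language, before the pipeline can be trained — which is the injected-vocabulary effort the paragraph above alludes to.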
While tech giants and open-source initiatives are trying their best to push conversational AI solutions and products to enterprise customers, most of them are limited to supporting or assisting teams only to a certain level. Generalised platforms with probabilistic language models have largely remained confined to peripheral use cases (or trophy use cases, to be a bit unkind).
The customisation of chatbots plays a critical role in enhancing the end-user experience. Quick training and deployment of chatbots with minimal downtime is also key. Organisations use multilingual queries, speech-to-text methods, and NLP techniques to build such products.
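The NLP side of such a product typically starts with intent classification over user queries. The toy sketch below shows the idea with scikit-learn; the intents, training utterances, and model choice are all illustrative assumptions, not any vendor's implementation.

```python
# Toy intent classifier for a banking-style chatbot (all data hypothetical).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny set of example utterances mapped to intents.
utterances = [
    "what is my account balance",
    "show me my balance",
    "I want to check my loan status",
    "tell me about my loan application",
    "when is my credit card bill due",
    "credit card payment due date",
]
intents = [
    "check_balance", "check_balance",
    "loan_status", "loan_status",
    "card_due_date", "card_due_date",
]

# TF-IDF features feeding a linear classifier -- a common lightweight baseline.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(utterances, intents)

print(model.predict(["show my balance please"])[0])
```

In production, the predicted intent would route the query to the right dialogue flow, with speech-to-text and translation layers sitting in front of it for voice and multilingual access.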
Towards hyper-personalised solutions
Many companies today are looking at developing chatbots to address end customers’ problems in real time and offer them a seamless experience. BFSI, retail, agritech, and fintech are major adopters of AI chatbots.
In 2020, Axis Bank collaborated with Vernacular.ai to develop an AI solution that automates customer interactions via intelligent, human-like dialogue. Similarly, HDFC Bank uses a chatbot called ‘EVA’ to address customer queries. ICICI Bank uses iPal, which lets customers check account balances, credit card due dates, etc. Tata Capital, in partnership with Yellow.ai, developed the TIA chatbot to answer loan-related queries.
The core challenges faced by companies investing in chatbots include:
- High infrastructure cost
- High-end skill requirements
- Ease of access by the users
- Complex languages and dialects within each state in the country
- Lack of training datasets
A final thought
Today, the conversational AI market remains fragmented to a large extent. While creating chatbots has become easier, their impact is still not quantified. To that end, enterprises are looking to develop specialised solutions by taking advantage of domain knowledge, and deep learning and ML skills, to build products that understand customers and automate tasks seamlessly.
This article is written by a member of the AIM Leaders Council. AIM Leaders Council is an invitation-only forum of senior executives in the Data Science and Analytics industry. To check if you are eligible for a membership, please fill the form here.