
Bhabha AI Unveils Gajendra, Hindi-Hinglish-English Model

The model is built on top of Sarvam AI's OpenHathi.

Amidst the rise of Indic LLMs, Bhabha AI has unveiled Gajendra, an early release of their 7B Hindi-Hinglish-English instruction fine-tuned language model. 

Built on top of Sarvam AI’s OpenHathi, which is in turn built on Llama 2, Gajendra is a specialised model for instruction following in Hindi, Hinglish, and English.

Check out the model here.
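
For readers who want to try it out, below is a minimal sketch of loading and prompting the model with the Hugging Face transformers library. The repository ID used here is an assumption based on the model’s name; check Bhabha AI’s Hugging Face page for the exact path.

```python
# Minimal sketch: loading Gajendra with Hugging Face transformers.
# NOTE: the repo ID below is an assumption, not confirmed in the article.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BhabhaAI/Gajendra-v0.1"  # assumed repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # Llama-2-based 7B model; adjust dtype/device to your hardware
    device_map="auto",
)

prompt = "भारत की राजधानी क्या है?"  # "What is the capital of India?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```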

Looking ahead, Bhabha AI is exploring how to filter examples that can be translated from English to Hindi, and has released initial versions of both the dataset and the corresponding model from this effort.

Bhabha AI has thanked Sarvam AI and AI4Bhārat for their earlier contributions in the domain. Emphasising its commitment to working with the open-source community, the company aims to accelerate the development and release of Hindi LLMs.

Bhabha AI’s Hugging Face page is currently run by a team of three – Satpal Singh Rathore, Sourabh Singh, and Arjun Singh.

Bhabha AI has also previously released five datasets on Hugging Face: translation-classify, news-summary, Aksharantar-hindi, alpaca-gpt4-hindi-trans, and Hi-Instruct-v0.
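
These datasets can be pulled directly with the Hugging Face datasets library; a brief sketch is below, where the "BhabhaAI/" organisation prefix is an assumption and should be verified against the actual dataset pages.

```python
# Sketch: loading one of the released datasets with the datasets library.
# NOTE: the "BhabhaAI/" org prefix is assumed, not confirmed in the article.
from datasets import load_dataset

dataset = load_dataset("BhabhaAI/alpaca-gpt4-hindi-trans", split="train")
print(dataset[0])
```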


Mohit Pandey
Mohit writes about AI in simple, explainable, and often funny words. He's especially passionate about chatting with those building AI for Bharat, with the occasional detour into AGI.