Amidst the rise of Indic LLMs, Bhabha AI has unveiled Gajendra, an early release of its 7B Hindi-Hinglish-English instruction-tuned language model.
Built on Sarvam AI's OpenHathi, which is itself based on Llama 2, Gajendra is specialised for instruction following across Hindi, Hinglish, and English.
Looking ahead, Bhabha AI is exploring how to filter English instruction examples that translate well into Hindi. As part of this effort, the company has released initial versions of both the dataset and the corresponding model.
Bhabha AI has thanked Sarvam AI and AI4Bhārat for their earlier contributions in this domain. Emphasising its commitment to working with the open-source community, the company aims to accelerate the development and release of Hindi LLMs.
Bhabha AI’s Hugging Face page is currently maintained by three team members – Satpal Singh Rathore, Sourabh Singh, and Arjun Singh.
Bhabha AI has also previously released five datasets on Hugging Face: translation-classify, news-summary, Aksharantar-hindi, alpaca-gpt4-hindi-trans, and Hi-Instruct-v0.