Hugging Face Takes on ChatGPT With HuggingChat 

For now, it runs on Open Assistant's latest LLaMA-based model

Hugging Face, the go-to platform for open-source AI, has just released an open-source alternative to the internet's favourite chatbot, ChatGPT. Named HuggingChat, the release offers a range of functionalities and integrations catering to developers and users alike.

With its sleek web interface, HuggingChat is available for testing, giving users a firsthand experience. It can also be integrated with existing apps and services through Hugging Face's API, opening it up to customisation across a variety of domains. From writing complex code to composing emails and even crafting rap lyrics, HuggingChat is quite versatile.
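As a rough sketch of what such an integration might look like, the snippet below calls a Hub-hosted chat model through Hugging Face's Inference API using only the standard library. The model id and response shape are assumptions for illustration; check the model card of whichever chat model you deploy.

```python
# Hedged sketch: calling a Hub-hosted chat model via Hugging Face's
# Inference API. The model id below is an assumption, not an endorsement
# of a specific checkpoint; consult the Hub for current models.
import json
import urllib.request

API_URL = "https://api-inference.huggingface.co/models/OpenAssistant/oasst-sft-6-llama-30b"

def build_request(prompt: str, token: str) -> urllib.request.Request:
    """Construct the HTTP request; kept pure so it can be tested offline."""
    payload = json.dumps({"inputs": prompt}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

def chat(prompt: str, token: str) -> str:
    """Send the prompt and return the generated text.

    Requires network access and a valid Hugging Face API token.
    The [0]["generated_text"] response shape is typical for
    text-generation models but may vary per model.
    """
    with urllib.request.urlopen(build_request(prompt, token)) as resp:
        return json.load(resp)[0]["generated_text"]
```

In practice you would wrap `chat` in whatever retry and rate-limit handling your application already uses; the point is only that a hosted chat model is one HTTP call away.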

The model behind the chatbot was developed by Open Assistant, a passion project organised by LAION, the German nonprofit recognised for creating the dataset used to train Stable Diffusion, a text-to-image AI model. Open Assistant's efforts empower users to personalise and extend Hugging Face's product to their needs, while remaining efficient enough to run on consumer hardware.

However, challenges do exist. HuggingChat, like its counterparts, is not immune to setbacks. Depending on the questions asked, it may veer off course, a fact that Hugging Face acknowledges. Even so, the company says the chatbot shows it is now possible to build an open-source alternative to ChatGPT.

For now, it runs on Open Assistant's latest LLaMA-based model, but the long-term plan is to expose all good-quality chat models from the Hub. Because Meta's LLaMA is bound by a restrictive licence, LLaMA-based models cannot be distributed directly. Instead, Open Assistant provides XOR weights for its OA models, which users combine with their own copy of the original LLaMA weights.
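The XOR trick works because XOR-ing two tensors byte-by-byte yields a "delta" that is useless on its own but exactly recovers the fine-tuned weights when XOR-ed with the base model again. The sketch below, with hypothetical function names (this is not Open Assistant's actual tooling), shows the idea on a toy tensor:

```python
# Hedged sketch of XOR weight distribution: the delta can be shared
# freely, since it only reconstructs the fine-tuned model for someone
# who already holds the original (licensed) base weights.
import numpy as np

def make_xor_delta(base: np.ndarray, finetuned: np.ndarray) -> np.ndarray:
    """XOR the raw bytes of two same-shaped, same-dtype weight tensors."""
    return np.bitwise_xor(base.view(np.uint8), finetuned.view(np.uint8))

def apply_xor_delta(base: np.ndarray, delta: np.ndarray) -> np.ndarray:
    """Recover the fine-tuned tensor from the base weights plus the delta."""
    return np.bitwise_xor(base.view(np.uint8), delta).view(base.dtype)

# Toy "weights" standing in for real model tensors.
base = np.array([0.1, -0.5, 2.0], dtype=np.float32)
tuned = np.array([0.09, -0.48, 1.9], dtype=np.float32)

delta = make_xor_delta(base, tuned)       # distributable without the base model
restored = apply_xor_delta(base, delta)   # user combines with their own LLaMA copy
assert np.array_equal(restored, tuned)    # bit-exact recovery
```

Because XOR is its own inverse at the bit level, the round trip is exact, with no floating-point drift, which is why the scheme works for full-precision checkpoints.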

Until about two weeks ago, large language models, including those built on Meta's LLaMA, sat in a legal grey area because they were trained on ChatGPT output. Databricks found a way around this with Dolly 2.0. What sets Dolly 2.0 apart from other 'open source' models is that it is available for commercial use without paying for API access or sharing data with third parties.

But Meta's restrictive licence has not stopped the developer community from taking full advantage of the technology. Developers have optimised the model to run on even the most basic devices, introduced additional functionality, and built entirely new use cases on top of LLaMA.

Read more: 7 Ways Developers are Harnessing Meta’s LLaMA

PS: The story was written using a keyboard.
Tasmia Ansari

Tasmia is a tech journalist at AIM, looking to bring a fresh perspective to emerging technologies and trends in data science, analytics, and artificial intelligence.