‘Its entire purpose is to be fun’, a media report said in 2017 after Hugging Face launched its AI-powered personalised chatbot. Named after the popular emoji, Hugging Face was founded by Clément Delangue and Julien Chaumond in 2016. What started as a chatbot company has transformed into an open-source provider of NLP technologies to companies such as Microsoft Bing.
The New York-based startup is extending access to conversational AI by creating abstraction layers for developers. These layers help users adopt conversational AI models such as BERT, XLNet, and GPT-2.
Recently, Hugging Face raised $40 million in Series B funding led by Addition. Lux Capital, A.Capital, Betaworks, NBA star Kevin Durant, MongoDB CEO Dev Ittycheria, former Salesforce chief scientist Richard Socher, and Datadog CEO Olivier Pomel participated in the round.
The startup said the fresh funds would be used to expand the workforce at their New York and Paris offices.
What Is Hugging Face?
In its early days, when Hugging Face was still a chatbot, co-founder Delangue said in an interview, “We’re building an AI so that you’re having fun talking with it. When you’re chatting with it, you’re going to laugh and smile — it’s going to be entertaining.” The app was a runaway hit. The startup wanted to prove that chatbots could be more than clunky customer-support interfaces. The app let users generate a digital friend (think Tamagotchi) to text back and forth with, and it could detect emotions and answer questions based on context.
Hugging Face aims to become the GitHub of machine learning and is one of the leading startups in the NLP space. Companies such as Apple, Monzo, and Microsoft Bing use its library in production.
Open Source
Hugging Face has a large open-source community, with the Transformers library among its top attractions. Transformers is backed by the deep learning frameworks PyTorch and TensorFlow. It provides thousands of pretrained models for text classification, information retrieval, question answering, translation, text generation, and summarisation. Transformers offers APIs to quickly download pretrained models and fine-tune them on users’ own datasets.
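As a rough illustration of how little code the library demands, the snippet below uses the `pipeline` API from Transformers to run sentiment analysis with a default pretrained model (this is a minimal sketch; it assumes the `transformers` package and a backing framework such as PyTorch are installed, and that the library downloads its default English sentiment model on first use).

```python
# Minimal sketch of the Transformers pipeline API.
# Assumes `pip install transformers torch` has been run; the first call
# downloads the library's default pretrained sentiment model.
from transformers import pipeline

# Build a ready-to-use sentiment classifier from a pretrained model.
classifier = pipeline("sentiment-analysis")

# Run inference on a piece of text; returns a list of label/score dicts.
result = classifier("Hugging Face makes NLP accessible.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```

The same one-liner style works for other tasks the article lists, such as `pipeline("translation_en_to_fr")` or `pipeline("summarization")`, which is a large part of the library's appeal.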
Transformers has been downloaded over a million times and has garnered over 42,000 stars on GitHub. Researchers at Google, Facebook, and Microsoft have extensively used the Transformers library in their projects.
The startup is focused on growing the open-source community around language model development. The company said there is a disconnect between research and engineering teams in NLP. Big tech companies do not fully embrace the open-source approach, and even in the few cases where they have, the open-sourced repositories are hard to use and poorly maintained.
Delangue said the democratisation of AI is key to extending the benefits of emerging technologies to smaller organisations. In an earlier interview, Delangue said, “I think one of the big challenges that you have in machine learning, it seems these days, is that most of the power is concentrated in the hands of a couple of big organisations. We’ve always had acquisition interests from Big Tech and others, but we believe it’s good to have independent companies — that’s what we’re trying to do.”
Credit: Crunchbase
Investor Interest
Hugging Face has become one of the fastest-growing open-source projects. In December 2019, the startup raised $15 million in a Series A funding round led by Lux Capital. OpenAI CTO Greg Brockman, Betaworks, A.Capital, and Richard Socher also invested in this round.
As per Crunchbase data, Hugging Face has raised over $60 million across four funding rounds to date.