Is Hugging Face the Next OpenAI?

While Hugging Face's open approach has garnered much appreciation, it has also given rise to questions about its future direction

Hugging Face is having its day in the sun. The leading open-source proponent is now valued at $4.5 billion after raising $235 million in Series D funding from Google, Amazon, Nvidia, Intel, AMD, Salesforce and others. With this round, Hugging Face has raised a total of $395 million, CEO Clement Delangue revealed in an interview.

Hosting over 500,000 AI models and an extensive collection of 250,000 datasets, the platform allows users to freely share their models and datasets, fostering a spirit of collaboration and innovation within the AI community.

What the Future Holds for Hugging Face

While Hugging Face’s open approach has garnered much appreciation, it has also given rise to questions about its future direction. As its influence grows, concerns have been raised that the platform could drift from its original open, collaborative ethos towards a more profit-driven model. This potential shift has sparked discussions reminiscent of OpenAI’s journey.

The cautionary “enshittification cycle”, discussed on platforms like Hacker News, is worth considering: a pattern in which a company’s user-centric services, initially subsidised by venture capital, give way to business-oriented approaches and eventual monetisation that diminish user satisfaction and engagement.

When asked during a recent interview about plans to put the fresh infusion of capital to work, Hugging Face CEO Clement Delangue said the company would use it to double down on hiring and on investments in open-source AI. However, he also spoke about profit and about validating revenue generation from AI.

“I think we’ve validated last year that there was massive usage for generative AI. This year we are validating that there is massive revenue for generative AI… we are on track to 5x our revenue this year with over 10,000 customers today,” Clem said in a recent interview, seemingly indicating that we are in the second phase of the ‘enshittification cycle’.

Way Too Much Power

Over the years, Hugging Face has created a thriving ecosystem that caters to all participants. However, discussions have now turned to the potential consequences of Hugging Face’s growing dominance in model hosting and frameworks. Some stakeholders worry about the platform’s future control over these resources, which could stifle innovation and centralise power.

For instance, the recent change in licensing of its text generation inference (TGI) library raised concerns.

If Hugging Face becomes less community-friendly, forks or alternatives may well emerge. This pattern is visible in other community-based platforms such as Reddit, which blocked access to its API to protect its data.

The open-source community’s discussion has been filled with mixed views on potential limitations and future restrictions on the platform. Some users speculate that venture capitalists and investors may eventually impose restrictions in a move to monetise, similar to Docker’s monetisation model; the change in licensing for Hugging Face’s text generation inference library is seen as an early indicator.

Docker, at a certain point, sent an email titled ‘Sunsetting free team organisations’ to every Docker Hub user who had created an open-source ‘organisation’, telling them their account would be deleted unless they upgraded to a paid plan. The price went from $0 to $420 per year, and naturally, users were not pleased. Docker went from being the darling of the open-source community to being widely resented.

Unprecedented Growth

Despite these concerns, Hugging Face’s journey showcases its transformative growth since 2016. The balance between commercial success and a collaborative spirit remains an ongoing quest for the company. The evolving landscape raises the question: Is Hugging Face treading a path similar to OpenAI’s, where the quest for profit could lead to a shift from open to closed source?

Indeed, Hugging Face’s trajectory evokes comparisons with OpenAI’s evolution. OpenAI began as an open-source initiative and later transitioned to a more controlled model. Hugging Face now stands at a crossroads, where it must decide how to harness its exponential growth while maintaining its community-driven ethos.

Striking a Balance is the Way to Go

Striking the right balance will determine whether Hugging Face becomes the next OpenAI or charts a unique path that marries profit and collaboration in a harmonious synthesis. As Hugging Face ponders on its future direction, the tech community watches with a mix of excitement and caution, hoping that the platform’s journey continues to be guided by principles that have fuelled its rise. 

Hugging Face also plans to put the investment to good use. Clem recently announced a flurry of hiring initiatives on X, emphasising diversity and inviting individuals from non-traditional backgrounds to join the company’s collaborative AI efforts. The move was applauded for tapping into underrepresented talent that may not follow conventional paths, challenging conventional notions of capability.

The company’s job openings cover a wide spectrum, ranging from machine learning engineers to technical support engineers, account executives, cloud machine learning engineers, and more. The positions are spread across various locations, including the US and Europe, and all offer remote work opportunities. This marks Hugging Face’s commitment to fostering an inclusive environment and expanding its team with professionals from diverse backgrounds. The range of roles also highlights the company’s multidisciplinary approach to AI, spanning technical, customer support, and marketing domains.

Ultimately, the potential pivot towards closed-source profitability poses both challenges and opportunities, with implications that extend beyond the company itself, shaping the landscape of AI collaboration and innovation.


Shyam Nandan Upadhyay
Shyam is a tech journalist with expertise in policy and politics, and exhibits a fervent interest in scrutinising the convergence of AI and analytics in society. In his leisure time, he indulges in anime binges and mountain hikes.
