
Hugging Face Makes OpenAI’s Worst Nightmare Come True

While both OpenAI and Hugging Face pitch their services as making AI more ‘democratic’, there’s a big difference.

The battleground for generative AI is becoming an increasingly intense one, and it isn’t just the big tech companies that want to stake a claim. There’s big money to be made here, and it’s coming in from new directions. Last week, open-source AI platform ‘Hugging Face’ announced a partnership with AWS with the promise to “make AI open and not the other way around”. The deal lets Hugging Face integrate its models with the AWS cloud, and developers will be able to work with any of the models available on Hugging Face using AWS’s managed ML service, ‘SageMaker’.
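For developers, the workflow is fairly direct. As a rough sketch, assuming the SageMaker Python SDK and an AWS execution role are in place, deploying a model from the Hugging Face Hub to a SageMaker endpoint looks something like the following; the model ID, task, instance type and container versions here are illustrative choices, not details from the announcement.

# Minimal sketch: deploy a Hugging Face Hub model to a SageMaker endpoint.
# Assumes the `sagemaker` SDK is installed and an AWS execution role is available
# (e.g., when running inside a SageMaker notebook).
import sagemaker
from sagemaker.huggingface import HuggingFaceModel

role = sagemaker.get_execution_role()  # IAM role with SageMaker permissions

hub_config = {
    "HF_MODEL_ID": "distilbert-base-uncased-finetuned-sst-2-english",  # any Hub model
    "HF_TASK": "text-classification",                                  # pipeline task
}

model = HuggingFaceModel(
    env=hub_config,
    role=role,
    transformers_version="4.26",  # container versions are assumptions; check the AWS docs
    pytorch_version="1.13",
    py_version="py39",
)

# Spin up a real-time inference endpoint and query it.
predictor = model.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")
print(predictor.predict({"inputs": "Open-source AI keeps the playing field level."}))

predictor.delete_endpoint()  # clean up to avoid idle-instance charges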

Democratising AI for enterprises

The next day, OpenAI sneaked in with a new developer platform for users to run its newer models, like GPT-3.5 and a couple of its Davinci variants, on dedicated capacity. Sam Altman’s startup is calling the offering ‘Foundry’, and although it hasn’t revealed its cloud provider, we can safely assume it is Microsoft Azure.

While the services from both OpenAI and Hugging Face appear to stand for making AI more ‘democratic’, there’s a big difference. For Microsoft-backed OpenAI, Foundry is “designed for cutting-edge customers running larger workloads”, i.e., companies.

Additionally, this compute won’t come cheap. A leaked pricing sheet for Foundry found its way onto Twitter: even the lightweight version of GPT-3.5 is said to cost USD 78,000 for three months or USD 264,000 for a year.

Foundry has already found its first customers: social media messaging platform Snapchat and Coca-Cola will both be integrating GPT-3.5 into their businesses through the service.

Democratising AI for everyone

Hugging Face, on the other hand, is fighting this mentality of concentrating AI in a few hands with deep pockets. It wants generative AI models to be accessible to everyone, not only enterprises. While announcing the partnership, AWS and Hugging Face released a statement explaining why the generative AI realm needs more open-source models.

Adam Selipsky, CEO of AWS, stated, “Generative AI has the potential to transform entire industries, but its cost and the required expertise puts the technology out of reach for all but a select few companies.”

Clement Delangue, CEO and founder of Hugging Face, also concurred: “The future of AI is here, but it’s not evenly distributed. Accessibility and transparency are the keys to sharing progress and creating tools to use these new capabilities wisely and responsibly. Amazon SageMaker and AWS-designed chips will enable our team and the larger machine learning community to convert the latest research into openly reproducible models that anyone can build on.”

OpenAI’s reaction: A cheaper API

Hugging Face’s continued efforts to break open the playing field are already yielding results. A couple of days ago, the Altman-led startup introduced an API that lets users build ChatGPT tech into their own applications and products.

The happy surprise was that the API would be priced at USD 0.002 per 1,000 tokens (roughly 750 words), making it ten times cheaper than the existing GPT-3.5 variants. For tiny startups, and even individual researchers, this made a world of difference.
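For a sense of scale, here is a minimal sketch of calling the new API with the openai Python package’s ChatCompletion interface as it existed at the time, along with a rough cost estimate at the announced rate; the prompt is purely illustrative.

# Minimal sketch: call the ChatGPT (gpt-3.5-turbo) API and estimate cost
# at the announced USD 0.002 per 1,000 tokens.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarise why open-source AI matters in one sentence."},
    ],
)

print(response["choices"][0]["message"]["content"])

# Rough cost estimate from the usage field returned with every response.
total_tokens = response["usage"]["total_tokens"]
print(f"Tokens used: {total_tokens}, approx cost: ${total_tokens / 1000 * 0.002:.5f}")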

ML engineer Mark Tenenholtz mulled over how OpenAI had pulled it off and tweeted, “I’d give my left eye to read a book written on the challenges of deploying GPT-3 and the engineering endeavour it took to drop the price 10x with ChatGPT’s API”. 

While the API model wasn’t necessarily great for all purposes, since users get no access to the underlying model weights, gradients or training data, there were undeniable advantages. For companies without in-house ML experts, or even for an individual developer, the API was a blessing.

But as OpenAI is battling to maintain supremacy in generative AI, Hugging Face is looking at other avenues to keep the AI research space open. 

Heated competition

Just today, EleutherAI announced that it was forming a nonprofit foundation called the ‘EleutherAI Institute’. Founded by a decentralised group of researchers, the entity has unsurprisingly been backed by Hugging Face, Emad Mostaque’s Stability AI, former GitHub CEO Nat Friedman, Lambda Labs and Canva, all of which still operate independently in their respective business segments.

To call Hugging Face merely successful would be selling it short. Last May, the company closed a massive USD 100 million Series C funding round, after which it was valued at USD 2 billion.

Besides the money it makes from its own subscription model, partnering with big names like AWS pays off in two ways: it brings Hugging Face more revenue and keeps a potential monopoly at bay.


Poulomi Chatterjee

Poulomi is a Technology Journalist with Analytics India Magazine. Her fascination with tech and eagerness to dive into new areas led her to the dynamic world of AI and data analytics.
