Hugging Face today announced that it is partnering with Amazon Web Services (AWS) in a bid to democratise machine learning. With this partnership, the duo looks to accelerate the availability of next-generation machine-learning models while helping developers build, train, and deploy the latest ML models in the cloud using purpose-built tools.
This move also comes against the backdrop of strategic partnerships between OpenAI and Microsoft, Anthropic and Google, and others, which have been taking significant strides in AI, alongside the recent advancements in transformer models (GPT-3.5, LaMDA, etc), diffusion models (DALL·E, Imagen, etc), and large language model-based chatbots (ChatGPT, Bing Chat and Bard). “However, most of these popular generative AI models are not available, widening the gap of machine learning capabilities between largest tech companies and everyone else,” writes Hugging Face in its blog post.
Hugs & Smiles – A Much-Needed Combo in AI/ML
With this partnership, Hugging Face will use AWS as a preferred cloud provider so developers can access AWS’s tools, including Amazon SageMaker, AWS Trainium, AWS Inferentia and others, to train, fine-tune, and deploy ML models on AWS.
Hugging Face chief Clem Delangue said that accessibility and transparency are critical to sharing progress and creating tools to use these new capabilities wisely and responsibly. He believes that Amazon SageMaker and AWS-designed chips will enable their team and the machine learning community to convert the latest research into open-source models so anyone can build upon them.
This deal will also allow developers to optimise the performance of their models for their specific use cases while lowering costs, alongside building next-generation AI models. But what’s in it for Amazon? The partnership helps the tech giant draw developers into its Amazon SageMaker and AWS-designed chip ecosystem.
AWS chief Adam Selipsky believes that Hugging Face and AWS are making it easier for customers to access popular machine learning models to create their own generative AI applications with the highest performance and lowest cost. He said that this partnership shows how generative AI companies and AWS can work together to put this innovative technology into the hands of more customers.
Hugging Face has become one of the most popular hubs for machine learning, hosting more than 100,000 free and open-source machine learning models that are downloaded more than 1 million times daily by researchers, data scientists, and machine learning engineers. Hugging Face said AWS is the most popular place to run models from the hub. The team also said that, since the start of their collaboration, usage of Hugging Face on Amazon SageMaker has grown exponentially.
In May last year, Hugging Face announced a similar partnership with Microsoft, where the company introduced Hugging Face Endpoints on Azure, a new service to turn Hugging Face models into scalable production solutions.