AWS Challenges Open Source with Bedrock

AWS has made Amazon Bedrock, its suite of foundation models, generally available


After four months of intensive development and testing, Amazon Web Services (AWS) has made Amazon Bedrock, its suite of foundation models, generally available. Since its debut, the service has proven vital in supporting the complex demands of enterprise users.

In July, AWS expanded the suite with additional models, including Anthropic's Claude 2 and Stability AI's SDXL 1.0, alongside its proprietary Titan Embeddings model.

Among the improvements AWS has made to Amazon Bedrock is compliance with regulatory standards, with a particular focus on the European Union's General Data Protection Regulation (GDPR). Vasi Philomin, Vice President and General Manager for Generative AI at Amazon, has affirmed this compliance, highlighting the company's commitment to safeguarding data privacy and protection.

A Hugging Face Competitor

Since throwing its hat into the generative AI ring, AWS has emerged as a formidable contender to open-source providers. AWS plans to soon integrate Meta's next-generation LLM, Llama 2, into Bedrock, along with models from AI21 Labs, Anthropic, Cohere, and Stability AI. While AWS claims to be the first to offer Llama 2 (in 13-billion- and 70-billion-parameter versions) as a fully managed generative AI service, other cloud-hosted generative AI platforms, such as Google's Vertex AI and Hugging Face, have been providing the models for some time.

AWS's models will be accessible through a managed API, optimised for performance on AWS infrastructure. This approach poses a challenge to Hugging Face, which has been the trusted host for these models since their debut. While pricing details for Llama 2 on Bedrock have not yet been disclosed, they are expected to be competitive with Hugging Face's existing pricing structure.
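For context, invoking a Bedrock-hosted model through the managed API is a single SDK call. The sketch below uses Python and boto3; the model ID, the Claude prompt envelope, and the response field are assumptions based on Bedrock's launch-era conventions, so check the console in your account and region for the identifiers actually available to you.

```python
import json


def build_claude_body(prompt: str, max_tokens: int = 256) -> str:
    # Claude models on Bedrock (pre-Messages API) expect this
    # Human/Assistant prompt envelope in the request body.
    return json.dumps({
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": max_tokens,
    })


def invoke(prompt: str) -> str:
    # Requires AWS credentials and Bedrock model access to be
    # configured; boto3 imported here so the helper above stays
    # usable without the SDK installed.
    import boto3

    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(
        modelId="anthropic.claude-v2",  # assumed model ID
        body=build_claude_body(prompt),
    )
    # The completion text sits in the JSON response body.
    return json.loads(response["body"].read())["completion"]
```

The same `invoke_model` call works for any model the account has access to; only the `modelId` and the model-specific request body change.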

When Llama 2 was first released, several Hugging Face users faced issues running the models and, initially, even struggled to gain access to them on the platform. Many received approval from Meta almost immediately, while their Hugging Face access requests remained pending for weeks.

Although AWS will be offering the same models as Hugging Face, developers are unlikely to readily shift to AWS. Hugging Face has carved out a niche as an inviting, engaging space for developers, whereas AWS holds greater appeal among enterprise users. Interestingly, the two companies compete while simultaneously collaborating to democratise AI development.

A Collaborative Necessity

Earlier this year, AWS and Hugging Face announced a strategic partnership aimed at offering LLMs and generative AI models on AWS's ML platform. Through this collaboration, the Hugging Face community now leverages AWS ML services and infrastructure, streamlining model training, fine-tuning, and deployment. The timing of these moves by AWS is crucial, given the shifting dynamics within the industry: Microsoft currently commands attention thanks to its exclusive partnership with OpenAI.

Although OpenAI's most powerful generative AI models are available through Microsoft Azure, access to them remains highly restricted. Developers keen on using them must undergo a stringent application process, providing detailed use cases for consideration. Even after gaining access, every solution that deploys Azure OpenAI models must pass a use-case review before being approved for production. This cumbersome process stands in stark contrast to what AWS offers, making AWS a more attractive choice for developers; its collaboration with Hugging Face simplifies the process further.

Given the heightened interest surrounding LLMs, AWS's partnership with Hugging Face was virtually inevitable. Cloud providers must intensify their efforts to deliver scalable infrastructure and platform services that give developers options suited to their specific needs.

AWS is making concerted efforts to attract developers, launching a free, self-paced course called "Amazon Bedrock—Getting Started" to introduce users to the platform's features and benefits. Additionally, its partnership with Hugging Face streamlines access to LLMs, making AWS a more accessible option for developers than Microsoft Azure, with its stringent access policies and restrictions.


Tasmia Ansari

Tasmia is a tech journalist at AIM, looking to bring a fresh perspective to emerging technologies and trends in data science, analytics, and artificial intelligence.