Why Claude 3 Is Bad News for Microsoft Azure

And a wake-up call for OpenAI

AWS, the cloud market leader, has faced stiff competition from Microsoft Azure in recent months. The same goes for Google Cloud, which has raised concerns about Microsoft’s monopolistic cloud computing practices.

The tussle in the cloud space is clearly visible in recent cloud earnings, buoyed by generative AI. In the fourth quarter of 2023, Microsoft’s Intelligent Cloud division achieved $25.9 billion in sales, while AWS and Google Cloud recorded $24.2 billion and $9.2 billion, respectively.

Now, AWS and Google Cloud’s generative AI pal and OpenAI’s rival, Anthropic, might have brought them back into the game. The company recently released the Claude 3 model family, which comprises Claude 3 Haiku, Claude 3 Sonnet, and Claude 3 Opus. 

Claude 3 Opus, the strongest of the three, outperforms GPT-4 on common benchmarks like MMLU and HumanEval. It has been almost a year since GPT-4 was released, and this is the first time a model has surpassed it on such benchmarks. Claude 3 Sonnet, built specifically for enterprise workloads, is already available on Amazon Bedrock and in private preview on Google Cloud’s Vertex AI Model Garden, with Opus and Haiku coming soon to both.
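For context, a Claude 3 Sonnet call on Amazon Bedrock is an ordinary API request. Below is a minimal sketch using boto3; the AWS region and the model identifier are assumptions and should be checked against the model IDs enabled in your Bedrock console.

```python
# Minimal sketch: invoking Claude 3 Sonnet through Amazon Bedrock with boto3.
import json
import boto3

# The region and model ID below are assumptions; verify them in your Bedrock console.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 512,
    "messages": [
        {"role": "user", "content": [{"type": "text", "text": "Summarise Q4 2023 cloud revenue trends."}]},
    ],
}

response = client.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # assumed identifier
    body=json.dumps(body),
)

result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```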

Claude 3 also boasts vision capabilities, allowing it to process images and generate text outputs. It analyses and comprehends charts, graphs, technical diagrams, reports, and other visual assets.
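In practice, an image goes in as just another content block in the request. Here is a minimal sketch using Anthropic’s Python SDK; the model name and the chart file are placeholders for illustration.

```python
# Minimal sketch: asking Claude 3 to read a chart via Anthropic's Messages API.
import base64
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Placeholder image file; any PNG or JPEG chart or diagram works the same way.
with open("quarterly_revenue_chart.png", "rb") as f:
    image_b64 = base64.standard_b64encode(f.read()).decode("utf-8")

message = client.messages.create(
    model="claude-3-opus-20240229",  # assumed model name
    max_tokens=512,
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "image",
                 "source": {"type": "base64", "media_type": "image/png", "data": image_b64}},
                {"type": "text", "text": "What trend does this chart show?"},
            ],
        }
    ],
)
print(message.content[0].text)
```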

With a context window of 200K tokens, Claude 3 is well-suited for enterprise applications dealing with vast amounts of corporate data. Its capabilities span analysis, forecasting, content creation, code generation, and multilingual conversation, with proficiency in Spanish, Japanese, and French.

Ironically, many believe Claude 3 has reached AGI because of its ability to delight users. That also explains why the customer-obsessed Amazon invested $4 billion in the AI startup, alongside Google’s $2 billion. It is interesting to see the two cloud majors joining hands to back OpenAI’s rivals and give Microsoft Azure tough competition.

While Amazon seems to have strategically integrated powerful AI models into its Bedrock umbrella, catering to enterprise customers, Microsoft and Google are upping their small language model game with Phi-2 and Gemma, fueling both startups and developers and taking the fight to Meta’s Llama 2.

Amazon, meanwhile, has no small language model of its own; one is likely to be released in the coming months.

Hey, Who’s Dancing with Whom Now? 

The cloud war is turning into a dance battle of sorts. While Microsoft made Google dance at the start, Google and Amazon are now making Microsoft dance. But the latter is not going to give up so easily. 

At Microsoft Ignite 2023, the company announced models as a service (MaaS), an approach similar to Amazon Bedrock, an LLM marketplace (an analogy the company dislikes) that offers enterprise customers foundation models from Anthropic, AI21 Labs, Stability AI, and Amazon via an API.

More recently, Microsoft invested $16 million in Mistral AI and partnered with the startup to host its latest model, Mistral Large, on Azure through MaaS and the Azure Machine Learning model catalogue.
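How an enterprise would call that deployment is straightforward in principle: MaaS exposes Mistral Large behind a pay-as-you-go HTTPS endpoint. The sketch below is an assumption-laden illustration; the endpoint URL, route, and payload schema are modelled on an OpenAI-style chat completions API and must be replaced with the values shown on your Azure deployment page.

```python
# Hypothetical sketch: calling a Mistral Large serverless (MaaS) deployment on Azure.
# The endpoint, route, and payload schema are assumptions; use your deployment's actual values.
import os
import requests

endpoint = os.environ["AZURE_MISTRAL_ENDPOINT"]  # copied from the Azure deployment page
api_key = os.environ["AZURE_MISTRAL_KEY"]

payload = {
    "messages": [{"role": "user", "content": "Summarise the cloud AI race in one paragraph."}],
    "max_tokens": 256,
}

resp = requests.post(
    f"{endpoint}/v1/chat/completions",  # assumed route
    headers={"Authorization": f"Bearer {api_key}", "Content-Type": "application/json"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```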

Before the release of Claude 3, Mistral Large ranked as the world’s second-best model generally available through an API, surpassing Google’s Gemini Pro and Anthropic’s Claude 2.1 and trailing only GPT-4.

Apart from Mistral, Azure hosts all OpenAI models, including GPT-4, GPT-4 Turbo, and GPT-3.5. Furthermore, Azure provides pre-trained models such as Meta’s Llama 2 series in 7B, 13B, and 70B parameter versions, along with its in-house Phi-2.
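Calling those hosted OpenAI models goes through the Azure OpenAI flavour of the openai SDK rather than OpenAI’s own endpoint. A minimal sketch follows; the endpoint, API version, and deployment name are assumptions tied to whatever you configure in your Azure OpenAI resource.

```python
# Minimal sketch: querying a GPT-4 deployment hosted on Azure OpenAI.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # assumed API version
)

response = client.chat.completions.create(
    model="gpt-4",  # this is the *deployment* name created in Azure, assumed here
    messages=[{"role": "user", "content": "Compare Bedrock and Azure OpenAI in two sentences."}],
)
print(response.choices[0].message.content)
```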

Google Grooves to GenAI

Like AWS, Google is putting a lot of effort into making Vertex AI a success. With Gemini Ultra 1.0 and Gemini 1.5, along with the open models Gemma and Llama 2, Google has sent a clear message that it means business.

With Claude 3 and Gemini 1.5 in its arsenal, Google Cloud appears to be an excellent choice for developers. 
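Gemini is reached through the Vertex AI SDK in much the same way. The sketch below assumes the google-cloud-aiplatform package; the project ID, region, and model name are placeholders.

```python
# Minimal sketch: calling Gemini on Vertex AI (requires google-cloud-aiplatform).
import vertexai
from vertexai.generative_models import GenerativeModel

# Placeholder project and region; use your own GCP project settings.
vertexai.init(project="my-gcp-project", location="us-central1")

model = GenerativeModel("gemini-1.0-pro")  # assumed model name
response = model.generate_content("Summarise the current state of the cloud AI race.")
print(response.text)
```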

Gemini 1.5 offers a context window of 1 million tokens, the largest to date, and Anthropic says Claude 3 can accept inputs exceeding 1 million tokens for select customers. OpenAI has not matched this yet; the largest context window it offers is 128K with GPT-4 Turbo.

Now, the ball is in OpenAI’s court, and everybody is impatiently waiting for the release of GPT-5. But OpenAI seems to have other priorities and appears distracted by controversies, like the recent one in which Elon Musk sued the company, accusing it of deviating from its original mission and becoming a de facto subsidiary of Microsoft. Microsoft badly needs OpenAI’s GPT-5 to win the cloud war.

Siddharth Jindal

Siddharth is a media graduate who loves to explore tech through journalism and to put forward ideas worth pondering in the era of artificial intelligence.