
Microsoft 💔 OpenAI 

The tech giant actually loves everyone who can bring new customers.



Despite rumours of OpenAI’s one-sided love for Microsoft, at Ignite 2023, Microsoft chief Satya Nadella didn’t mince his words, showering unlimited affection and heaping praise on OpenAI. He went on to say how grateful the company is to have OpenAI in its life, while showing a readiness to overhaul everything it has built so far, from rebranding its search engine to revamping its Azure cloud infrastructure, and much more.

Nadella has, on many occasions, expressed his feelings towards OpenAI. Even at DevDay, he said: “We love you guys… You guys have built something magical.”

The praise doesn’t end there. At Ignite, he reinforced it even further, saying: “As OpenAI innovates, we will deliver all of that as Azure AI. We are bringing the very latest of GPT-4, including GPT-4 Turbo and GPT-4 Turbo with Vision, to Azure OpenAI Service…”

He also said that Microsoft will bring fine-tuning to the Azure OpenAI Service, alongside helping customers bring their own data to create custom versions of GPT-4.
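For developers, that announcement translates to calling GPT-4 Turbo deployments through the same Azure OpenAI Service endpoints. Below is a minimal sketch using the openai Python SDK (v1.x); the endpoint, API version and deployment name are illustrative placeholders, not values confirmed at Ignite.

```python
# Minimal sketch: chat completion against a GPT-4 Turbo deployment on the
# Azure OpenAI Service using the openai Python SDK (v1.x).
# The endpoint, key and deployment name are placeholders for illustration.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-12-01-preview",  # assumed preview API version
)

response = client.chat.completions.create(
    model="gpt-4-turbo-deployment",  # hypothetical deployment name in your Azure resource
    messages=[
        {"role": "user", "content": "Summarise the Azure AI announcements from Ignite 2023."}
    ],
)
print(response.choices[0].message.content)
```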

Actually, Microsoft ❤️ Everyone 

All of this paints a rosy picture of Microsoft being deeply in love with OpenAI. But the reality is quite the opposite: the tech giant actually loves everyone who can bring it new customers. This includes the likes of Meta, Mistral AI and G42, as well as, when needed, other closed-door companies such as Cohere, LLM-enablers like LangChain, and, not to forget, investments in Inflection AI and Hugging Face.

“We are all-in on open source and want to bring the best selection of open source models to Azure and do so responsibly,” said Nadella. A day later, Cohere quietly announced that its flagship enterprise AI model, Command, would soon be available through the Azure AI Model Catalog and Marketplace.
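For readers wondering how those catalog models are consumed, the sketch below shows roughly how a foundation model is looked up from an Azure AI model registry with the azure-ai-ml SDK; the registry and model names are assumptions for illustration, since the listing details for Cohere’s Command were not spelled out in the announcement.

```python
# Rough sketch: referencing a foundation model from an Azure AI model registry
# using the azure-ai-ml SDK (v2). The registry and model names below are
# illustrative assumptions, not confirmed catalog identifiers.
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

# Connect to a public model registry (assumed name for the registry hosting Meta's models).
registry_client = MLClient(credential=DefaultAzureCredential(), registry_name="azureml-meta")

# Fetch the latest version of a catalog model; deploying it to a managed endpoint would follow.
model = registry_client.models.get(name="Llama-2-7b-chat", label="latest")
print(model.id, model.version)
```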

Microsoft’s interest is not limited to LLMs, either. At Ignite, it also announced the newest iteration of its Phi small language model (SLM) series, dubbed Phi-2.

All for the sake of Azure

In its latest earnings report, Microsoft posted a robust 29% jump in Azure revenue for the quarter, outpacing the growth of both AWS and Google Cloud. The tech giant is dedicated to making Azure the best cloud service provider and is willing to go to any extent to achieve this, even partnering with cloud competitor Oracle.

Oracle Cloud Infrastructure (OCI) is 18 months ahead of its competitors, including Azure, AWS and Google Cloud, in terms of infrastructure, Oracle said in a recent interview with AIM.

The company said that OCI is a second-generation cloud, which means that networks built on it are not shared between tenants. “The network serves as the bottleneck for the cloud. OCI was designed with a distinctly different network compared to other players,” said Christopher G Chelliah, senior vice president of technology and customer strategy, JAPAC, at Oracle.

Now that Microsoft is set to host numerous models on Azure, infrastructure changes are essential to support all its partners and customers. On the hardware side, it has forged strategic partnerships with AMD, Intel and NVIDIA to procure new GPUs with advanced capabilities.

Besides this, most of Microsoft’s recent investments point to improving Azure’s capabilities: it has invested in AI chip startup d-Matrix, alongside acquiring hollow core fiber startup Lumenisity.

Interestingly, the tech giant is building its very first custom in-house CPU series, the Microsoft Azure Cobalt CPU—an Arm-based processor tailored to run general-purpose compute workloads on the Microsoft Cloud. 

“Cobalt is the first CPU designed by us specifically for the Microsoft Cloud, and this 64-bit, 128-core Arm-based chip is the fastest among any cloud provider,” said Nadella. “It’s already powering parts of Microsoft Teams, Azure Communication Services, as well as Azure SQL as we speak,” he added.

Apart from the CPUs, Microsoft has also introduced an AI accelerator chip named Maia for running cloud AI workloads. This chip is manufactured on a 5-nanometer process and has 105 billion transistors, “making it one of the largest chips that can be made with current technology,” emphasised Nadella.

Nadella also announced Azure Boost’s general availability, offloading server virtualisation processes to purpose-built software and hardware for significant networking improvements and enhanced storage throughput.

Clearly, Microsoft’s true love is always going to be Azure.
