Microsoft 💔 OpenAI 

The tech giant actually loves everyone who can bring new customers.

Despite rumours of OpenAI’s one-sided love for Microsoft, at Ignite 2023, Microsoft chief Satya Nadella didn’t mince his words, showering affection and heaping praise on OpenAI. He went on to say how grateful the company is to have OpenAI in its life, while showcasing its readiness to rework everything it has built so far, including rebranding its search engine, revamping its Azure cloud infrastructure, and much more.

Nadella has, on many occasions, expressed his feelings towards OpenAI. Even at DevDay, he said: “We love you guys… You guys have built something magical.”

The praise doesn’t end there. At Ignite, he reinforced it further, saying: “As OpenAI innovates, we will deliver all of that as Azure AI. We are bringing the very latest of GPT-4, including GPT-4 Turbo and GPT-4 Turbo with Vision, to Azure OpenAI Service…”


He also said that Microsoft will support fine-tuning through Azure OpenAI Service, alongside helping customers bring their own data to create custom versions of GPT-4.
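For readers wondering what that access looks like in practice: Azure OpenAI Service is reachable through the same `openai` Python SDK as OpenAI’s own API, except requests are routed by an Azure deployment name rather than a model ID. The sketch below uses hypothetical endpoint, key, and deployment values, so treat it as an illustration of the request shape rather than a recipe:

```python
def build_request(prompt: str, deployment: str = "gpt-4-turbo") -> dict:
    """Assemble a chat-completion request body for Azure OpenAI.

    `deployment` is the name chosen when deploying the model on Azure
    (a hypothetical name here), not an OpenAI model ID.
    """
    return {
        "model": deployment,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
    }


def ask(prompt: str, endpoint: str, api_key: str) -> str:
    """Send the request via the openai SDK's Azure client (needs credentials)."""
    from openai import AzureOpenAI  # pip install openai>=1.0

    client = AzureOpenAI(
        azure_endpoint=endpoint,  # e.g. https://<your-resource>.openai.azure.com
        api_key=api_key,
        api_version="2023-12-01-preview",
    )
    response = client.chat.completions.create(**build_request(prompt))
    return response.choices[0].message.content


# Live call (requires a real Azure OpenAI resource and key):
# print(ask("What is GPT-4 Turbo?", "https://example.openai.azure.com", "KEY"))
```

The point of the split is that `build_request` is plain data: swapping between OpenAI and Azure OpenAI only changes the client construction, not the request body.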

Actually, Microsoft ❤️ Everyone 

All of this paints a rosy picture of Microsoft being hopelessly in love with OpenAI. But the reality is quite different: the tech giant actually loves everyone who can bring it new customers. This includes the likes of Meta, Mistral AI, and G42 among open-source players, as well as, in more desperate cases, closed-source companies like Cohere, LLM-enablers like LangChain, and, not to forget, investments in Inflection AI and Hugging Face.

“We are all-in on open source and want to bring the best selection of open source models to Azure and do so responsibly,” said Nadella. A day later, Cohere quietly announced that its flagship enterprise AI model, Command, would soon be available through the Azure AI Model Catalog and Marketplace.

Microsoft’s ambitions are not limited to LLMs. At Ignite, it also announced the newest iteration of its Phi series of small language models (SLMs), called Phi-2.

All for the sake of Azure

In its latest earnings report, Microsoft posted a robust 29% surge in Azure revenue for the quarter, outpacing the growth of both AWS and Google Cloud. The tech giant is dedicated to making Azure the best cloud service provider and is willing to go to any length to achieve this, even partnering with its cloud competitor Oracle.

Oracle Cloud Infrastructure (OCI) is 18 months ahead of its competitors, including Azure, AWS, and Google Cloud, in terms of infrastructure, Oracle said in a recent interview with AIM.

The company said that OCI is a second-generation cloud, meaning that networks built on it are not shared between tenants. “The network serves as the bottleneck for the cloud. OCI was designed with a distinctly different network compared to other players,” said Christopher G Chelliah, senior vice president of technology and customer strategy, JAPAC, at Oracle.

Now that Microsoft is set to host numerous models on Azure, infrastructure changes are essential to support all its partners and customers. On the hardware side, it has forged strategic partnerships with AMD, Intel, and NVIDIA to acquire the latest GPUs with advanced capabilities.

Besides this, most of Microsoft’s recent investments point to improving Azure’s capabilities: it has invested in AI chip startup d-Matrix, alongside acquiring hollow-core fibre startup Lumenisity.

Interestingly, the tech giant is building its very first custom in-house CPU series, the Microsoft Azure Cobalt CPU—an Arm-based processor tailored to run general-purpose compute workloads on the Microsoft Cloud. 

“Cobalt is the first CPU designed by us specifically for the Microsoft Cloud, and this 64-bit, 128-core Arm-based chip is the fastest among any cloud provider,” said Nadella. “It’s already powering parts of Microsoft Teams, Azure Communication Services, as well as Azure SQL as we speak,” he added.

Apart from the CPUs, Microsoft has also introduced an AI accelerator chip named Maia for running cloud AI workloads. This chip is manufactured on a 5-nanometer process and has 105 billion transistors, “making it one of the largest chips that can be made with current technology,” emphasised Nadella.

Nadella also announced the general availability of Azure Boost, which offloads server virtualisation processes to purpose-built software and hardware, delivering significant networking improvements and enhanced storage throughput.

Clearly, Microsoft’s true love is always going to be Azure.

Siddharth Jindal
Siddharth is a media graduate who loves to explore tech through journalism and putting forward ideas worth pondering about in the era of artificial intelligence.
