A few days ago, Sam Altman, co-founder of OpenAI, tweeted crediting Microsoft, particularly Azure, for providing the AI infrastructure that powers the GPT language models. Be it conversational AI like ChatGPT or text-to-image modelling like DALL-E, OpenAI has been able to launch its products seamlessly, thanks in large part to Microsoft.
The OpenAI–Microsoft partnership began in 2019, when Microsoft announced it would invest $1 billion in OpenAI's effort to build Artificial General Intelligence (AGI) with broad economic benefits. OpenAI said the partnership aimed to develop a hardware and software platform within Azure, and the deal also made Microsoft OpenAI's exclusive cloud provider.
Terms of the partnership
The partnership covered three key aspects: the two companies would jointly build Azure's AI supercomputing technologies, OpenAI would run its services on Microsoft Azure, and Microsoft would be OpenAI's preferred partner for commercialising its new AI technologies.
As a result, API access to any of OpenAI's products is only available through Microsoft. Take, for example, Shutterstock, which recently announced that it would integrate DALL-E into its platform. While Microsoft does not own Shutterstock, the two announced in 2020 that Microsoft Advertising would integrate the Shutterstock API into its advertising platform. Access to OpenAI's API therefore benefits both Microsoft and Shutterstock.
Microsoft also collaborated with OpenAI to build a supercomputer that ranks among the five most powerful in the world and is meant to be used exclusively by OpenAI. The supercomputer was built for training massive distributed AI models.
However, while OpenAI's products are limited solely to Microsoft, its models are publicly available and are already being used by many to create their own products. For example, almost all AI companies use the GPT-3 model for writing services, most recently Notion. Similarly, the GPT-3.5 model is already in use, with many integrating it into messaging platforms like WhatsApp and Telegram.
Read: ChatGPT is Now Available on WhatsApp
Microsoft’s AI Infrastructure
Microsoft Azure publicly detailed its AI infrastructure service, Singularity, in February 2022. The platform was built from scratch to carry AI workloads both inside and outside Microsoft. The leap from operating systems to AI infrastructure has proven to be a big step forward for Microsoft, as the platform lowers cost by driving high utilisation across deep learning workloads without impacting performance. To do this, Singularity uses a workload-aware scheduler that optimises performance by prioritising between different workloads. For example, "Singularity adapts to increasing load on an inference job, freeing up capacity by elastically scaling down or preempting training jobs".
Essentially, the platform allows hundreds of thousands of GPUs and AI accelerators to work together, treating them as a single cluster. The service lets data scientists and AI practitioners build, scale, experiment with, and iterate on their models on Microsoft's distributed infrastructure service.
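To make the scheduling idea concrete, here is a toy, purely illustrative sketch in Python of a workload-aware scheduler that shrinks or preempts training jobs to free GPUs for an inference job under rising load. The class and method names are hypothetical and do not reflect Singularity's actual implementation.

```python
# Toy sketch only: mimics the idea Singularity describes, i.e. reclaiming
# capacity from lower-priority training jobs to serve latency-sensitive
# inference traffic. Names here are hypothetical, not Microsoft APIs.
from dataclasses import dataclass


@dataclass
class Job:
    name: str
    kind: str           # "inference" or "training"
    gpus: int           # GPUs currently allocated
    min_gpus: int = 1   # floor below which the job is preempted instead of shrunk


class ToyScheduler:
    def __init__(self, total_gpus: int):
        self.total_gpus = total_gpus
        self.jobs: list[Job] = []

    def free_gpus(self) -> int:
        return self.total_gpus - sum(j.gpus for j in self.jobs)

    def scale_up_inference(self, job: Job, extra: int) -> None:
        """Grow an inference job, reclaiming capacity from training jobs first."""
        needed = extra - self.free_gpus()
        for victim in (j for j in self.jobs if j.kind == "training"):
            if needed <= 0:
                break
            reclaim = min(victim.gpus - victim.min_gpus, needed)
            victim.gpus -= reclaim          # elastically scale the training job down
            needed -= reclaim
            if needed > 0 and victim.gpus == victim.min_gpus:
                needed -= victim.gpus       # still short: preempt (checkpoint + evict)
                victim.gpus = 0
        if needed > 0:
            raise RuntimeError("not enough capacity even after preemption")
        job.gpus += extra


# Example: 16-GPU cluster, one training job, one inference job under rising load.
sched = ToyScheduler(total_gpus=16)
trainer = Job("gpt-finetune", "training", gpus=12, min_gpus=4)
server = Job("chat-serving", "inference", gpus=4)
sched.jobs += [trainer, server]
sched.scale_up_inference(server, extra=6)   # training shrinks from 12 to 6 GPUs
print(trainer.gpus, server.gpus)            # -> 6 10
```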
OpenAI powering Microsoft startups
OpenAI has enabled the launch of several AI startups built on Microsoft. Under the Azure OpenAI Service banner, Microsoft invites companies to leverage large-scale generative AI models with deep language and code understanding to create cutting-edge applications. Azure OpenAI differs from OpenAI in that the former gives customers built-in responsible AI and access to enterprise-grade Azure security, thereby detecting and mitigating harmful use of the API.
This cognitive service offered by Azure provides advanced models like OpenAI's GPT-3, Codex, and DALL-E along with Azure's security and enterprise capabilities. The combination of the two enables private networking, regional availability, and responsible AI content filtering. Currently, customers with an existing Microsoft partnership are eligible to use the service, subject to a thorough use-case review before applications are released for production use.
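As a rough illustration of what calling the service looks like, the sketch below uses the pre-1.0 `openai` Python SDK pointed at an Azure endpoint; the resource name, API version, and deployment name are placeholders, not values from the article.

```python
import os
import openai

# Point the OpenAI SDK at an Azure OpenAI resource instead of api.openai.com.
openai.api_type = "azure"
openai.api_base = "https://<your-resource>.openai.azure.com/"   # placeholder resource
openai.api_version = "2022-12-01"
openai.api_key = os.getenv("AZURE_OPENAI_API_KEY")

# On Azure, requests target a named deployment of a model (e.g. a GPT-3 variant),
# and traffic passes through Azure's content filtering and enterprise controls.
response = openai.Completion.create(
    engine="my-gpt3-deployment",   # placeholder deployment name
    prompt="Summarise the OpenAI and Microsoft partnership in one sentence.",
    max_tokens=60,
)
print(response["choices"][0]["text"].strip())
```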
Startups like VideoKen, mem.ai, and Pointers are a few among the many registered with Microsoft for Startups.
Further, Virta.ai, a platform for translating media texts, uses a single API integration with Azure Cognitive Services to offload all of its Natural Language Processing (NLP) tasks on a pay-as-you-use model. Likewise, Trelent, an AI application for source-code documentation, is powered by Microsoft's Azure OpenAI Service.
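For context, a single Cognitive Services endpoint can serve several NLP tasks through one client. The sketch below uses the `azure-ai-textanalytics` Python package with placeholder endpoint and key; it is an illustration of that pattern, not a description of Virta.ai's or Trelent's actual setup.

```python
# Illustrative sketch: one Azure Cognitive Services client handling several NLP
# tasks. Endpoint and key are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient

client = TextAnalyticsClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<your-key>"),
)

docs = ["Azure makes it easy to offload NLP tasks on a pay-as-you-use model."]

language = client.detect_language(docs)[0]
sentiment = client.analyze_sentiment(docs)[0]
key_phrases = client.extract_key_phrases(docs)[0]

print(language.primary_language.name)   # e.g. "English"
print(sentiment.sentiment)              # e.g. "positive"
print(key_phrases.key_phrases)          # e.g. ["NLP tasks", ...]
```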
OpenAI, always first in the game
Several players in the market work on large models, but OpenAI has always managed to be the first mover in certain respects. With models like GPT-3, Dall-E, and the recent phenomenon of ChatGPT, it has managed to be at the forefront of AI innovation.
The AI coding assistant Copilot, powered by OpenAI's Codex, has been hugely successful and has left almost all of its competitors behind, thanks to the immense volume of code data available to Microsoft through GitHub. OpenAI has thus been able to harness Microsoft's existing customer base and training data.
What will be interesting to see from here is how Microsoft gives Bing, its search engine, a second life with a chatbot like ChatGPT and the several other products in its pipeline. However, Google's dominance of the search-engine market will be difficult to challenge, especially since the next iteration of Google's chatbot LaMDA is already in the works and Google will likely look to integrate it into its search engine soon.