Generative AI has shaken up cloud infrastructure. “The shape of Azure has drastically changed and continues to change rapidly to support the models you’re building,” said Microsoft chief Satya Nadella while sharing the stage with Sam Altman at OpenAI’s first developer conference, DevDay.
He further mentioned that OpenAI has challenged Microsoft Azure to change its infrastructure to match OpenAI’s technology prowess. “The first thing that we have been doing in partnership with you is changing the system all the way from power to the DC (data center), to the rack, to the accelerators and to the network,” he added.
To adapt to the demands of generative AI, all major hyperscalers, including AWS and Google Cloud, are compelled to make changes to their infrastructure. For example, earlier this year, the Google Cloud Platform announced that they are working to integrate AI infrastructure more extensively into their overall fleets. Similarly, AWS stated that the company plans to deploy multiple AI-optimized server clusters over the next 12 months.
“I would argue that the changes they (AWS and Google Cloud) are making are an attempt to build a networking infrastructure like we have at Oracle,” said Christopher G. Chelliah, senior vice president, technology & customer strategy, JAPAC, in an exclusive interview with AIM.
However, Microsoft, in addition to updating its infrastructure, has recently also partnered with Oracle in a multiyear agreement to enhance AI services. It will now use both Oracle Cloud Infrastructure (OCI) AI and Microsoft Azure AI for daily Bing conversational searches.
Microsoft’s turn to Oracle suggests that Microsoft Azure may have certain shortcomings of its own.
What Sets Oracle Apart from Other Hyperscalers
Elaborating on how OCI is built differently from its counterparts, Chelliah said that OCI is a second-generation cloud, meaning that the networks OCI builds are not shared between tenants in the cloud.
“The network serves as the bottleneck for the cloud. OCI was designed with a distinctly different network compared to other players,” he said.
In contrast, when discussing AWS and Google Cloud, Chelliah noted a challenge faced by them. He said they currently host existing customers within existing tenancies, making it impractical for them to swiftly replace or upgrade their networks. “They cannot rip out those networks and change those networks overnight,” he added. Continuing, he confidently stated, “I believe our competitors are at least 18 months behind us at Oracle in this regard.”
Moreover, he mentioned that Oracle designed its network topology to be non-blocking, so that one tenant’s traffic does not contend with another’s. He explained this with an analogy: instead of road intersections, Oracle has built flyovers.
From Oracle’s perspective, he said, “We already have that network, and Oracle is a step ahead.” The next problem Oracle is trying to solve is data privacy.
“We’re helping customers do training, inferencing and RAG in isolation and privacy, so that you can now bring corporate-sensitive, private data and do fine-tuning, inferencing and RAG in this cloud without impacting any privacy issue,” he added.
Oracle ‘Feeds the Beast’
Running generative AI demands a combination of infrastructure and data.
Oracle is well-equipped in terms of infrastructure, as NVIDIA selected OCI as the first hyperscale cloud provider to offer NVIDIA DGX Cloud. “When NVIDIA thinks of cloud and data, they think of Oracle,” said Chelliah, adding that Oracle utilises MySQL HeatWave for real-time anomaly detection on NVIDIA clusters for its customers.
“The second biggest ingredient in AI is data; you need to ‘feed the beast’ (the AI machine),” he said. “AI is not just GPUs; GPUs are important. But what differentiates you is data,” highlighted Chelliah.
Oracle holds a vast trove of data, trusted by both enterprises and governments. Recently, Oracle has shifted its strategy, extending its data platform to other cloud service providers and reaching customers where they are.
Moreover, Larry Ellison, CTO of Oracle, has taken this commitment to openness to the next level. “Cloud should be open,” he said, speaking of the partnership with Microsoft and of eventually partnering with other cloud providers, namely AWS and Google. “Today I can run my data platform on Amazon, with MySQL HeatWave available on Amazon,” said Chelliah.
“We’re playing this game where we tell customers to feed the beast, as I keep calling it, to feed the AI machine you need data,” he concluded.