Redmond tech giant Microsoft has taken the AI battle to Google’s home turf with the launch of Project Brainwave, its deep learning acceleration platform. The platform lets AI-led services such as natural language processing and computer vision be deployed in the cloud. Brainwave is a definitive step in converting Azure into an AI cloud, enabling developers to run neural network workloads that demand serious processing power.
Is Microsoft Azure going the Google Cloud ML way?
So, why the rush to own cloud AI workloads? AI has become the must-have capability for any business in analytics or cloud technologies. The big picture here is that Brainwave has been launched as a direct competitor to Google’s TPUs, which bring machine learning capabilities to developers. Google is banking heavily on Google Cloud ML, which provides machine learning capabilities replete with pre-trained models. Since neural-net-based workloads demand heavy compute, Google Cloud ML helps developers crank out tailored models that scale. With serverless computing becoming the norm, Microsoft too has decided to double down on the AI agenda by bringing a real-time AI system to its Azure users.
Doug Burger, Distinguished Engineer at Microsoft, writes in a blog post that Project Brainwave is designed for real-time AI and represents a major leap in performance and flexibility for cloud-based serving of deep learning models. And here’s the real reason: real-time AI has become increasingly important as cloud infrastructures process live data streams, whether search queries, videos, sensor streams, or interactions with users. What’s more, Burger writes that the Project Brainwave software stack is designed to support a wide range of popular deep learning frameworks, and it already supports Microsoft Cognitive Toolkit and Google’s TensorFlow.
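To make concrete what “serving a deep learning model” involves, here is a minimal, framework-free sketch of a single dense-layer inference step — a toy stand-in for the kind of computation a platform like Brainwave accelerates. The weights, sizes and function name are purely illustrative and not drawn from the platform itself:

```python
# Toy sketch of the inference step a real-time AI platform serves:
# one dense layer (matrix-vector product plus bias) followed by ReLU.
# Real served models chain millions of such parameters and run on
# specialised hardware; the arithmetic per request is the same idea.

def dense_relu(weights, bias, x):
    """Compute ReLU(W @ x + b) for one input vector."""
    out = []
    for row, b in zip(weights, bias):
        acc = sum(w * xi for w, xi in zip(row, x)) + b
        out.append(acc if acc > 0 else 0.0)
    return out

# One "request" flowing through the served model.
W = [[1.0, -1.0], [0.5, 0.5]]
b = [0.0, -2.0]
x = [3.0, 1.0]
print(dense_relu(W, b, x))  # -> [2.0, 0.0]
```

In a real-time serving system, the latency of exactly this forward pass, repeated layer after layer for every incoming query, is what custom hardware is built to shrink.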
Battle of custom built chips for AI tasks
Meanwhile, Google’s Cloud ML Engine only supports Google’s TPUs. The tech giant recently made another announcement, centred around security in Google Cloud Platform. According to a new blog update, the search enterprise has beefed up the architecture at multiple layers with components that include Google-designed hardware, a Google-controlled firmware stack, Google-curated OS images, a Google-hardened hypervisor, and data center physical security and services. With it came another big announcement around its custom chip Titan, which “establishes the hardware root of trust for cryptographic operations in our data centers.” Analytics India Magazine had earlier written about tech companies turning chipmakers to boost their hardware platforms and achieve high performance that also delivers on scale and accuracy. Industry experts believe there is an intense game of one-upmanship going on between tech biggies to outdo each other in the custom AI chip space.
Microsoft upped its AI game with FPGAs (Field Programmable Gate Arrays), the chips that form the core of the Azure AI cloud and, when integrated with Microsoft’s framework, deliver on speed, performance and latency. FPGAs will enable the developer community to compile neural network code, distribute it across the FPGA fabric and run it at high speed. And while many experts believe that Microsoft joined the AI party way too late, the Redmond company has been working in this space for a decade. Microsoft CEO Satya Nadella too commented on the shift towards AI and emphasized how the cloud has to be equipped with AI capabilities.
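One reason FPGAs run neural network code fast is that they can work in narrow-precision arithmetic, packing far more multipliers into the fabric than full 32-bit floats allow. The sketch below shows a generic symmetric 8-bit integer quantisation scheme as an illustration of the idea — it is not Brainwave’s actual (proprietary) narrow floating-point format, and the function names are our own:

```python
# Illustrative sketch of narrow-precision quantisation, the kind of
# trick FPGA-based accelerators use to trade a little accuracy for a
# lot of throughput. Generic symmetric int8 scheme, for illustration.

def quantize_int8(values):
    """Map floats to integers in [-127, 127] with a shared scale factor."""
    scale = max(abs(v) for v in values) / 127.0 or 1.0
    return [round(v / scale) for v in values], scale

def dequantize(quantized, scale):
    """Recover approximate floats from the quantised integers."""
    return [q * scale for q in quantized]

weights = [0.75, -1.5, 0.1, 0.0]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)  # close to the original weights
```

The dequantised values differ from the originals by at most half a quantisation step, and in practice neural networks tolerate that error well — which is why narrow precision is such a good fit for inference hardware.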
“Finally, the cloud is all about powering the next generation of AI applications and this drives a new infrastructure. So, when we look at the next gen applications being built, it is machine learning and artificial intelligence. So, we are working on making Azure as the first AI supercomputer,” Nadella reportedly said.
AI has conquered cloud
With technology shifting towards artificial intelligence, cloud AI emerged as the next big competitive arena tech companies fought over to improve scale and performance. It was either this or risk falling behind. Artificial intelligence has definitely taken over the cloud, and it’s not just big companies working to make the cloud smarter; smaller companies too have latched on to the trend. Recently, cloud storage company Box announced that it was adding Google’s computer-vision technology to its platform.
So, where does all this leave AWS, the undisputed leader in cloud computing? Experts believe AWS will also roll out an AI-optimized hardware platform this year to support the bulk of AI workloads. AWS already boasts some of the world’s biggest customers, with the promise of more AI advances to come. With more and more businesses putting their data in the cloud, cloud computing providers are rushing to add AI capabilities to their infrastructure. And the democratization of cloud AI means small companies that lack the computing power and infrastructure muscle will be able to adopt these capabilities faster. So far, the core capabilities of AI, such as computer vision, speech and text recognition, voice recognition and natural language processing, were the sole preserve of major companies. By opening up APIs and cloud AI to a wider set of players, smaller companies can bring AI-led solutions to market faster.
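What “opening up APIs” looks like in practice: a small company sends an image and a list of requested analyses to a provider’s endpoint and gets structured results back. The sketch below assembles such a request body. The field names and feature labels here are hypothetical placeholders — real providers each define their own schema — but the common pattern, a base64-encoded image plus a list of requested features, is what it illustrates:

```python
# Sketch of assembling a request for a hypothetical cloud vision API.
# Field names ("image", "features") and feature labels are placeholders,
# not any provider's real schema; the base64-image-plus-feature-list
# pattern is the common shape across cloud vision services.

import base64
import json

def build_vision_request(image_bytes, features):
    """Build a JSON request body for a hypothetical image-analysis endpoint."""
    return json.dumps({
        "image": base64.b64encode(image_bytes).decode("ascii"),
        "features": features,
    })

body = build_vision_request(b"\x89PNG...", ["LABEL_DETECTION", "TEXT_DETECTION"])
```

The heavy lifting — the trained models and the hardware they run on — stays on the provider’s side; the customer only ships bytes and reads back labels.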
Richa Bhatia is a seasoned journalist with six years of experience in reportage and news coverage and has had stints at Times of India and The Indian Express. She is an avid reader, mum to a feisty two-year-old and loves writing about the next-gen technology that is shaping our world.