Generative AI has revolutionised the technology landscape, opening up a plethora of possibilities for enterprises across various domains. Today, organisations are actively exploring and uncovering potential use cases to harness the power of generative AI. AWS, which is the largest cloud service provider in the world, is also investing heavily in generative AI.
“It has always been a significant area of focus for AWS. If you look at Amazon’s legacy for nearly 20 years, we have been building several experiences for our customers using machine learning or different AI techniques,” Anupam Mishra, Director, Solution Architecture, AWS India and South Asia, told AIM at the recently held AWS Summit in Mumbai.
He expects generative AI will be an area where AWS keeps investing more and more. The cloud giant’s approach to investing in new technologies is driven by customer feedback. “Over 90% of the features we have launched are a direct result of listening to our customers’ needs. We consistently prioritise understanding and fulfilling their requirements, continuously building and delivering solutions that align with their expectations.”
AWS Generative AI strategy
AWS has hundreds of thousands of customers in India and millions of active customers globally. Mishra revealed that most of AWS’ customers are exploring the use of generative AI.
AWS’ generative AI strategy is to empower customers with the capabilities and resources to build customised generative AI solutions that cater to their specific needs and requirements. Mishra, who leads AWS India’s technology team of Solution Architects, said AWS’ strategy is to bring AI to the hands of every developer. “We are focused on democratising AI as much as possible. Our generative AI strategy is divided into three parts.”
First, AWS focuses on providing cost-effective hardware solutions that enable high-performance generative AI tasks. “For example, in the hardware layer we have Inferentia, an instance designed for efficient inference, and Trainium, a chip optimised for training models.
“These hardware solutions aim to significantly improve both training time and cost efficiency. For instance, Inferentia demonstrates a 40% improvement in price performance compared to other EC2 instances. Similarly, we achieve a significant 50% improvement in performance per watt, a crucial factor for promoting sustainability,” he said.
Secondly, AWS wants to make it effortless for enterprises to leverage large language models and tailor them to deliver the desired experiences through fine-tuning. “With Bedrock, we provide a seamless serverless experience that minimises the need for extensive training time and reduces costs. By leveraging existing foundation models (LLMs), enterprises can save valuable time and resources,” Mishra said.
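To make the “serverless experience” concrete, the sketch below shows roughly what invoking a foundation model through Bedrock looks like from a developer’s seat. This is an illustrative sketch, not AWS’s official example: the model ID and request schema shown follow the Amazon Titan text format, but each model provider on Bedrock uses its own body schema, and a live call requires AWS credentials and model access.

```python
import json


def build_bedrock_request(prompt: str, max_tokens: int = 200) -> str:
    """Build a JSON request body in the Amazon Titan text format.

    The body schema varies by foundation model provider on Bedrock;
    this one is illustrative.
    """
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {"maxTokenCount": max_tokens},
    })


# A live invocation would go through the boto3 bedrock-runtime client
# (requires AWS credentials and access to the chosen model):
#
# import boto3
# client = boto3.client("bedrock-runtime", region_name="us-east-1")
# response = client.invoke_model(
#     modelId="amazon.titan-text-express-v1",  # illustrative model ID
#     body=build_bedrock_request("Summarise this support ticket: ..."),
# )
# print(json.loads(response["body"].read()))
```

Because the model is hosted and managed by AWS, the application never provisions GPUs or loads model weights; it only sends a request body and reads a response, which is the cost and time saving Mishra refers to.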
“Lastly, the third layer after Bedrock is how we offer API-based experiences to our customers. One of the products we have created is called CodeWhisperer, which allows you as a developer to automatically generate code by writing comments.”
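The comment-driven workflow Mishra describes looks roughly like the snippet below. The suggestion shown is illustrative of the kind of code an assistant like CodeWhisperer might propose in an IDE; it is not captured output from the tool itself.

```python
# In an IDE with CodeWhisperer enabled, the developer types a natural-language
# comment, and the assistant suggests an implementation inline. For example,
# a comment like the following:

# function to check whether a string is a palindrome, ignoring case
def is_palindrome(s: str) -> bool:
    # An illustrative suggestion the assistant might generate:
    s = s.lower()
    return s == s[::-1]
```

The developer then accepts, edits, or rejects the suggestion, rather than writing the boilerplate by hand.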
A Generative AI ecosystem
While GPT-4 stands as one of the most advanced large language models to date, enterprises are actively exploring use cases for a variety of models developed by different AI labs. For instance, Anthropic introduced Claude, which has garnered attention and interest among organisations. In addition, a multitude of open-source LLMs have emerged in recent months, expanding the options available to developers and researchers.
AWS is actively exploring ways to enable customers to leverage these models, allowing them to provide unique and differentiated experiences to their own customers, all while minimising the effort required.
“The aim is to establish an ecosystem where foundational models developed by AWS partners, such as Anthropic, Stability AI, and Hugging Face, can be leveraged by our customers to bring the value to the market,” Mishra said.
Furthermore, earlier this year, AWS launched a set of foundation models called Titan. “For example, the first model is an LLM specifically designed for tasks such as text generation and summarisation, among other things, while the second model improves search and personalisation,” Mishra added.
AWS in India
For AWS, India is one of its key markets. Recently, the cloud giant announced an investment of USD 12.7 billion (INR 1,05,600 crore) into cloud infrastructure in India by 2030. “India is a very important market for us. In fact, it is one of the fastest growing areas for AWS. So far we have already invested USD 3.7 billion. Further, we also expect about 1,31,700 full-time jobs to be created because of the investment,” Mishra said.
So what makes India a lucrative market for AWS? Besides being home to a growing startup ecosystem, India is one of the biggest consumer markets, with numerous companies experiencing rapid growth and expansion within the country. “However, we are seeing that a lot of companies are going global from India, where they build the product in India and then serve the rest of the world. Freshworks is a great example, built on AWS, and now they’re serving the whole world from India.”
The large number of SMBs in India also makes it an important market for AWS. India is the largest SMB market in the world, with nearly 75 million players in the space, and AWS sees huge potential in the segment. “Our chairman Jeff Bezos was in India a couple of years back and he made a pledge of digitalising 10 million SMBs in India by 2025. We are committed to delivering on that pledge, helping SMBs in different ways and letting them ride this digital wave,” Mishra said.
Citing the example of Havmor, an ice cream manufacturer, Mishra said the company has leveraged AWS to migrate its SAP HANA workloads, including its fulfilment portal and in-store operations. “By relying on AWS, they can offload the heavy lifting involved in infrastructure management and tool development, allowing them to focus on their core competency of making ice cream. This principle extends to numerous other companies who choose to let AWS handle the infrastructure complexities while they concentrate on excelling in their respective industries,” Mishra concluded.