[Exclusive] AWS’ Generative AI Play for Bedrock

Olivier Klein, Chief Technologist, AWS, told AIM that the company aims to democratise generative AI and FMs, serving diverse purposes through multiple specialised models for all customer types and business functions.

When the likes of Google, Microsoft, and Meta went all-in on building their own large language models such as PaLM 2, GPT-4, and Llama 2, Amazon opted for a different route. Instead of joining the same race, Amazon’s cloud business, Amazon Web Services (AWS), chose to leverage its roughly 34% share of the cloud services market, a decision that led to the birth of Amazon Bedrock.

Launched four months ago, Amazon Bedrock is an API platform for building generative AI-powered applications with foundation models (FMs), including AWS’ own Amazon Titan family and models from top-tier AI players such as Anthropic, Stability AI, and Cohere.
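Bedrock exposes these models behind a single managed API, so builders can call different FMs without standing up their own model-serving infrastructure. As a rough, hedged illustration of what that looks like, the Python sketch below calls a Bedrock-hosted Anthropic model with the AWS SDK (boto3); the region, model ID, and request shape are assumptions and depend on which models an account has been granted access to.

```python
# Hedged sketch: calling a foundation model through Amazon Bedrock's runtime API.
# Assumes boto3 is installed, credentials are configured, and the account has
# access to an Anthropic Claude model in the chosen region.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Claude-style text-completion request; other providers expect different bodies.
body = json.dumps({
    "prompt": "\n\nHuman: Summarise why managed foundation models help developers.\n\nAssistant:",
    "max_tokens_to_sample": 300,
})

response = bedrock.invoke_model(
    modelId="anthropic.claude-v2",  # assumed model ID; use any FM you have access to
    body=body,
    contentType="application/json",
    accept="application/json",
)

print(json.loads(response["body"].read())["completion"])
```

The same invoke_model call pattern works for other providers on Bedrock; only the model ID and the JSON request body change.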

“We are focused on democratising access to generative AI and FMs for all customer types and business functions. Instead of concentrating on just one FM, we envision multiple specialised models serving various purposes,” said Olivier Klein, Chief Technologist, APAC, AWS, in an exclusive interview with AIM.

Klein added that AWS is currently in the early phases of exploring the potential of generative AI and highlighted how services like Amazon Bedrock are being used to make generative AI more accessible to a broader audience. 

Bridgewater Associates, Coda, Lonely Planet, Ryanair, Showpad, Travelers, and many others are among the companies actively using Amazon Bedrock to create generative AI applications. For instance, Coda AI is a workplace assistant designed for enterprise use cases, with thousands of teams depending on it to complete tasks and advance their work.

Customer Obsession Continues 

Amazon’s customer obsession shapes its generative AI strategy as well. “When it comes to generative AI, your data is your differentiator. That’s how you can differentiate your AI from everyone else’s. Having the right data strategy in place is a common challenge I discuss with customers, which is why I advise our customers to leverage a data lake for many applications,” said Klein.

According to Klein, Amazon’s customer-first philosophy remains at the heart of its approach to AI and ML.

Furthermore, he added that AWS covers the entire pipeline, from secure storage in Amazon Simple Storage Service (S3) all the way through to serving AI responses with Amazon Bedrock. Customers already using Amazon S3 for data lakes can move faster towards differentiated generative AI because they already have their own data in place.

AWS Invests in Its Own FM with Amazon Titan

Back in April, Amazon’s CEO, Andy Jassy, announced that the company is developing a “generalised and capable” LLM to enhance Alexa, but the company has made no further announcements since then. Additionally, AWS has built Amazon Titan, a pair of LLMs currently in preview. The first is a generative LLM for tasks such as summarisation, text generation (for example, creating a blog post), classification, open-ended Q&A, and information extraction. The second is an embeddings LLM that translates text inputs (words, phrases, or possibly larger units of text) into numerical representations (known as embeddings) that capture the semantic meaning of the text.
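To make the embeddings model concrete: it maps a piece of text to a vector of numbers, which applications can then compare to find semantically similar text for search or retrieval. The sketch below, again hedged, shows how such a call might look through Bedrock; the Titan model ID and response field names are assumptions based on the preview and may differ by account and region.

```python
# Hedged sketch: generating a text embedding with an Amazon Titan embeddings
# model via Bedrock. Model ID and response fields are assumptions from the
# preview and may differ by account and region.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="amazon.titan-embed-text-v1",  # assumed Titan embeddings model ID
    body=json.dumps({"inputText": "Quarterly revenue grew 12% year over year."}),
    contentType="application/json",
    accept="application/json",
)

embedding = json.loads(response["body"].read())["embedding"]
print(len(embedding), embedding[:5])  # vector length and a few leading values
```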

For now, AWS believes that in the evolving ML landscape, different models serve distinct purposes, and no single model can cover all bases. For example, AI21’s Jurassic-1 may suit report summarisation at financial firms, while Cohere’s Generate (Medium) model may be better for autocomplete suggestions in online retail. Customers choose between these models based on their unique needs.

Additionally, the company is helping customers build generative AI apps that provide real-time answers from private knowledge sources and handle a range of tasks. For example, in customer service, the AI securely accesses company data, converts it into a machine-readable format, and uses the relevant information to perform tasks such as exchanging a product, which in turn enhances developer productivity.

AWS is also exploring generative AI’s impact on industries like healthcare. For instance, Amazon Bedrock powers HealthScribe, which streamlines the integration of AI into healthcare apps and reduces the complexity of tasks like detailed clinical documentation for physicians, without requiring them to manage ML infrastructure or train specialised models.

Riding the Gen AI Wave

AWS has launched the AWS Generative AI Innovation Center, investing $100 million in creating and deploying generative AI solutions. 

“We are investing $100 million in the program, which will connect AWS AI and ML experts with customers around the globe to help them envision, design, and launch new generative AI products, services, and processes,” Klein added. 

For now, the AWS Generative AI Innovation Center will focus on collaborating closely with global corporate clients, with a significant emphasis on sectors such as financial services, healthcare, life sciences, media, entertainment, automotive, manufacturing, energy, utilities, and telecom.

For instance, enterprises in the healthcare and life sciences domain will be able to expedite drug research and discovery. Manufacturing enterprises can take on projects aimed at redefining industrial design and processes. Similarly, financial services firms can explore ways to provide customers with personalised information and advisory services, all while encouraging innovation and development.

“Twilio, Ryanair, Lonely Planet, and Highspot are among hundreds of companies we’re already working with – and we look forward to welcoming more from across the world, including India,” he added.

Taking the Responsible Approach 

As per a report by Business Insider, Amazon is reportedly working on a new internal security initiative dubbed “Maverick”, which focuses on building new tools and fostering collaboration across Amazon to address the security risks of generative AI and large language models. Led by former Uber CISO John Flynn, Maverick seeks to consolidate risk information and create security guidance and tools for generative AI security testing.

Meanwhile, Amazon was one of the first companies to caution its employees against placing confidential information in OpenAI’s ChatGPT. However, as per the report, employees have been using ChatGPT for research and practical problem-solving, including interview questions, code writing, and training documents.

Addressing this, Klein said, “Employees use our AI models every day to invent on behalf of our customers – from generating code recommendations with Amazon CodeWhisperer to creating new experiences on Alexa. We have safeguards in place for employee use of these technologies, including guidance on accessing third-party generative AI services and protecting confidential information.”

Amazon is committed to long-term thinking and views its current offerings and collaborations as just the start of a technological revolution. 

“We aim to shape the future of AI based on customer needs, focusing on promising, beneficial, and responsible applications,” concluded Klein. 

Read more: Many A Generative AI Tricks Up Amazon’s Sleeves

Shritama Saha

Shritama (she/her) is a technology journalist at AIM who is passionate about exploring the influence of AI on different domains, including fashion, healthcare, and banking.