
This Bengaluru-based AI Startup is Helping Enterprise Productionize Generative AI Apps Reliably

Portkey.ai boosts enterprise AI app deployments, simplifying integration in the LLMOps landscape.


Illustration by Nikhil Kumar


Just eleven months ago, Ayush Garg announced the launch of Portkey.ai, a platform he co-founded with Rohit Agarwal after they left Pepper Content. The company offers a single, unified interface for model selection, load balancing, and prompt engineering.

Portkey has been quick to keep up with the models being announced every week and has already integrated Claude 3 and even models hosted on Groq’s LPU. It has raised $3 million, which is being used to make the product market-ready. “Today we process 3 billion tokens and about 15-20 million API calls,” Garg said at DevPalooza 3.0. The event, held last Saturday, hosted founders and experts who spoke in detail about their work in the field to a crowd of tech and AI enthusiasts.

LLMOps is a rapidly growing ecosystem fueled by the rise of LLMs. It has emerged as companies figure out the best way to add AI features to their existing products. Until last year, companies were early in their adoption of AI; even now, about 42% of companies are still exploring use cases such as virtual assistants, voice search, and chatbots on their platforms to find the best integration tools.

“While this was the case last year, we’re finding that more companies are moving from the Proof-of-Concept (POC) stage and are going into production,” Garg explained in an interview with AIM. This is where tools like Portkey are most useful. Load balancing, for example, becomes a requirement in production, where Portkey acts like a traffic cop, reducing the risk of downtime and the performance issues often encountered at scale.
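The traffic-cop idea can be sketched as weighted routing across providers. This is an illustrative sketch only; the provider names and weights are invented for the example and do not reflect Portkey’s actual API or configuration format.

```python
import random

# Hypothetical weighted pool of LLM providers (names and weights are
# illustrative, not Portkey's real configuration).
PROVIDERS = [
    {"name": "openai-gpt-4", "weight": 0.7},
    {"name": "anthropic-claude-3", "weight": 0.3},
]

def pick_provider(pool):
    """Route one request to a provider, proportionally to its weight."""
    names = [p["name"] for p in pool]
    weights = [p["weight"] for p in pool]
    return random.choices(names, weights=weights, k=1)[0]

# Over many requests, traffic splits roughly 70/30 between providers,
# so no single backend becomes a point of failure.
counts = {p["name"]: 0 for p in PROVIDERS}
for _ in range(10_000):
    counts[pick_provider(PROVIDERS)] += 1
```

A production gateway would layer retries and fallbacks on top of this, shifting weight away from a provider that starts timing out.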

Companies in the LLMOps space

The ecosystem is thriving, with each company focusing on building different functions. Fiddler.ai, another LLMOps platform, focuses on AI observability, while Deepchecks is a pioneer in ensuring that the responses customers receive from these LLMs are of the highest quality and accuracy.

“We’ve differentiated from the others in this space by being a one-stop solution for multiple hurdles that companies face. I’d like to believe that Portkey is an enterprise-ready solution,” Garg explained. 

The Architecture

This LLMOps architecture facilitates a smooth transition from a proof of concept to a production-ready application. Garg describes Portkey as a full-stack LLMOps platform that offers all the tools necessary for this progression. 

It allows for managing various AI models through an API gateway, which is the core component of their high-scale system. Garg noted, “This API gateway is unified, and it has request-response transformers,” meaning it can handle different AI models and lets users switch between them without rewriting much of their code.
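The request-response transformer pattern can be sketched as a pair of functions per provider that map a unified request into each provider’s payload shape and normalise the response back. The provider names and payload shapes below are invented for illustration; they are not Portkey’s actual transformers.

```python
# Hypothetical transformers: each provider gets a (to_request, from_response)
# pair, so callers only ever see one unified interface.

def to_provider_a(prompt):
    # Provider A expects a chat-style messages payload (invented shape).
    return {"messages": [{"role": "user", "content": prompt}]}

def from_provider_a(raw):
    return raw["choices"][0]["message"]["content"]

def to_provider_b(prompt):
    # Provider B expects a bare prompt field (invented shape).
    return {"prompt": prompt}

def from_provider_b(raw):
    return raw["completion"]

TRANSFORMERS = {
    "provider-a": (to_provider_a, from_provider_a),
    "provider-b": (to_provider_b, from_provider_b),
}

def complete(provider, prompt, call):
    """Unified entry point: switching `provider` requires no caller changes."""
    to_req, from_resp = TRANSFORMERS[provider]
    return from_resp(call(to_req(prompt)))

# Stub backends standing in for real HTTP calls.
def fake_a(payload):
    return {"choices": [{"message": {"content": "hi from A"}}]}

def fake_b(payload):
    return {"completion": "hi from B"}

answer_a = complete("provider-a", "hello", fake_a)
answer_b = complete("provider-b", "hello", fake_b)
```

Swapping `"provider-a"` for `"provider-b"` changes nothing else in the calling code, which is the point of a unified gateway.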

Portkey’s system also includes security features, like pre-inference and post-inference guardrails, ensuring sensitive data like emails and phone numbers aren’t exposed. The gateway itself can also connect to observability services, making it easier to monitor and evaluate AI performance.
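A post-inference guardrail of this kind can be sketched as a redaction pass over the model’s output before it leaves the gateway. The regular expressions below are a simplified illustration, not Portkey’s actual guardrail rules.

```python
import re

# Simplified PII patterns for illustration; production guardrails use
# far more robust detection than these two regexes.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"\+?\d[\d\s-]{7,}\d")

def redact_pii(text: str) -> str:
    """Redact emails first, then phone numbers, before returning a response."""
    text = EMAIL.sub("[EMAIL REDACTED]", text)
    text = PHONE.sub("[PHONE REDACTED]", text)
    return text

out = redact_pii("Contact me at jane@example.com or +91 98765 43210.")
```

A pre-inference guardrail applies the same idea in the other direction, scrubbing sensitive data from prompts before they reach the model provider.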

Data management is another key aspect. “This is also the key difference between our enterprise and startup customers,” Garg explained. While startups typically use Portkey.ai’s cloud-hosted version, enterprises often require SOC 2, ISO, GDPR, and HIPAA compliance, or opt for private VPC deployments tailored to their specific cloud infrastructure.

Portkey has open-sourced its API gateway for those who need it. The extended, paid version of the platform is used by over 1,000 organisations, including Postman, Springworks, and 500 other companies.

The LLMOps space is adapting to the challenges of operationalising large language models in an enterprise-friendly manner. Portkey, which handles this entire process with a team of just nine, is looking to put its funding to good use in scaling the operation.


K L Krithika

K L Krithika is a tech journalist at AIM. Apart from writing tech news, she enjoys reading sci-fi and pondering impossible technologies, trying not to confuse them with reality.