
OpenAI Makes LangChain’s Life Miserable 

But it is trying really hard not to lag behind.


LangChain has had its fair share of troubles, many caused by the updates OpenAI constantly puts out, but each time it has pushed back with timely updates of its own. A few days back, at the first-ever DevDay conference, Sam Altman unveiled GPTs, alongside the Assistants API and a slew of other announcements.

The fresh API releases from OpenAI could pose a significant challenge for startups such as LangChain, which currently position themselves as vital components for AI-driven applications.

For example, OpenAI’s agents are already replacing traditional RAG (retrieval-augmented generation), a technique that enhances AI responses by letting the system retrieve and reference information from large datasets. LangChain’s RAG offering will take a hit if the whole process can be done in far less time and at lower cost.

But there is still hope for LangChain. One user on the Reddit discussion thread, Synyster328, who had tried the Assistants playground with GPT-4 Turbo, said: “Now, it’s basically magic that you can have it read files in like 30 seconds. But it can’t compete with a full RAG-pipeline like you can do with LangChain and really optimise each step.”
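To make that comparison concrete, here is a minimal sketch of the kind of pipeline that user is describing, assuming LangChain’s Python APIs as of late 2023 and a hypothetical local file docs.txt; every stage (loader, splitter, embeddings, vector store, retriever, model) can be tuned or swapped independently:

# Minimal RAG sketch with LangChain (assumes langchain, openai and faiss-cpu are
# installed, OPENAI_API_KEY is set, and a hypothetical local file "docs.txt").
from langchain.document_loaders import TextLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS
from langchain.chat_models import ChatOpenAI
from langchain.chains import RetrievalQA

docs = TextLoader("docs.txt").load()                        # document loader
chunks = RecursiveCharacterTextSplitter(                    # tune chunking here
    chunk_size=500, chunk_overlap=50).split_documents(docs)
store = FAISS.from_documents(chunks, OpenAIEmbeddings())    # swap embeddings or store
qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(model="gpt-4-1106-preview"),             # or any other chat model
    retriever=store.as_retriever(search_kwargs={"k": 4}),   # tune retrieval here
)
print(qa.run("What does the document say about pricing?"))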

“I don’t think it will affect LangChain usage. Langchain gives more control and is transparent rather than OpenAI APIs,” said another Reddit user. 

“Yes, too much control, and power of the open source community,” shared a user who goes by the name meet.org. The user said that LangChain could render the OpenAI APIs obsolete if it wanted to. “I feel that the concept behind the framework is so powerful, not just in terms of its use case but also in terms of system design and accessibility,” said the user, while also sharing concerns about why many people hate LangChain.

A lot of users also stressed the flexibility aspect: LangChain acts as an abstraction layer that lets developers build in the same fashion whether they use an OpenAI LLM or any other LLM. The same goes for other parts of the pipeline; they can use Pinecone or any other vector store and the code remains very similar.
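A rough illustration of that abstraction, assuming LangChain’s LCEL pipe syntax and chat-model wrappers from late 2023; the chain itself stays the same when the provider behind it changes:

# Sketch of LangChain's provider abstraction: swapping the model is a one-line change.
from langchain.chat_models import ChatOpenAI, ChatAnthropic
from langchain.prompts import ChatPromptTemplate
from langchain.schema.output_parser import StrOutputParser

prompt = ChatPromptTemplate.from_template("Summarise this in one line: {text}")

llm = ChatOpenAI(model="gpt-4-1106-preview")    # OpenAI backend
# llm = ChatAnthropic(model="claude-2")         # ...or any other supported provider

chain = prompt | llm | StrOutputParser()        # the rest of the code is unchanged
print(chain.invoke({"text": "LangChain wraps many LLM providers behind one interface."}))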

Harrison Chase, the co-founder of LangChain, describes the framework as one for “building context aware reasoning applications.”

LangChain is quite popular among developers despite not having its own language model or vector store. It provides two broad categories of services. The first is modules such as prompt template abstractions, LLM abstractions, chat model abstractions, text splitters and document loaders, which it either builds and implements itself or simply integrates with.

The other, more popular category is end-to-end chains: for example, question answering over documents, SQL databases, agents and a series of specific tools.
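As a hedged sketch of the agent category, here is a hypothetical agent with a single custom tool; the tool, question and model name are illustrative rather than taken from LangChain’s documentation:

# Sketch of a LangChain agent that can call a custom tool via OpenAI function calling.
from langchain.agents import AgentType, Tool, initialize_agent
from langchain.chat_models import ChatOpenAI

def word_count(text: str) -> str:
    """Toy tool: count the words in the input string."""
    return str(len(text.split()))

tools = [Tool(name="word_count", func=word_count,
              description="Counts the words in a piece of text.")]

agent = initialize_agent(
    tools,
    ChatOpenAI(model="gpt-4-1106-preview", temperature=0),
    agent=AgentType.OPENAI_FUNCTIONS,  # lets the model decide when to call the tool
)
print(agent.run("How many words are in the sentence 'LangChain survives another DevDay'?"))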

LangChain Survives 

Despite its popularity, some users have complained in the past that LangChain is poorly designed and filled with overlapping abstractions. There have also been complaints of poor documentation, which Harrison Chase has assured the team is working to fix. Even so, the strength of LangChain lies in its ability to adapt quickly while offering a wide range of services.

The speed of development of the large models, and the equally rapid support for them in LangChain, is exhausting, said Chase. “We definitely prioritise keeping up with stuff that comes out,” he explained. The team added support for function calling and the chat model the day after OpenAI released those updates. Chase said, “I’m really proud of how my team has kept up with that.”

To help developers move beyond first prototypes, LangSmith was created around five main functions: debugging, testing, evaluating, monitoring, and usage metrics.

LangSmith makes it easier to debug, test, evaluate and monitor applications built on large language models (LLMs). It is developer-friendly and works with chains of diverse capabilities. CommandBar, a company that helps its customers adopt new features, learn new workflows and get help in-app, said: “LangSmith isn’t just a tool for us – it’s become a critical inclusion in our stack. We’ve moved from crossing our fingers and toes hoping our AI works to knowing exactly how and why.”
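In practice, plugging an existing chain into LangSmith is mostly configuration; a minimal sketch, assuming LangSmith’s standard tracing environment variables with a placeholder API key and a hypothetical project name:

# Enable LangSmith tracing for everything that runs after this point.
import os

os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "ls__..."     # placeholder LangSmith API key
os.environ["LANGCHAIN_PROJECT"] = "my-rag-app"  # hypothetical project name

from langchain.chat_models import ChatOpenAI

# This call is now traced to LangSmith, where the run can be inspected,
# evaluated and monitored.
print(ChatOpenAI(model="gpt-4-1106-preview").invoke("Say hello"))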

LangServe, another major update, simplifies deployment by turning LangChain chains into REST APIs, making them accessible to other developers and applications. It offers packaging, API generation, and deployment to various platforms, along with a client library for easy interaction with deployed chains. This improves accessibility, allows integration into other projects, and cuts development time, letting developers focus on building LLM-powered applications.
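A minimal sketch of what that looks like, assuming LangServe’s add_routes helper on top of FastAPI; the chain and path below are illustrative:

# Serve a LangChain chain as a REST API with LangServe
# (assumes langserve, fastapi and uvicorn are installed).
from fastapi import FastAPI
from langserve import add_routes
from langchain.chat_models import ChatOpenAI
from langchain.prompts import ChatPromptTemplate

app = FastAPI(title="Joke server")
chain = ChatPromptTemplate.from_template("Tell a short joke about {topic}") | ChatOpenAI()

# Exposes /joke/invoke, /joke/batch and /joke/stream as HTTP endpoints.
add_routes(app, chain, path="/joke")

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8000)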

LangChain is upping the ante with its latest release of a GPTs alternative, “OpenGPTs”, alongside support for the OpenAI Assistants API, letting developers run assistants like any other LangChain agent. It has integrated the Assistants API, which comes with retrieval as well as code interpreter capabilities, into its framework and connected it to its 100+ tools. There is no stopping it.
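A hedged sketch of what running an OpenAI assistant through LangChain can look like, assuming the OpenAIAssistantRunnable interface LangChain shipped around the Assistants API launch; the assistant’s name, instructions and question are illustrative:

# Wrap an OpenAI Assistant (with the built-in code interpreter tool) as a LangChain runnable.
from langchain.agents.openai_assistant import OpenAIAssistantRunnable

assistant = OpenAIAssistantRunnable.create_assistant(
    name="math helper",                                      # illustrative name
    instructions="Write and run code to answer questions.",  # illustrative instructions
    tools=[{"type": "code_interpreter"}],                    # built-in Assistants tool
    model="gpt-4-1106-preview",
)
output = assistant.invoke({"content": "What is 10 minus 4 raised to the power 2.7?"})
print(output)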

K L Krithika

K L Krithika is a tech journalist at AIM. Apart from writing tech news, she enjoys reading sci-fi and pondering impossible technologies, trying not to confuse them with reality.