LangChain has had its fair share of troubles, some caused by the constant stream of updates from OpenAI, but each time LangChain has pushed back with timely updates of its own. A few days ago, at OpenAI's first-ever DevDay conference, Sam Altman unveiled GPTs, alongside the Assistants API and a slew of other announcements.
The fresh API releases from OpenAI could pose a significant challenge for startups such as LangChain, which currently position themselves as vital components for AI-driven applications.
For example, OpenAI’s agents are already replacing traditional RAG (Retrieval-Augmented Generation), a technique that improves AI responses by letting the system retrieve and reference information from large datasets. LangChain’s RAG offering will take a hit if the whole process can be done in far less time and at lower cost.
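At its core, a RAG pipeline retrieves the passages most relevant to a query and folds them into the prompt before the model answers. Here is a minimal sketch of that flow in plain Python, where a toy word-overlap score stands in for a real embedding lookup against a vector store; the documents and function names are illustrative, not from any real library:

```python
# Minimal RAG sketch: retrieve relevant passages, then build an augmented prompt.
# A toy word-overlap score stands in for a real embedding/vector-store lookup.

DOCS = [
    "LangChain provides document loaders and text splitters.",
    "The Assistants API bundles retrieval and a code interpreter.",
    "Pinecone is a managed vector database.",
]

def score(query: str, doc: str) -> int:
    """Count shared words between query and document (toy relevance metric)."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, k: int = 2) -> list:
    """Return the k highest-scoring documents for the query."""
    return sorted(DOCS, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str) -> str:
    """Augment the user question with retrieved context before calling an LLM."""
    context = "\n".join(retrieve(query))
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("What does the Assistants API include?"))
```

A production pipeline replaces the toy scorer with embeddings and a vector store, which is exactly the part LangChain lets developers tune step by step.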
But there is still hope for LangChain. One user on the Reddit discussion thread, Synyster328, who had tried the Assistant playground with GPT-4 Turbo, said: “Now, it’s basically magic that you can have it read files in like 30 seconds. But it can’t compete with a full RAG-pipeline like you can do with LangChain and really optimise each step.”
“I don’t think it will affect LangChain usage. Langchain gives more control and is transparent rather than OpenAI APIs,” said another Reddit user.
“Yes, too much control, and power of the open source community,” shared a user who goes by the name meet.org. The user said that LangChain could render the OpenAI APIs obsolete if it wanted to. “I feel that the concept behind the framework is so powerful, not just in terms of its use case but also in terms of system design, and accessibility,” the user said, while also sharing concerns about why many people dislike LangChain.
Many users also stressed flexibility: LangChain acts as an abstraction layer, letting developers build in the same fashion whether they use an OpenAI LLM or any other LLM. The same goes for other parts of the pipeline, where they can use Pinecone or any other vector store and the code remains very similar.
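The abstraction-layer pattern those users describe can be sketched in plain Python (these classes are illustrative stand-ins, not LangChain's actual APIs): application code targets a common interface, so swapping the backing provider changes one line rather than the whole pipeline.

```python
# Illustration of the abstraction-layer pattern: application code depends on
# a common LLM interface, so providers can be swapped without rewriting the
# pipeline. These classes are stand-ins, not LangChain's real classes.

from abc import ABC, abstractmethod

class LLM(ABC):
    @abstractmethod
    def complete(self, prompt: str) -> str: ...

class OpenAIBackend(LLM):
    def complete(self, prompt: str) -> str:
        return f"[openai] {prompt}"   # stand-in for a real API call

class LocalBackend(LLM):
    def complete(self, prompt: str) -> str:
        return f"[local] {prompt}"    # stand-in for a local model

def answer(llm: LLM, question: str) -> str:
    """Application code depends only on the LLM interface."""
    return llm.complete(f"Q: {question}")

print(answer(OpenAIBackend(), "hello"))  # [openai] Q: hello
print(answer(LocalBackend(), "hello"))   # [local] Q: hello
```

The same trick applies to vector stores: code written against a retriever interface works whether Pinecone or another store sits behind it.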
LangChain is quite popular among developers despite not having a language model or vector store of its own. It provides two broad categories of services. The first is modules, such as prompt template abstractions, LLM abstractions, chat model abstractions, text splitters, and document loaders, which it builds and implements itself or simply integrates with.
The other, more popular, application of the framework is end-to-end chains: for example, question answering over documents, SQL databases, agents, and a series of task-specific tools.
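The “chain” idea itself is simple: a prompt template, a model call, and an output parser composed into one pipeline. A minimal sketch in plain Python (with a canned fake model standing in for a real LLM call, and hypothetical function names, not LangChain's own):

```python
# Sketch of the "chain" idea: a prompt template, a model call, and an output
# parser composed in sequence. Plain Python stand-ins, not LangChain APIs.

def prompt_template(question: str) -> str:
    """Format the user's question into a full prompt."""
    return f"You are a SQL assistant.\nQuestion: {question}\nSQL:"

def fake_llm(prompt: str) -> str:
    """Stand-in for a model call; a real chain would hit an LLM here."""
    return "  SELECT * FROM users;  "

def output_parser(raw: str) -> str:
    """Clean up the raw model output."""
    return raw.strip()

def chain(question: str) -> str:
    """Run the steps in sequence, feeding each output to the next step."""
    value = question
    for step in (prompt_template, fake_llm, output_parser):
        value = step(value)
    return value

print(chain("list all users"))  # SELECT * FROM users;
```

LangChain's prebuilt chains package exactly this kind of sequence for common tasks such as SQL question answering, so developers compose steps instead of wiring them by hand.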
Despite its popularity, some users have complained in the past that LangChain is poorly designed and filled with overlapping abstractions. There have also been complaints about poor documentation, which Harrison Chase has assured the team is working to fix. Even so, the strength of LangChain lies in its ability to adapt quickly while offering a wide range of services.
The speed of development of the large models, and the equally rapid support for them in LangChain, is exhausting, said Chase. “We definitely prioritise keeping up with stuff that comes out,” he explained. The team added support for function calling and chat models the day after OpenAI released those updates. “I’m really proud of how my team has kept up with that,” Chase said.
To step up from shipping first versions, LangSmith was created for five main functions: debugging, testing, evaluating, monitoring, and measuring usage.
LangSmith simplifies building applications on large language models (LLMs). It is user-friendly and lets developers create LLM applications with diverse capabilities. CommandBar, a company that helps its customers adopt new features, learn new workflows, and get help in-app, said: “LangSmith isn’t just a tool for us – it’s become a critical inclusion in our stack. We’ve moved from crossing our fingers and toes hoping our AI works to knowing exactly how and why.”
LangServe, another major update, simplifies deploying language models by turning them into REST APIs, making them accessible to developers and applications. It offers model packaging, API generation, and deployment to various platforms, and provides a client library for easy interaction with deployed models. It also improves LLM accessibility, allowing integration into other projects and reducing development time, letting developers focus on building LLM-powered applications.
LangChain is upping the ante with its latest release of a GPTs alternative, “OpenGPTs”, alongside access to the OpenAI Assistants API, which can now be run like any other LangChain agent. It has integrated the Assistants API (which comes with retrieval as well as code interpreter capabilities) into its framework and its 100+ tools. There is no stopping LangChain.