Earlier this month, LangChain, a Python framework for LLMs, received seed funding to the tune of $10 million from Benchmark. Soon after, the startup received another round of funding in the range of $20 to $25 million from Sequoia, at a valuation of $200 million.
At its base, the main offering of LangChain is an abstraction wrapper that makes it easier for programmers to integrate LLMs into their programs. While it provides a relatively simple interface for LLMs, many developers advise against using LangChain in a production environment due to its idiosyncrasies.
However, the fact remains that many developers treat LangChain as the go-to framework for their applications, with the project garnering 20,000 stars on GitHub. This raises a question about the valuation of the project — is an open-source abstraction wrapper really worth $200 million?
What makes LangChain great?
The framework was released in October 2022, in the midst of the generative AI wave. Harrison Chase, a programmer who believed the true potential of LLMs lay in connecting them with other applications, developed the library. At first, it only had support for the OpenAI and Cohere APIs, along with a Python interpreter. Today, the project has blossomed to support over 20 model providers and hosting platforms, over 50 document loaders, more than 10 vector databases, and over 15 tools commonly used by LLMs.
Developers have stated that what NumPy and Pandas did for machine learning, LangChain has done for LLMs, greatly increasing their usability and functionality. By using LangChain, developers can empower their applications by connecting them to an LLM, or leverage a large dataset by connecting an LLM to it.
The main plus point of LangChain is its ability to create a chain of commands, an intuitive way to relay instructions to an LLM. Each command or ‘link’ of this chain can either call an LLM or a different utility, allowing for the creation of AI agents that can decide information flow based on user input. In addition to chaining, the package also has ways to implement LLM memory and built-in benchmarks to evaluate the potential utility of an LLM.
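The chaining idea described above can be sketched in a few lines of plain Python. This is a conceptual illustration with a stubbed-out model call, not LangChain's actual API — the function names here (`make_chain`, `make_prompt_link`, `fake_llm`) are hypothetical, invented for this sketch:

```python
# Conceptual sketch of LLM "chaining" — NOT LangChain's real API.
# Each 'link' is a callable; a chain pipes one link's output into the next.

def fake_llm(prompt: str) -> str:
    """Stand-in for a real LLM call (e.g. an OpenAI completion request)."""
    return f"SUMMARY[{prompt}]"

def make_prompt_link(template: str):
    """Link that formats raw user input into a prompt for the model."""
    return lambda text: template.format(text=text)

def make_chain(*links):
    """Compose links so that each link's output feeds the next link."""
    def run(user_input):
        result = user_input
        for link in links:
            result = link(result)
        return result
    return run

# A two-link chain: first build a prompt, then call the (stub) model.
chain = make_chain(make_prompt_link("Summarise: {text}"), fake_llm)
print(chain("LangChain raised funding"))
# → SUMMARY[Summarise: LangChain raised funding]
```

In a real LangChain pipeline, links can also be non-LLM utilities — search tools, calculators, database lookups — which is what makes agent-style control flow possible.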
Even though LangChain is a fully featured and well-supported framework, it has a reputation for being difficult to use in a production environment. While the developers have created a support channel for those running the framework in production, certain developers have spoken against using it. User Karrot_Kream on the Hacker News forum stated, “There also seems to be really poor observability in the code and performance seems to be an afterthought. I tell friends who ask about LangChain that it’s great to experiment with but not something I’d put into production.”
Indeed, the dominant perspective is that while LangChain does bring some value to the table, especially in terms of integrating an LLM with minimal effort, serious users would rather write their own wrapper. Moreover, large language models have evolved since LangChain's initial launch, from simple text generators to more fully-featured and nuanced programs. As a result, the framework's value proposition appears to be slowly shrinking.
Open-source vs OpenAI
Looking at the AI landscape when LangChain first launched, the necessity of the framework becomes apparent. The value proposition was adding so-called ‘dumb’ LLMs to programs, simply leveraging them as a workhorse for certain NLP workloads. Today, LLMs have gone beyond that, partly thanks to the launch of ChatGPT.
The chatbot opened people’s eyes to what was really possible with LLMs, incentivising OpenAI to create better APIs for their SOTA models. Today, OpenAI’s APIs, along with ChatGPT plugins, solve 90% of the problems LangChain was created to solve. Not only are they accessible to a majority of developers, they also help them integrate GPT models quickly and easily.
However, LangChain is platform agnostic, meaning that even if OpenAI goes down in the future, LangChain can still be used to connect models to programs. Moreover, the project also has a considerable amount of momentum, considering the projects being built on it — AgentGPT and Baby AGI both use LangChain to create their agents. The platform also has a considerable user base, with the GitHub repo boasting over 2.8k forks at the time of writing this article.
It seems that the valuation ascribed to the platform might be over-the-top for what it is currently offering. However, LangChain might grow into a beast in the near future, with some programmers even predicting that it might become the Hugging Face of LLMs. In this monetisation approach, LangChain might start offering pre-built services hosted on its own servers on an as-a-service basis.
This means that companies could simply call an API or directly integrate the services into their products, solving the issues that come with using LangChain in production. Even though this model is akin to OpenAI’s API access monetisation model, LangChain’s value addition would come from the addition of open-source models to its portfolio.
In just over six months since launch, the framework has 10x’d its compatibility, with current and future funding paving the way for further polish to be added to the product. Setting aside the current issues LangChain has, it is possible that it might just become the Android to OpenAI’s iOS.