OpenAI’s 128k Context Window Threatens Anthropic and Others

OpenAI’s GPT-4 Turbo, with its larger context window, along with other API releases, comfortably places the company ahead of Anthropic and others

OpenAI might have brought Christmas early for developers and enterprises at its first DevDay conference. Beyond handing every attendee a special $500 in API credits, the company announced new features and improvements that could change the way things work, with some touting that they could even fold a number of AI startups. With GPT-4 Turbo, AI assistants, a Code Interpreter API, and more, OpenAI looks set to gain an edge over its competitors.

‘Turbo’ Power 

OpenAI may not have launched GPT-5, but its upgrade to GPT-4, dubbed GPT-4 Turbo, is said to offer a markedly improved model. With a 128k context window, which can fit more than 300 pages of text in a single prompt, it is a significant step up from the previous version’s 32k context window, a limit that may have held back customers looking for longer-context models. By comparison, Anthropic’s Claude has a 100k context window.
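For developers, the change shows up as little more than a different model identifier on an otherwise ordinary chat call. The sketch below is illustrative only: it assumes the OpenAI Python SDK (v1.x), the preview model name announced at DevDay, and a hypothetical contract.txt file standing in for a long document.

```python
# A minimal sketch of a single-prompt call with a long document, assuming the
# OpenAI Python SDK (v1.x) and the GPT-4 Turbo preview model name from DevDay.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

long_document = open("contract.txt").read()  # hypothetical file, hundreds of pages of text

response = client.chat.completions.create(
    model="gpt-4-1106-preview",  # GPT-4 Turbo preview, 128k context
    messages=[
        {"role": "system", "content": "Summarise the key obligations in this contract."},
        {"role": "user", "content": long_document},  # fits in one prompt, no chunking
    ],
)
print(response.choices[0].message.content)
```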

Previously, in an interview with AIM, Devang Agrawal, co-founder and CTO of Glyphic AI, an AI copilot for sales teams, spoke about how the company chooses an apt LLM based on its use cases. “Claude is able to understand long context and can understand 100,000 tokens at one go. This is something you can’t do with GPT-4, which can understand only up to 16,000 tokens,” said Agrawal, referring to the GPT-4-16k context window.

With respect to context windows, Lentra, a digital-lending SaaS platform that employs AI/ML, had a similar experience when experimenting with the GPT APIs. When AIM got in touch with Rangarajan Vasudevan, chief data officer of Lentra, he said that the API version came with limitations on the number of tokens. “What we were playing out was that if the context I had was slightly broad, I have to break it down into multiple smaller contexts, such that it fits within that limit. But, when I do that, there is a bearing on the correctness,” he said.
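The workaround Vasudevan describes, splitting a broad context into smaller pieces that fit the token limit, roughly corresponds to the chunking sketched below. This is a minimal illustration rather than Lentra’s actual pipeline, and it assumes the open-source tiktoken tokenizer and an arbitrary 8,000-token budget.

```python
# A minimal sketch of token-based chunking, assuming the tiktoken tokenizer
# and an arbitrary token budget. Illustrative only, not Lentra's pipeline.
import tiktoken

def chunk_text(text: str, max_tokens: int = 8000, model: str = "gpt-4") -> list[str]:
    enc = tiktoken.encoding_for_model(model)
    tokens = enc.encode(text)
    # Slice the token stream into windows that fit the context limit,
    # then decode each window back to text for a separate API call.
    return [
        enc.decode(tokens[i:i + max_tokens])
        for i in range(0, len(tokens), max_tokens)
    ]
```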

With the larger context window on GPT-4 Turbo, companies that switched to other LLMs because of GPT-4’s limited context length now have the option of switching back. Furthermore, OpenAI will be offering GPT-4 Turbo at a lower price than GPT-4: input tokens are 3x cheaper and output tokens 2x cheaper.
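To get a rough sense of what that difference means, the back-of-the-envelope sketch below compares the cost of a single long-document call. The per-1K-token prices are the list prices reported around DevDay and should be treated as assumptions that may change; they are included only to illustrate the 3x/2x gap the pricing claim refers to.

```python
# Back-of-the-envelope cost comparison for one long-document call.
# Prices are the per-1K-token list prices reported around DevDay (assumed,
# may change); used here only to illustrate the 3x/2x difference cited above.
PRICES = {
    "gpt-4":       {"input": 0.03, "output": 0.06},   # USD per 1K tokens
    "gpt-4-turbo": {"input": 0.01, "output": 0.03},
}

def call_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    p = PRICES[model]
    return input_tokens / 1000 * p["input"] + output_tokens / 1000 * p["output"]

# e.g. a 100K-token document summarised into 1K tokens of output:
print(call_cost("gpt-4-turbo", 100_000, 1_000))  # ~$1.03
print(call_cost("gpt-4", 100_000, 1_000))        # ~$3.06, and 100K tokens would not fit in 32k anyway
```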

AI Assistants To Lead The Way

In a bid to support developers and help them build agent-like interactions into their applications, OpenAI announced the Assistants API. The API is said to help developers build their own GPT-like experiences into their apps and services. It currently supports three types of tools: Code Interpreter, Retrieval and Function Calling.
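As a rough illustration of how the API is exposed, the sketch below creates an assistant with all three tool types enabled. It assumes the beta Assistants endpoints in the OpenAI Python SDK (v1.x) as documented at launch; the get_account function is a made-up example.

```python
# A sketch of creating an assistant with all three tool types, assuming the
# beta Assistants endpoints in the OpenAI Python SDK (v1.x) at launch.
from openai import OpenAI

client = OpenAI()

assistant = client.beta.assistants.create(
    name="Data analyst",
    instructions="Answer questions about the attached sales data.",
    model="gpt-4-1106-preview",
    tools=[
        {"type": "code_interpreter"},   # run Python in a hosted sandbox
        {"type": "retrieval"},          # search over uploaded files
        {   # function calling against the developer's own backend
            "type": "function",
            "function": {
                "name": "get_account",  # hypothetical function, for illustration
                "description": "Look up a customer account by ID.",
                "parameters": {
                    "type": "object",
                    "properties": {"account_id": {"type": "string"}},
                    "required": ["account_id"],
                },
            },
        },
    ],
)
print(assistant.id)
```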

Code Interpreter, which was released to ChatGPT Plus users in July, will now be available as an API, which means OpenAI is going head-to-head with tools such as LangChain and LlamaIndex. The use cases of LlamaIndex, a data framework that connects custom data sources to large language models, are pretty much addressed by the Assistants API. Furthermore, with GPT-4V, which offers a multitude of features, the model’s coding capabilities have only been enhanced.
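Using the hosted Code Interpreter then amounts to attaching the assistant to a thread and letting OpenAI run the tool server-side, as in the continuation below. It reuses the client and assistant objects from the previous sketch, keeps the same launch-era SDK assumptions, and simplifies the polling a production app would do.

```python
# Continuing the sketch above: run the assistant so the hosted Code Interpreter
# executes server-side. Reuses `client` and `assistant` from the previous
# sketch; the polling loop is deliberately simplified.
import time

thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="Plot monthly revenue from the uploaded CSV and describe the trend.",
)

run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)
while run.status in ("queued", "in_progress"):
    time.sleep(2)
    run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)

for message in client.beta.threads.messages.list(thread_id=thread.id):
    print(message.role, message.content)
```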

Text-To-Speech Upgrade

OpenAI’s text-to-speech (TTS) feature, launched a month ago, got an interesting update at DevDay. Developers can now create human-quality speech from text via the text-to-speech API, with six preset voices on offer across two model variants. Interestingly, applications such as Eleven Labs and PlayHT offer a similar feature, and with OpenAI’s TTS upgrade, it is clearly competing head-on with, and could squeeze out, these smaller players.
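Generating audio through the new endpoint looks roughly like the sketch below. It assumes the speech endpoint in the OpenAI Python SDK (v1.x) and the model and voice names listed in the documentation at launch (tts-1, tts-1-hd, and voices such as “alloy”).

```python
# A minimal sketch of the text-to-speech endpoint, assuming the OpenAI Python
# SDK (v1.x) and the model/voice names listed in the docs at launch.
from openai import OpenAI

client = OpenAI()

speech = client.audio.speech.create(
    model="tts-1",        # "tts-1-hd" is the higher-quality variant
    voice="alloy",        # one of the six preset voices
    input="OpenAI's DevDay announcements, read aloud.",
)
speech.stream_to_file("devday.mp3")  # write the returned audio to disk
```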

OpenAI’s DevDay keynote might have been a short one, concluding in 45 minutes, but the impact of the product announcements will ripple across its competitors, who might have to up their game.

Google Gemini, are you listening? 

Vandana Nair
With her rare blend of engineering, MBA, and journalism degrees, Vandana Nair brings a unique combination of technical know-how, business acumen, and storytelling skills to the table. Her insatiable curiosity for all things startups, businesses, and AI technologies ensures that there’s always a fresh and insightful perspective to her reporting.
