
OpenAI Kills Arrakis

The reasons for dropping Arrakis can be plenty. Let's look at a few here.

OpenAI is getting ready for its first-ever DevDay conference, and with the recent rumours of Jarvis, it might even have cracked the AGI code. At the same time, though, it is reportedly killing one of its other dear projects, codenamed Arrakis, which did not live up to the company’s expectations during training.

Undoubtedly, OpenAI’s models have to live up to the standard of ChatGPT, which might be a little too high even for them. Unlike GPT-4, which is huge in size and more powerful than its predecessor GPT-3.5, Arrakis was expected to be smaller, allowing chatbots to run more efficiently and less expensively.

This was probably in line with the release of Meta’s LLaMA and Llama 2, which even Microsoft, OpenAI’s biggest backer, had started using in many of its products and services. However, news about OpenAI building Arrakis surfaced long before Llama was ever in the picture, around the time the company started training GPT-4. Now, The Information reports that the model has been scrapped.

Why, though?

What if Arrakis actually has a lot of flaws? Responding to the report, Gary Marcus points out that Sam Altman might actually be right. “They might have decided that GPT-5, if it was simply a bigger version of GPT-4, would not meet expectations, and that it wasn’t worth spending the hundreds of millions of dollars required, if the outcome would only turn out to be disappointing, or even embarrassing.”

Arrakis, though, should not be compared with GPT-5. OpenAI has not made an official announcement about the release of GPT-5, and Altman has said the company is not training it, though it has filed a trademark for the name. Arrakis, on the other hand, was always expected to be a smaller model. One reason for dropping it could be that the company is shifting its focus to building AI models for wearables and smart devices, instead of just chatbots.

The reasons for dropping Arrakis could be many. One is the cost of training such AI models, on which OpenAI has reportedly been spending almost a million dollars each day. Another is that the model is simply not good enough. Or the company may want to keep it to itself and use it in its upcoming products.

If the company decides to shift its focus to smaller models again, it will have to weigh the cost of training against the benefit it would bring. At the same time, the loss of time and resources has also disappointed some Microsoft employees, according to the report, as the company has been paying OpenAI to develop smaller models for a long time.

Nonetheless, OpenAI has been generating revenue and is back on track, according to Altman. It is expected to generate annual revenue of $1.3 billion this year, compared to $28 million last year. Instead of scrapping Arrakis altogether, the company could integrate it into Gobi, which is expected to be something similar to the GPT-4 Vision model the company has already released.

But there are other concerns

This could also be a major setback for the company when it comes to the adoption of its products. Though enterprises are heavily using GPT-4, the need for smaller models is on the rise, and some of them are even outperforming OpenAI’s models on various fronts. Even Microsoft has worked on smaller LLMs such as Orca, which are comparatively cheaper for the company to run.

Along similar lines, recent Microsoft research also highlights some trust issues with GPT-3.5 and GPT-4. The researchers say that GPT-4 can be easily jailbroken with prompts and end up in the wrong hands. Interestingly, this could also apply to Arrakis, as the research was conducted around the same time.

According to the researchers, though, the bugs found in the models were reportedly fixed before release. It is possible that the models OpenAI is dropping now are riddled with such bugs, given their smaller size.

It seems that even though OpenAI is riding high on the revenue wave at the moment, there is a good chance the company will have to come out with a smaller model soon. Otherwise, Microsoft might have to steer a different route and find another island to land on.

Mohit Pandey

Mohit dives deep into the AI world to bring out information in simple, explainable, and sometimes funny words. He also holds a keen interest in photography, filmmaking, and the gaming industry.