UAE Unveils GPT-4’s New Rival, Falcon 180B

With 180 billion parameters, it is trained on 3.5 trillion tokens from TII's RefinedWeb dataset.

Adding to the list of open-source LLMs, Abu Dhabi's Technology Innovation Institute (TII) has released Falcon 180B, a heavily scaled-up version of Falcon 40B. According to the official blog post, it is the largest open-source language model to date, with a staggering 180 billion parameters.

Back in June, the institute released three variants of Falcon – 1B, 7B, and 40B.

It is trained on 3.5 trillion tokens from TII's RefinedWeb dataset, making this the longest single-epoch pretraining run for an openly accessible model. Training used up to 4,096 GPUs simultaneously on Amazon SageMaker. The chat model was fine-tuned on a mix of large-scale conversational and instruction datasets.
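
For readers who want to try it, a minimal sketch of prompting the fine-tuned chat variant through Hugging Face transformers might look like the code below. The "tiiuae/falcon-180B-chat" checkpoint name, the gated-access step, and the multi-GPU sharding settings are assumptions beyond what the announcement spells out.

```python
# Minimal sketch: prompting the Falcon-180B chat model via Hugging Face transformers.
# Assumes the gated "tiiuae/falcon-180B-chat" checkpoint (not named in the article),
# an accepted license on the Hub, and enough GPU memory to shard the weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-180B-chat"  # assumed Hub ID for the fine-tuned chat variant

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to roughly halve the memory footprint
    device_map="auto",           # shard the 180B parameters across available GPUs
)

prompt = "Explain in one sentence what the RefinedWeb dataset is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=80, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Depending on the transformers version, a Hub access token (and possibly trust_remote_code=True) may also be needed, and even in half precision the weights are far too large for a single consumer GPU.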

Falcon 180B vs Llama 2 vs GPT-3.5

Currently topping the Hugging Face leaderboard, Falcon 180B is 2.5 times larger than Llama 2 and was trained with four times the computing power. It also outperforms Llama 2 70B and OpenAI's GPT-3.5 on MMLU. The model uses multi-query attention (MQA), illustrated below.
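
Multi-query attention shares a single key/value head across all query heads, which shrinks the key/value cache at inference time. The following is a toy PyTorch illustration of that idea, with made-up dimensions rather than Falcon's actual configuration.

```python
# Toy illustration of multi-query attention (MQA): every query head attends over
# one shared key/value head, cutting the KV cache roughly by a factor of n_heads.
# Dimensions are illustrative only; this is not Falcon's actual implementation.
import torch

d_model, n_heads, seq_len = 512, 8, 16
head_dim = d_model // n_heads
x = torch.randn(1, seq_len, d_model)                    # (batch, seq, hidden)

q_proj = torch.nn.Linear(d_model, n_heads * head_dim)   # per-head query projections
kv_proj = torch.nn.Linear(d_model, 2 * head_dim)        # one shared K and V head

q = q_proj(x).view(1, seq_len, n_heads, head_dim).transpose(1, 2)  # (1, heads, seq, dim)
k, v = kv_proj(x).split(head_dim, dim=-1)                          # (1, seq, dim) each
k, v = k.unsqueeze(1), v.unsqueeze(1)                              # (1, 1, seq, dim)

scores = q @ k.transpose(-2, -1) / head_dim ** 0.5      # shared K broadcasts to all heads
out = torch.softmax(scores, dim=-1) @ v                 # (1, heads, seq, dim)
print(out.shape)                                        # torch.Size([1, 8, 16, 64])
```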

Additionally, it delivers results comparable to Google's PaLM 2-Large on evaluations spanning HellaSwag, LAMBADA, WebQuestions, Winogrande, PIQA, ARC, BoolQ, CB, COPA, RTE, WiC, WSC, and ReCoRD. However, it does not yet perform as well as GPT-4.

 Read more: Llama 2 vs GPT-4 vs Claude-2 

Commercial Use

While the research paper for the model has not yet been released, Falcon 180B can be used commercially, but only under very restrictive conditions that exclude any “hosting use”, making it less commercially friendly than previous Falcon models.

Open Source Gets a New Player

Surpassing Meta’s Llama 2, Falcon 180B is now the largest open-source language model.

Even though Meta has been championing the open-source ecosystem, its releases come with their own set of restrictions, such as a complicated licensing policy. And even Meta appears to be taking a more closed-door approach with its upcoming models, which are touted to be even bigger and better.

Meanwhile, no matter how much effort Meta puts in, the real controller of open source is OpenAI, as AIM reported earlier. With the release of Google’s Gemini drawing closer, it is high time OpenAI released GPT-5 to stay ahead in the race.

Read more: Meta Launches Open Source Models, OpenAI Controls Them

Shritama Saha
Shritama is a technology journalist who is keen to learn about AI and analytics. A graduate in mass communication, she is passionate about exploring the influence of data science on fashion, drug development, films, and art.
