
TII Unveils Falcon Mamba 7B, Outperforming Llama 3.1 8B and Other SLMs

The open-source state space language model Falcon Mamba 7B outperforms Meta’s Llama 3.1 8B, Llama 3 8B, and Mistral’s 7B, and claims the top spot on Hugging Face’s benchmark leaderboard.

The Technology Innovation Institute (TII), the applied research arm of Abu Dhabi’s Advanced Technology Research Council (ATRC), has launched Falcon Mamba 7B, a groundbreaking addition to its Falcon series of LLMs. An open-source State Space Language Model (SSLM), Falcon Mamba 7B has been independently verified by Hugging Face to outperform its similarly sized competitors.

Marking a significant departure from previous Falcon models, which relied on transformer-based architecture, the Falcon Mamba 7B introduces SSLM technology to the Falcon lineup. This model not only outperforms Meta’s Llama 3.1 8B, Llama 3 8B, and Mistral’s 7B in new benchmarks but also claims the top spot on Hugging Face’s tougher benchmark leaderboard.


SSLMs excel at processing complex, time-evolving information, making them well suited to book-length comprehension, estimation, forecasting, and control tasks. Falcon Mamba 7B demonstrates superior capabilities in natural language processing, machine translation, text summarisation, computer vision, and audio processing, with significantly lower memory requirements than traditional transformer models.
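
For readers who want to try the model themselves, the sketch below shows how an SSLM released on Hugging Face can be loaded and prompted with the transformers library. The repository ID tiiuae/falcon-mamba-7b, the prompt, and the generation settings are assumptions for illustration, not details confirmed in TII’s announcement.

```python
# Minimal sketch: loading and prompting Falcon Mamba 7B via Hugging Face transformers.
# The repo ID below is an assumption; check TII's Hugging Face page for the official one.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-mamba-7b"  # assumed repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Summarise the advantages of state space language models:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a short continuation; tune max_new_tokens to taste.
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the architecture is a state space model rather than a transformer, memory use during generation stays roughly constant with sequence length, which is part of the appeal TII highlights.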

H.E. Faisal Al Bannai, Secretary General of ATRC and Adviser to the UAE President for Strategic Research and Advanced Technology Affairs, said: “The Falcon Mamba 7B marks TII’s fourth consecutive top-ranked AI model, reinforcing Abu Dhabi as a global hub for AI research and development. This achievement highlights the UAE’s unwavering commitment to innovation.”


TII Continues Growth with SLMs

In a focussed shift towards building small language models, Hakim Hacid, executive director and acting chief researcher at the Technology Innovation Institute (TII), discussed the same with AIM in an exclusive interaction earlier this year.

“We were asking at some point the question, as to how big should we go? I think now the question is how small we could go by keeping a small model,” said Hacid, adding that they are exploring that path.

Further, he said that they are making models smaller because, again, “if we want the pillar of the deployment to succeed, we need to actually have models that can run in devices, and in infrastructure that is not highly demanding.”

With over 45 million downloads of Falcon LLMs to date, the Falcon Mamba 7B continues TII’s tradition of pioneering research and open-source contributions. The model will be released under the TII Falcon License 2.0, a permissive software licence based on Apache 2.0, emphasising the responsible use of AI.

TII continues to build on its open-source culture, even while believing that not everyone will be able to sustain such efforts. “You need a lot of funding to sustain open-source and we believe that not everyone will be able to do it,” said Hacid.


Vandana Nair