
Open Source AI Has a New Champion

Falcon uses a modified Apache licence, meaning the models can be fine-tuned and used for commercial purposes.

When Meta’s LLaMA weights were leaked online, researchers across the globe gained access to the first GPT-level large language model (LLM). What followed was a host of LLMs that gave open-source AI a whole new dimension. LLaMA set the stage for models like Stanford’s Alpaca and Vicuna-13B, making it the open-source champion.

However, it now appears we have a new contender, called Falcon. Developed by the Technology Innovation Institute (TII) in Abu Dhabi, United Arab Emirates, Falcon offers better performance than LLaMA. It comes in three variants: 1B, 7B, and 40B.
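For readers who want to try the models themselves, TII published the checkpoints on the Hugging Face Hub. The sketch below is a minimal loader assuming the `transformers` library and TII's published model IDs (`tiiuae/falcon-7b` and so on); it only defines the loading function, since actually calling it downloads several gigabytes of weights.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# TII's published checkpoints on the Hugging Face Hub include:
#   tiiuae/falcon-7b, tiiuae/falcon-40b, tiiuae/falcon-40b-instruct
MODEL_ID = "tiiuae/falcon-7b"

def load_falcon(model_id: str = MODEL_ID):
    """Fetch a Falcon checkpoint and its tokenizer (a multi-GB download)."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        trust_remote_code=True,  # Falcon shipped custom modelling code at launch
    )
    return tokenizer, model
```

Once loaded, the model can be prompted through the usual `generate` API; the instruct variant is the one tuned for chat-style prompts.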


According to the institute, FalconLM is the most powerful open-source language model to date. Its largest variant, Falcon 40B, has 40 billion parameters, making it smaller than LLaMA, which has 65 billion. Faisal Al Bannai, secretary general of the Advanced Technology Research Council (ATRC), believes the release of Falcon will disrupt LLM access and enable researchers and entrepreneurs to come up with the most innovative use cases.

Top of the charts

Two variants of FalconLM, Falcon 40B Instruct and Falcon 40B, top the Hugging Face OpenLLM Leaderboard, with Meta’s LLaMA in third place. Hugging Face evaluates the models against four popular benchmarks: the AI2 Reasoning Challenge, HellaSwag, MMLU, and TruthfulQA.


Although the paper for the LLM is not out yet, we know that Falcon 40B underwent extensive training on a massive dataset of 1 trillion tokens drawn from TII’s RefinedWeb dataset, carefully filtered for quality and relevance. Special emphasis was placed on achieving high-quality data at scale, the researchers revealed. LLMs are known to be sensitive to the quality of their training data, which is why considerable effort went into building a data pipeline capable of efficient processing across tens of thousands of CPU cores. The pipeline was designed to extract top-notch content from the web using extensive filtering and de-duplication techniques.

TII has released RefinedWeb, a meticulously filtered and de-duplicated dataset that has proven highly effective. Models trained solely on this dataset can match or even surpass models trained on carefully curated corpora, showcasing its exceptional quality and impact.

Falcon models also have multilingual capabilities. They understand English, German, Spanish, and French, with limited capabilities in several other European languages, including Dutch, Italian, Romanian, Portuguese, Czech, Polish, and Swedish. Additionally, Falcon-40B stands as only the second genuinely open-source model of its kind, following the release of H2O.ai’s model. However, comparing the two is difficult, since H2O.ai did not benchmark its model against the others on this leaderboard.

Commercial use

Even though the code for LLaMA is available on GitHub, its weights were never made open source, which restricts commercial use of the model. Furthermore, all its derivative variants inherit the original LLaMA licence, rendering them unsuitable even for small-scale commercial applications.

However, Falcon uses a modified Apache licence, meaning the models can be fine-tuned and used for commercial purposes. Falcon stands out as the first open-source large language model that extends beyond research limitations. Originally, the licence stipulated a default royalty payment of 10% on attributable revenue exceeding one million dollars. However, it was subsequently announced that Falcon has transitioned to an Apache 2.0 licence, eliminating the need for royalty obligations.

Falcon achieves significant performance gains over GPT-3 while using only 75% of its training compute budget and just one-fifth of the compute at inference time. Its training compute also amounts to roughly 40% of Chinchilla’s and 80% of PaLM-62B’s, an efficient use of computational resources.
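As a rough illustration of what those fractions imply in absolute terms, one can plug in the training compute reported in the GPT-3 paper (about 3.14e23 FLOPs). The figures below are back-of-the-envelope estimates derived from the percentages above, not official TII numbers.

```python
# GPT-3's training compute, as reported in the GPT-3 paper (Brown et al., 2020).
GPT3_TRAIN_FLOPS = 3.14e23

# Falcon reportedly used 75% of GPT-3's training compute budget.
falcon_train_flops = 0.75 * GPT3_TRAIN_FLOPS

# Implied budgets of the other models, from Falcon's reported fractions:
chinchilla_flops = falcon_train_flops / 0.40  # Falcon = 40% of Chinchilla
palm_62b_flops = falcon_train_flops / 0.80    # Falcon = 80% of PaLM-62B

print(f"Falcon-40B ~ {falcon_train_flops:.2e} FLOPs")
print(f"Chinchilla ~ {chinchilla_flops:.2e} FLOPs")
print(f"PaLM-62B   ~ {palm_62b_flops:.2e} FLOPs")
```

On these assumptions, Falcon-40B's budget works out to roughly 2.4e23 FLOPs, comfortably below each of the three comparison models.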

Open source vs closed

While GPT-4 is the most advanced LLM to date, it is closed source, and OpenAI has not revealed any details about the model’s architecture, size, hardware, training compute, dataset construction, or training method. This opacity prevents researchers and developers from understanding the technical details and inner workings of the model.

Open-source AI promotes collaboration, transparency, and innovation in the field. Open models allow for knowledge sharing, which can lead to faster progress, as we have seen since the launch of LLaMA. LLaMA’s availability has allowed researchers and developers to access powerful language models without investing in proprietary solutions or expensive cloud resources, providing an alternative to closed-source models, which some experts have criticised for their lack of transparency and potential bias.

PS: The story was written using a keyboard.

Pritam Bordoloi

I have a keen interest in creative writing and artificial intelligence. As a journalist, I deep dive into the world of technology and analyse how it’s restructuring business models and reshaping society.