Facebook’s New AI Models Run 5x Faster On GPUs, Outperform EfficientNet Models

Researchers from Facebook AI recently introduced a new network design paradigm known as RegNet. RegNet – or Regular Networks – is a low-dimensional design space that consists of simple, regular networks. The researchers analyzed the RegNet design space and arrived at interesting findings that do not match the current practice of network design.

Facebook AI Research (FAIR) is at the forefront of deep learning research. The social media giant has been focused on building products across several domains, including open-sourcing AI tools, building perception systems, and developing facial recognition with DeepFace and text understanding with DeepText, among others.

Visual recognition architectures such as ResNet, LeNet, and AlexNet have gained much traction over the past few years. Such work advances both the effectiveness of neural networks and the understanding of network design, provided that the network instantiations and design principles can be generalized and applied across numerous settings.

Behind RegNet

To find simple models that are easy to understand, build upon, and generalize, the researchers presented a new network design paradigm that combines the advantages of manual design and Neural Architecture Search (NAS). NAS overcomes the limitations of manual network design and helps find a suitable model within a fixed search space of possible networks.
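In its simplest form, NAS can be read as a search over a fixed, parametrized space of candidate networks. The sketch below illustrates that idea with plain random search; the configuration fields and the `train_and_evaluate` callable are hypothetical placeholders, not the procedure used in the paper.

```python
import random

def sample_config():
    # Hypothetical fixed search space: a few candidate depths and widths per stage.
    return {
        "depths": [random.choice([1, 2, 4, 8]) for _ in range(4)],
        "widths": [random.choice([32, 64, 128, 256]) for _ in range(4)],
    }

def random_search(train_and_evaluate, n_trials=32):
    """Minimal random-search NAS sketch: sample configurations from the
    fixed space, evaluate each one, and keep the best."""
    best_cfg, best_err = None, float("inf")
    for _ in range(n_trials):
        cfg = sample_config()
        err = train_and_evaluate(cfg)  # placeholder for real training and evaluation
        if err < best_err:
            best_cfg, best_err = cfg, err
    return best_cfg, best_err

# Demo with a dummy objective standing in for real training.
print(random_search(lambda cfg: random.random(), n_trials=8))
```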

Unlike manual design, this work took advantage of semi-automated procedures and focused on designing design spaces, which parametrize populations of networks. The researchers refer to this process as design space design.

A design space is a large – possibly infinite – population of model architectures. According to the researchers, the main motivation behind this project is to advance the understanding of network design and to discover design principles that generalize across settings.

How RegNet Works

The core of the RegNet design space is simple: stage widths and depths are determined by a quantized linear function. The researchers designed the RegNet design space in a low-compute, low-epoch regime, using a single network block type on the ImageNet dataset.
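As a rough illustration of what a quantized linear function over widths means, the sketch below follows the parameterization described in the paper (initial width w_0, slope w_a, quantization factor w_m): block widths grow linearly and are then snapped to quantized values, and consecutive blocks sharing a width form a stage. The rounding details and example values are illustrative, not the released implementation.

```python
import numpy as np

def regnet_widths(depth, w_0, w_a, w_m, q=8):
    """Sketch of the RegNet quantized linear block-width rule."""
    # Linear rule: u_j = w_0 + w_a * j for each block j.
    u = w_0 + w_a * np.arange(depth)
    # Quantize each block width to w_0 times a power of w_m ...
    s = np.round(np.log(u / w_0) / np.log(w_m))
    widths = w_0 * np.power(w_m, s)
    # ... and round to the nearest multiple of q (e.g. 8) for hardware friendliness.
    widths = (np.round(widths / q) * q).astype(int)
    # Blocks with the same width form a stage; widths are non-decreasing,
    # so the sorted unique values and their counts give stage widths and depths.
    stage_widths, stage_depths = np.unique(widths, return_counts=True)
    return stage_widths.tolist(), stage_depths.tolist()

# Example with hypothetical parameter values.
print(regnet_widths(depth=13, w_0=24, w_a=36.0, w_m=2.5))
```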

In each step of the design process, the input is an initial design space and the output is a refined one; the aim of every step is to discover design principles that yield populations of simpler or better-performing models.

The primary tool the researchers used for analyzing design space quality is the error empirical distribution function (EDF). They started from a relatively unconstrained design space known as AnyNet, in which widths and depths vary freely across stages, and progressively refined it into RegNet.
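As a rough sketch of how an error EDF characterizes a design space: given the errors of models sampled from the space, the EDF at a threshold e is the fraction of models with error below e, so curves that rise earlier indicate a higher concentration of good models. The error values below are made-up placeholders, not results from the paper.

```python
import numpy as np

def error_edf(errors, thresholds):
    """Empirical distribution function of model errors: for each threshold e,
    the fraction of sampled models with error below e."""
    errors = np.asarray(errors)
    return np.array([(errors < e).mean() for e in thresholds])

# Hypothetical top-1 errors (%) of models sampled from two design spaces.
space_a = [34.1, 36.5, 33.0, 38.2, 35.7]
space_b = [31.2, 32.8, 30.5, 33.9, 31.7]
grid = np.linspace(28, 40, 7)
print(error_edf(space_a, grid))  # a curve further up and to the left indicates a better space
print(error_edf(space_b, grid))
```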

The researchers said, “We propose to design network design spaces, where a design space is a parametrized set of possible model architectures, and we characterize the quality of a design space by sampling models and inspecting their error distribution.”

Contributions In This Project

Here are some of the contributions highlighted by the researchers of this project:

  • According to the researchers, the RegNet design space has simpler models, is easier to interpret, and has a higher concentration of good models.
  • An important property of the design space design in this project is that it is more interpretable and can lead to interesting insights.
  • The researchers compared the top RegNet models to existing networks in various settings, showing that simple RegNet models achieve surprisingly good results.
  • RegNet models lead to considerable improvements over standard ResNe(X)t models across all metrics.

Wrapping Up

According to the researchers, designing network design spaces is a promising avenue for future research. Under comparable training settings and FLOPs, the RegNet models outperform the popular EfficientNet models while being up to 5x faster on GPUs.

Read the paper here.

Ambika Choudhury
A Technical Journalist who loves writing about Machine Learning and Artificial Intelligence. A lover of music, writing and learning something out of the box.
