
Top Works In Neural Architecture Search

Most neural network architectures in use today have been designed manually by human experts, which is a time-consuming and error-prone process. This is where Neural Architecture Search (NAS), a subset of AutoML, comes to the rescue: NAS automates the process of architecture engineering.

Source: automl.org

Here, we list the top research works in Neural Architecture Search, ranked by their popularity on GitHub. These works have set new baselines, produced new network families and more.


Efficient Neural Architecture Search (ENAS)

This work proposes Efficient Neural Architecture Search (ENAS), a fast and inexpensive approach to automatic model design. A controller (RNN), trained with policy gradient, learns to discover neural network architectures by searching for an optimal subgraph within a single large computational graph. The authors state that ENAS delivers strong empirical performance using far fewer GPU-hours than existing automatic model design approaches and, notably, is 1000x less expensive than standard Neural Architecture Search.
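The controller-plus-policy-gradient loop can be sketched in a few lines of stdlib Python. Everything below (the op list, the toy reward standing in for validation accuracy, the learning rate) is illustrative, not from the paper:

```python
import math
import random

random.seed(0)

OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]
NUM_LAYERS = 4
LR = 0.5

# Controller stand-in: one logit vector per layer; sampling a subgraph
# amounts to picking one op per layer from the shared computational graph.
logits = [[0.0] * len(OPS) for _ in range(NUM_LAYERS)]

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def sample_architecture():
    arch, log_prob_grads = [], []
    for layer_logits in logits:
        probs = softmax(layer_logits)
        idx = random.choices(range(len(OPS)), weights=probs)[0]
        arch.append(idx)
        # d log p(idx) / d logit_j = 1[j == idx] - p_j  (REINFORCE gradient)
        log_prob_grads.append([(1.0 if j == idx else 0.0) - p
                               for j, p in enumerate(probs)])
    return arch, log_prob_grads

def toy_reward(arch):
    # Stand-in for validation accuracy of the sampled child model:
    # pretend conv3x3 everywhere is the best architecture.
    return sum(1.0 for idx in arch if OPS[idx] == "conv3x3") / NUM_LAYERS

baseline = 0.0
for step in range(300):
    arch, grads = sample_architecture()
    reward = toy_reward(arch)
    baseline = 0.9 * baseline + 0.1 * reward       # moving-average baseline
    advantage = reward - baseline
    for layer, g in zip(logits, grads):
        for j in range(len(OPS)):
            layer[j] += LR * advantage * g[j]      # ascend expected reward

best = [OPS[max(range(len(OPS)), key=lambda j: l[j])] for l in logits]
print(best)
```

The real method's efficiency comes from weight sharing across all sampled subgraphs, which this sketch omits.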

Link to the paper.

The Evolved Transformer

In this work, the researchers from Google Brain attempt to apply NAS to search for a better alternative to the Transformer. They first constructed a large search space and then ran an evolutionary architecture search by seeding the initial population with the Transformer. The results showed that the architecture — the Evolved Transformer — demonstrated consistent improvement over the Transformer on four well-established language tasks.
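The search strategy can be sketched as a toy tournament-selection evolution loop; the genome encoding, fitness function and all constants below are illustrative stand-ins, not the paper's actual search space:

```python
import random

random.seed(0)

GENOME_LEN = 8
VOCAB = list(range(4))            # toy "op" choices per position

def fitness(genome):
    # Stand-in for task performance: reward genomes close to a fixed target.
    target = [1] * GENOME_LEN
    return sum(1 for g, t in zip(genome, target) if g == t)

def mutate(genome):
    child = list(genome)
    pos = random.randrange(GENOME_LEN)
    child[pos] = random.choice(VOCAB)
    return child

# Seed the initial population with copies of a strong known architecture
# (the paper seeds the population with the Transformer itself).
seed_genome = [0] * GENOME_LEN
population = [list(seed_genome) for _ in range(20)]

for step in range(500):
    # Tournament selection: mutate the fittest of a random sample.
    sample = random.sample(population, 5)
    parent = max(sample, key=fitness)
    population.append(mutate(parent))
    population.pop(0)             # remove the oldest (regularized evolution)

best = max(population, key=fitness)
print(best, fitness(best))
```

Seeding with a known-good architecture means the search starts from a strong baseline instead of random genomes, which is exactly the trick the paper uses.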

Link to the paper.

MobileDets

In this work, the authors report substantial improvements in the latency-accuracy trade-off, achieved by incorporating regular convolutions in the search space and placing them effectively in the network via neural architecture search. This work resulted in MobileDets, a family of object detection models that achieve state-of-the-art results across mobile accelerators.

Link to the paper

Progressive Neural Architecture Search

In this work, the authors propose a new method for learning the structure of convolutional neural networks (CNNs) that is more efficient than recent state-of-the-art methods based on reinforcement learning and evolutionary algorithms. The approach uses a sequential model-based optimisation (SMBO) strategy: a cheap surrogate model predicts the performance of candidate architectures, so only the most promising ones are actually trained. Results show that this method is up to 5 times more efficient than the popular RL methods in terms of the number of models evaluated, and 8 times faster in terms of total compute.
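A minimal sketch of the SMBO idea, with a hypothetical scoring function and a deliberately crude surrogate (the real method trains a learned predictor on cell encodings; everything here is illustrative):

```python
import random

random.seed(0)

OPS = ["A", "B", "C"]
MAX_LEN, TOP_K = 4, 5

def true_score(arch):
    # Expensive evaluation stand-in (e.g. training plus validation accuracy).
    return arch.count("A") + 0.5 * arch.count("B") + random.gauss(0, 0.01)

history = {}     # arch (tuple of ops) -> observed score

def surrogate(arch):
    # Cheap predictor stand-in: average observed score of already-evaluated
    # architectures sharing the same parent prefix; neutral default otherwise.
    prefix_scores = [s for a, s in history.items()
                     if a[: len(arch) - 1] == arch[:-1]]
    return sum(prefix_scores) / len(prefix_scores) if prefix_scores else 0.0

# Progressive search: start from all length-1 cells, expand, keep the
# top-K as ranked by the surrogate, and only train those.
frontier = [(op,) for op in OPS]
for length in range(1, MAX_LEN + 1):
    for arch in frontier:                       # expensive step, kept small
        history[arch] = true_score(arch)
    if length == MAX_LEN:
        break
    expanded = [a + (op,) for a in frontier for op in OPS]
    expanded.sort(key=surrogate, reverse=True)  # rank cheaply, no training
    frontier = expanded[:TOP_K]

best = max(history, key=history.get)
print(best, round(history[best], 3))
```

The savings come from the sort step: the combinatorially large set of expansions is ranked by the surrogate, and only TOP_K of them ever reach the expensive evaluation.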

Link to the paper.

DARTS

This paper addresses the scalability challenge of architecture search by formulating the task in a differentiable manner. Instead of applying evolution or reinforcement learning over a discrete and non-differentiable search space, this method is based on a continuous relaxation of the architecture representation, allowing efficient search of the architecture using gradient descent. Experimental results show that the algorithm excels in discovering high-performance convolutional architectures for image classification and recurrent architectures for language modelling.
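The continuous relaxation at the heart of DARTS can be illustrated with a toy mixed operation; the 1-D "ops" below are stand-ins for real convolutions and pooling:

```python
import math

# Candidate operations on one edge of the cell (toy 1-D stand-ins).
OPS = {
    "identity": lambda x: x,
    "double":   lambda x: 2.0 * x,
    "zero":     lambda x: 0.0,
}

def softmax(xs):
    m = max(xs)
    exps = [math.exp(v - m) for v in xs]
    s = sum(exps)
    return [e / s for e in exps]

def mixed_op(x, alphas):
    """Continuous relaxation: instead of choosing ONE op (discrete), output
    the softmax(alpha)-weighted sum of ALL ops. The result is differentiable
    with respect to the architecture parameters alpha, so they can be
    optimized by gradient descent alongside the network weights."""
    weights = softmax(alphas)
    return sum(w * op(x) for w, op in zip(weights, OPS.values()))

# Architecture parameters for this edge; DARTS learns them by gradient
# descent, here we just pick values to show relaxation and discretization.
alphas = [0.1, 2.0, -1.0]
print(mixed_op(1.0, alphas))      # a smooth blend of all three ops

# After search, discretize: keep only the op with the largest alpha.
best_op = max(zip(alphas, OPS), key=lambda t: t[0])[1]
print(best_op)
```

The blend-then-discretize pattern is what lets the search run as ordinary gradient-based training rather than thousands of separate model evaluations.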

Link to the paper.

MorphNet

With MorphNet, the researchers aim to automate the design of neural network structures. MorphNet iteratively shrinks and expands a network. In contrast to previous approaches, MorphNet is scalable to large networks, adaptable to specific resource constraints and capable of increasing the network’s performance. When applied to standard network architectures on a wide variety of datasets, this approach discovers novel structures in each domain, obtaining higher performance.
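The shrink-and-expand loop can be sketched with a toy FLOPs model; the per-layer survival rates below stand in for what MorphNet's resource-weighted sparsifying regularizer would learn during training:

```python
def flops(widths):
    # Toy cost model: a chain network's cost as the sum of products of
    # consecutive layer widths.
    return sum(a * b for a, b in zip(widths, widths[1:]))

def shrink(widths):
    # Shrink stand-in: each layer loses a different fraction of channels,
    # as if a sparsifying regularizer had pruned the least useful ones.
    survival = [0.9, 0.5, 0.6, 0.8]   # illustrative per-layer survival rates
    return [max(1, int(w * f)) for w, f in zip(widths, survival)]

def expand(widths, budget):
    # Expand: uniformly scale all widths back up until the resource
    # budget (here FLOPs) is filled again.
    scale = 1.0
    while flops([int(w * (scale + 0.05)) for w in widths]) <= budget:
        scale += 0.05
    return [int(w * scale) for w in widths]

widths = [32, 64, 64, 32]
budget = flops(widths)
for _ in range(3):                    # alternate shrink and expand
    widths = expand(shrink(widths), budget)
print(widths, flops(widths))
```

Because shrinking is non-uniform while expansion is uniform, the relative widths drift toward the layers the regularizer considered important, at a roughly constant resource cost.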

Link to the paper.

Neural Architecture Search Without Labels

This paper explores a pressing question: can we do away with human annotations and find high-quality neural architectures using only images? To answer this, the researchers first defined a new setup called Unsupervised Neural Architecture Search (UnNAS). They then trained a large number of diverse architectures with either supervised or unsupervised objectives and found that the architecture rankings produced with and without labels are highly correlated. The results suggest that labels are not necessary, and that image statistics alone may be sufficient to identify good neural architectures.
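The paper's central measurement is a rank correlation between two orderings of the same architectures. A stdlib sketch of Spearman's rho on hypothetical accuracy numbers (the values below are made up for illustration):

```python
def rankdata(xs):
    # Rank positions (1 = smallest); no tie handling needed for this toy data.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0] * len(xs)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return ranks

def spearman(xs, ys):
    # Spearman's rho = Pearson correlation computed on the ranks.
    rx, ry = rankdata(xs), rankdata(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical accuracies of five architectures under the two objectives:
supervised   = [72.1, 74.8, 70.3, 76.0, 73.5]
unsupervised = [65.0, 67.9, 63.1, 68.8, 66.2]
print(round(spearman(supervised, unsupervised), 3))
```

A rho near 1 means the unsupervised objective orders architectures the same way the supervised one does, which is exactly the paper's finding.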

Link to the paper.

SpineNet

This paper states that standard convolutional neural networks do not perform well on tasks requiring simultaneous recognition and localisation. Encoder-decoder architectures try to resolve this by applying a decoder network to a backbone model designed for classification tasks. The authors argue that this encoder-decoder design is ineffective, and instead propose SpineNet, a backbone with scale-permuted intermediate features and cross-scale connections that is learned on an object detection task by Neural Architecture Search. Using similar building blocks, SpineNet models outperform ResNet-FPN models.

Link to the paper.

Randomly Wired Neural Networks

The researchers at Facebook explored a more diverse set of connectivity patterns with Randomly Wired Neural Networks for Image Recognition. Randomly wired networks are founded on random graph models in graph theory.

According to the authors, randomly wired neural networks outperformed human-designed networks such as ResNet and ShuffleNet under a typical computation budget, and remained competitive as the computation budget increased.
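A minimal sketch of random wiring, simplified to an Erdős–Rényi-style rule made acyclic by only allowing edges from lower- to higher-indexed nodes (the paper also uses Barabási–Albert and Watts–Strogatz graph generators; node count and edge probability here are illustrative):

```python
import random

random.seed(0)

def random_dag(num_nodes, edge_prob):
    """Random wiring turned into a DAG: each possible edge from a
    lower-indexed to a higher-indexed node is kept with probability
    edge_prob, so the network always has a valid feed-forward
    evaluation order."""
    edges = [(i, j)
             for i in range(num_nodes)
             for j in range(i + 1, num_nodes)
             if random.random() < edge_prob]
    # Ensure every non-input node is reachable: give each one at least
    # one incoming edge from some earlier node.
    for j in range(1, num_nodes):
        if not any(dst == j for _, dst in edges):
            edges.append((random.randrange(j), j))
    return edges

edges = random_dag(num_nodes=8, edge_prob=0.3)
print(edges)
```

In the paper, each node of such a graph becomes a small conv unit and the wiring itself is the "architecture"; the sketch only produces the wiring.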

Read more about this here.

To keep yourself up to date with the latest developments in NAS, check this.


Ram Sagar
I have a master's degree in Robotics and I write about machine learning advancements.
