Are Labels Necessary For Neural Architecture Search? Probing Unsupervised Regimes Of AutoML

The performance of deep learning applications has improved alongside the rise of automation techniques such as Neural Architecture Search (NAS). Today, hierarchical feature extractors are learned end-to-end from data rather than designed by hand. NAS, the process of automating architecture engineering, is a subfield of AutoML and overlaps significantly with hyperparameter optimization and meta-learning. Researchers exploit the transferability of architectures so that a search performed on one dataset can be reused on different data and labels. However, one thing the automated search and the traditional methods have in common is that both need images and their (semantic) labels to search for an architecture. In other words, NAS flourishes in the …
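To make the idea of "automating architecture engineering" concrete, here is a minimal sketch of the simplest NAS strategy, random search: sample candidate architectures from a search space, score each one, and keep the best. The search space, the `proxy_score` heuristic, and all names here are hypothetical stand-ins; a real NAS loop would train each candidate on labelled (or, as the article asks, unlabelled) data and measure validation performance.

```python
import random

# Hypothetical toy search space: each architecture is a choice of
# depth, width, and a per-layer operation. Real NAS spaces (e.g.
# cell-based spaces) are vastly larger; this is only illustrative.
SEARCH_SPACE = {
    "depth": [2, 4, 6],
    "width": [16, 32, 64],
    "op": ["conv3x3", "conv5x5", "sep_conv"],
}

def sample_architecture(rng):
    """Draw one candidate architecture uniformly from the space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def proxy_score(arch):
    """Stand-in for validation accuracy after short training.
    In practice this is the expensive step that normally requires
    labelled data."""
    op_bonus = {"conv3x3": 0.00, "conv5x5": 0.01, "sep_conv": 0.02}
    return (0.5
            + 0.02 * arch["depth"]
            + 0.001 * arch["width"]
            + op_bonus[arch["op"]])

def random_search(n_trials=20, seed=0):
    """Random-search NAS: sample, score, keep the best candidate."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = proxy_score(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
print(best, round(score, 3))
```

More sophisticated strategies (reinforcement learning, evolution, differentiable search) replace the uniform sampler with a learned one, but the outer loop — propose, evaluate, select — is the same, and the evaluation step is where labels traditionally enter.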

Ram Sagar
I have a master's degree in Robotics and I write about machine learning advancements.