
Jeffrey Dean From Google On How Deep Learning Is Transforming Computer Architecture

Within a few years, deep learning has seen a number of algorithmic improvements and computational advances from researchers and academics. These advances have appeared not only in fundamental areas such as speech recognition, computer vision and language understanding, but also in fields such as quantum chemistry, flood forecasting, protein folding and genomics.

We are currently at a stage where computational performance allows many real-world problems to be solved in real time, which is increasing the potential of machine learning models across the computing industry. According to one analysis, three main factors drive the advancement of artificial intelligence: algorithmic innovation, data, and the amount of compute available for training.



The number of machine learning researchers has been growing rapidly over the past decade. One striking indicator is that more than 100 research papers per day are now posted to arXiv in machine-learning-related subtopic areas. Recently, Jeffrey Dean, a researcher at Google, discussed the deep learning revolution and its implications for computer architecture and chip design.

According to Dean, deep learning models have three main properties:

  • They are very tolerant of reduced-precision computations.
  • The computations performed by most models are simply different compositions of a relatively small handful of operations: matrix multiplications, vector operations, application of convolutional kernels and other dense linear algebra calculations. 
  • There is therefore an opportunity to build computational hardware that is specialised for dense, low-precision linear algebra, yet programmable at the level of specifying programs as different compositions of mostly linear-algebra-style operations. 
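The first property above, tolerance of reduced precision, is easy to check directly. The sketch below (with random matrices as stand-in data) multiplies the same matrices in float32 and in float16 and compares the results; the relative error stays small, which is why specialised hardware can afford low-precision arithmetic.

```python
import numpy as np

# Multiply two matrices at full (float32) and reduced (float16)
# precision, then compare. The small relative error illustrates why
# deep learning workloads tolerate low-precision hardware.
rng = np.random.default_rng(0)
a = rng.standard_normal((64, 64)).astype(np.float32)
b = rng.standard_normal((64, 64)).astype(np.float32)

full = a @ b                                                      # float32 reference
low = (a.astype(np.float16) @ b.astype(np.float16)).astype(np.float32)

rel_err = np.abs(full - low).max() / np.abs(full).max()
print(f"max relative error at float16: {rel_err:.4f}")
```

Training is even more forgiving than this inference-style example suggests, since gradient noise tends to dominate rounding error.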

The machine learning research field is moving at a very fast pace, and researchers have been trying every possible way to keep up. To assess this revolution, computer architects, higher-level software system builders and machine learning researchers discussed questions such as which interesting research trends are starting to appear and what they imply for machine learning hardware. In this discussion, they identified several areas in which the growth of deep learning research is accelerating.

Deep Learning For Chip Design

Researchers have found significant potential in using machine learning to automatically generate high-quality solutions to a number of NP-hard optimisation problems that appear in the workflow for designing custom ASICs. An automated ML-based system also enables rapid design-space exploration, since the reward function can be easily adjusted to optimise for different trade-offs between target metrics.
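The "easily adjusted reward" idea can be sketched as follows. The metric names and weights are illustrative assumptions, not the actual formulation used at Google; the point is only that re-weighting the reward steers the same automated search toward different trade-offs without any other change.

```python
# Hypothetical reward for ML-driven chip placement: a negative
# weighted sum of cost metrics, so a higher reward means a better
# placement. All names and numbers are illustrative.
def placement_reward(metrics, weights):
    return -sum(weights[k] * metrics[k] for k in weights)

candidate = {"wirelength": 1.20, "congestion": 0.35, "timing_violation": 0.10}

# Optimise mostly for wirelength...
r_wire = placement_reward(
    candidate, {"wirelength": 1.0, "congestion": 0.1, "timing_violation": 0.1})
# ...or re-weight to prioritise timing instead, with no other change.
r_timing = placement_reward(
    candidate, {"wirelength": 0.1, "congestion": 0.1, "timing_violation": 1.0})
print(r_wire, r_timing)
```

The same candidate placement scores very differently under the two weightings, which is exactly what makes rapid design-space exploration possible.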

Deep Learning for Semiconductor Manufacturing Problems

Though computer vision has attained dramatic improvements over the last few years, certain problems in the visual inspection of wafers during the semiconductor manufacturing process still need better accuracy than existing approaches provide.
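As a point of reference, a common classical baseline for wafer inspection is to compare each image against a "golden" reference and flag large deviations; a learned vision model is what would replace this fixed rule. The sketch below uses entirely synthetic data and an arbitrary threshold.

```python
import numpy as np

# Classical baseline for wafer visual inspection: flag pixels that
# deviate from a defect-free "golden" reference image beyond a fixed
# threshold. Synthetic data; a learned model would replace this rule.
rng = np.random.default_rng(1)
golden = rng.uniform(0.4, 0.6, size=(32, 32))         # reference die image
wafer = golden + rng.normal(0, 0.01, size=(32, 32))   # normal process noise
wafer[10:13, 20:23] += 0.5                            # injected 3x3 defect blob

defect_mask = np.abs(wafer - golden) > 0.1            # 10-sigma threshold
print(f"defective pixels flagged: {defect_mask.sum()}")  # flags the 9 blob pixels
```

The weakness of this baseline is the hand-picked threshold: subtle, structured defects sit below it while harmless global shifts exceed it, which is where learned approaches promise better accuracy.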

Deep Learning for Learned Heuristics in Computer Systems

The third accelerator is the use of learned heuristics in computer systems such as compilers, operating systems, file systems and networking stacks. Replacing hand-tuned heuristics with learned ones allows these systems to adapt more readily to their actual usage patterns, which in turn helps accelerate the growth of deep learning research.
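To make the idea concrete, here is a minimal sketch of a learned heuristic: instead of a hard-coded readahead rule, a tiny first-order model counts which block tends to follow which in the system's observed access stream and prefetches accordingly. The class name and the workload are invented for illustration.

```python
from collections import Counter, defaultdict

# A learned replacement for a fixed prefetch heuristic: count block
# successions observed at runtime and predict the most frequent
# successor. Illustrative sketch, not a real OS component.
class LearnedPrefetcher:
    def __init__(self):
        self.follows = defaultdict(Counter)  # block -> Counter of successors
        self.prev = None

    def access(self, block):
        if self.prev is not None:
            self.follows[self.prev][block] += 1  # learn from actual usage
        self.prev = block

    def predict_next(self, block):
        succ = self.follows[block]
        return succ.most_common(1)[0][0] if succ else None

pf = LearnedPrefetcher()
for block in [1, 2, 3, 1, 2, 3, 1, 2, 4, 1, 2, 3]:
    pf.access(block)

print(pf.predict_next(2))  # block 3 follows block 2 most often here
```

Unlike a fixed sequential-readahead rule, this heuristic adapts automatically if the workload's access pattern changes, which is the property the discussion highlights.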

Wrapping Up

Dean concluded the discussion by mentioning a few interesting threads of research occurring in the ML research community at the moment:

  • Work on Sparsely-Activated Models: Sparsely-activated models, such as the sparsely-gated mixture-of-experts model, show how to build very large-capacity models in which the routing functions learn which experts to use, so that individual experts specialise on particular kinds of examples. 
  • Work on Automated Machine Learning (AutoML): Techniques such as neural architecture search (NAS) and evolutionary architecture search run many automated experiments and can automatically learn effective structures and other aspects of machine learning models. 
  • Multi-Task Training At Modest Scales: Multi-task training at modest scales, or transfer learning from a model trained on a large amount of data, has proved very effective on a number of complex problems. 
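The sparsely-gated routing in the first bullet can be sketched in a few lines. A learned gate scores every expert but only the top-k actually run for a given input, so model capacity grows with the number of experts while per-example compute stays roughly constant. The weights below are random stand-ins, not a trained model.

```python
import numpy as np

# Minimal sparsely-gated mixture-of-experts routing: score all
# experts with a gating layer, keep only the top-k, and mix their
# outputs with softmax weights over just those k. Random weights.
rng = np.random.default_rng(0)
d, n_experts, k = 8, 16, 2
experts = [rng.standard_normal((d, d)) for _ in range(n_experts)]
gate_w = rng.standard_normal((d, n_experts))

def moe_forward(x):
    logits = x @ gate_w
    top = np.argsort(logits)[-k:]        # indices of the k highest-scoring experts
    weights = np.exp(logits[top] - logits[top].max())
    weights /= weights.sum()             # softmax over the chosen k only
    y = sum(w * (x @ experts[i]) for w, i in zip(weights, top))
    return y, top

x = rng.standard_normal(d)
y, used = moe_forward(x)
print(f"ran {len(used)} of {n_experts} experts")
```

Only 2 of the 16 expert matrices are ever multiplied for this input; in the sparsely-gated formulation the gate itself is trained, so the routing specialisation described above is learned rather than fixed.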
