
Trained on Chips, GenAI is Reshaping the Chip Industry

GenAI capabilities can be applied to every facet of semiconductor manufacturing, such as design, fabrication, testing, and packaging.



Bill Dally, NVIDIA’s chief scientist, recently introduced ‘ChipNeMo’, a custom LLM developed by the company’s engineers, at the International Conference on Computer-Aided Design, an event for electronic design automation (EDA).

ChipNeMo is trained on the company’s internal data to generate and optimise software used in semiconductor design. It was built by customising large language models with NVIDIA NeMo, a cloud-native framework for developers to create and deploy generative AI models with billions of parameters.

“This effort marks an important first step in applying LLMs to the complex work of designing semiconductors,” said Dally. “It shows how even highly specialised fields can use their internal data to train useful generative AI models.”

After evaluating possible use cases of ChipNeMo, the research team at NVIDIA decided to start with three: a chatbot, a code generator, and an analysis tool. Of these, the analysis tool, which automates the time-consuming task of keeping bug descriptions up to date, has garnered the most positive feedback.

The prototype chatbot responds to questions regarding GPU architecture and design, helping engineers quickly locate technical documents. The code generator, which currently produces snippets of 10-20 lines of software in specialised chip design languages, will be integrated with existing tools, providing a valuable assistant for ongoing designs.

The research paper by NVIDIA explains how the team gathered design data and employed it to craft a specialised generative AI model, a process that can be adapted to any industry. 

They started with a foundational model and customised it using NVIDIA NeMo, a framework included in the NVIDIA AI Enterprise software platform for building, customising, and deploying generative AI models. The chosen NeMo model boasts 43 billion parameters and was trained on over a trillion tokens.

The model was refined through two training rounds, with the first using approximately 24 billion tokens of internal design data and the second incorporating about 130,000 conversations and design examples.
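The paper does not publish its training code, but the two-round structure it describes, continued pretraining on domain text followed by fine-tuning on conversations and examples, can be sketched in miniature. The toy model, data, and sizes below are all invented for illustration; only the two-stage shape of the pipeline reflects the approach described above.

```python
# Toy sketch (not NVIDIA's actual pipeline): two-stage adaptation of a
# tiny language model -- continued pretraining on "internal design" text,
# then fine-tuning on "conversation/design" examples. All data is random.
import torch
import torch.nn as nn

torch.manual_seed(0)
VOCAB, DIM, SEQ = 100, 32, 16

class TinyLM(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.head = nn.Linear(DIM, VOCAB)
    def forward(self, x):                       # next-token logits
        return self.head(self.embed(x))

def train_next_token(model, batches, lr=1e-2):
    """One pass of next-token-prediction training; returns final loss."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    loss = None
    for seq in batches:
        logits = model(seq[:-1])                # predict token t+1 from token t
        loss = loss_fn(logits, seq[1:])
        opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

model = TinyLM()
# Round 1: "domain-adaptive pretraining" on internal design text (mocked).
domain_corpus = [torch.randint(0, VOCAB, (SEQ,)) for _ in range(50)]
round1_loss = train_next_token(model, domain_corpus)
# Round 2: fine-tuning on conversations and design examples (mocked).
sft_examples = [torch.randint(0, VOCAB, (SEQ,)) for _ in range(20)]
round2_loss = train_next_token(model, sft_examples)
print(round1_loss, round2_loss)
```

In a real setting each round would use a pretrained multi-billion-parameter model and curated corpora of the kind the article cites (roughly 24 billion tokens, then ~130,000 examples); the point here is only that both rounds reuse the same next-token objective on different data.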

This research represents just one of several instances where generative AI is making its mark in the semiconductor industry. Sharing their valuable experience, NVIDIA Research director and the paper’s lead author Mark Ren underscored the importance of customisation in LLMs.

Custom ChipNeMo models, with as few as 13 billion parameters, outperformed even much larger general-purpose LLMs in certain chip-design tasks. However, Ren emphasises the importance of careful data collection and cleaning, as well as staying updated with the latest tools to streamline the work.

The increasing complexity of chip design, driven by the relentless march toward smaller transistors, is also straining engineering resources: the industry faces a daunting 4x increase in workload while grappling with a talent gap of 10%-20%. According to Synopsys, the average number of transistors per chip has increased by a staggering 1,000 times since 2000.

Optimising Cost 

Rising costs have been another prominent issue in semiconductor manufacturing. The industry has witnessed a significant increase in the cost of designing and producing semiconductor chips. Factors such as growing design complexity, shrinking feature sizes, a surging number of masks, higher equipment costs, and stringent quality requirements have collectively contributed to this surge.

McKinsey reports that the average cost per transistor has soared by 50% since 2013.

Generative AI can be a game-changer in semiconductor manufacturing by applying its capabilities to every facet, from design and fabrication to testing and packaging. “I believe, over time large language models will help all the processes across the board,” said Ren.

It can optimise chip design through reinforcement learning, specifically in component placement, known as floorplanning. For instance, Google’s floorplanning algorithm uses deep reinforcement learning to achieve remarkable improvements in power consumption, wire length, and congestion, reducing product development life cycle time significantly.
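Google’s system uses deep reinforcement learning, which is far beyond a short snippet, but the objective it optimises can be shown with a much simpler method. The sketch below uses random-swap hill climbing (a deliberately simpler stand-in, not Google’s algorithm) to reduce total Manhattan wirelength of a made-up netlist on a small placement grid.

```python
# Illustrative stand-in for floorplanning optimisation: hill climbing on
# component placement to minimise total Manhattan wirelength. The grid,
# component list, and netlist are all invented for this example.
import random

random.seed(42)
GRID = 4                                        # 4x4 placement grid
components = list(range(8))
nets = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7), (0, 7)]
slots = [(r, c) for r in range(GRID) for c in range(GRID)]
placement = dict(zip(components, random.sample(slots, len(components))))

def wirelength(place):
    """Total Manhattan distance over all nets (a common placement cost)."""
    return sum(abs(place[a][0] - place[b][0]) +
               abs(place[a][1] - place[b][1]) for a, b in nets)

start = wirelength(placement)
best = start
for _ in range(2000):
    a, b = random.sample(components, 2)         # propose a random swap
    placement[a], placement[b] = placement[b], placement[a]
    cost = wirelength(placement)
    if cost <= best:
        best = cost                             # keep improving swaps
    else:
        placement[a], placement[b] = placement[b], placement[a]  # revert
print(start, best)
```

A reinforcement-learning placer replaces the random-swap proposal with a learned policy that places components sequentially, but the cost it is rewarded for reducing (wirelength, congestion, power) is of the same kind as the function above.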

In the competitive landscape of semiconductor manufacturing, achieving shorter time-to-market cycles is paramount. However, as chip design becomes more intricate, the time available for development and delivery significantly decreases. McKinsey indicates that the average time to market for chips has shrunk by 25% amid surging demand.

Companies like Synopsys, an American EDA company headquartered in California, offer full-stack AI-driven EDA suites that have improved efficiency and shortened development cycles for major semiconductor companies such as Samsung Electronics and STMicroelectronics, bringing down costs in turn.

Shankar Krishnamoorthy, GM of Synopsys’ EDA Group, illustrated how AI, particularly AI-driven EDA tools, can optimise the design of low-power chips. He cited figures from Synopsys.ai EDA solutions, demonstrating energy and power savings of up to 15%.

Synopsys is in direct competition with Cadence Design Systems, which is making big moves to add AI to its chip design software. However, experts believe that the former is pulling ahead of the latter.

Additional Opportunities in the Chip Manufacturing Pipeline

In addition to overcoming these challenges, generative AI presents a range of opportunities for the semiconductor manufacturing industry. 

Generative AI can also play a pivotal role in improving defect detection. The quality of chips depends on a multitude of factors, including material properties, process parameters, environmental conditions, and human error. However, as feature sizes shrink below 10 nanometers, defect rates increase exponentially due to factors such as quantum effects, variability, and noise.

McKinsey highlights that the average defect density has surged by 10 times since 2013. By utilising unsupervised learning, generative AI can detect defects in semiconductor chips without requiring labelled data or prior knowledge. This approach can improve defect detection accuracy by up to 30% over traditional methods.
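The core idea behind unsupervised defect detection, learning what “normal” looks like from unlabelled data and flagging whatever deviates, can be shown with a minimal sketch. Below, a low-rank PCA model (a deliberately simple stand-in for the richer models production systems use, such as autoencoders) is fitted on synthetic, mostly-normal sensor readings; samples far from the learned subspace are scored as likely defects. All data and dimensions are invented.

```python
# Minimal unsupervised anomaly-detection sketch: fit PCA on unlabelled
# "normal" readings, then score samples by reconstruction error. High
# error = far from the normal manifold = likely defect. Synthetic data.
import numpy as np

rng = np.random.default_rng(0)
D, K = 16, 3
# Normal readings lie near a 3-dimensional subspace; defects do not.
basis = rng.normal(size=(K, D))
normal = rng.normal(size=(500, K)) @ basis + 0.05 * rng.normal(size=(500, D))
defect = rng.normal(size=(5, D)) * 2.0

mean = normal.mean(axis=0)
_, _, vt = np.linalg.svd(normal - mean, full_matrices=False)
top = vt[:K]                                    # principal directions of normal data

def recon_error(x):
    """Per-sample distance from the learned 'normal' subspace."""
    centred = x - mean
    recon = centred @ top.T @ top               # project onto top-K components
    return np.linalg.norm(centred - recon, axis=1)

normal_err = recon_error(normal).mean()
defect_err = recon_error(defect).mean()
print(normal_err, defect_err)
```

No sample was ever labelled “defective”; the separation in reconstruction error falls out of the structure of the data alone, which is what makes the approach attractive when labelled defect data is scarce.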

GenAI can also lead to the discovery of new materials or the optimisation of existing ones for better performance and reduced costs. It can also drive the creation of new products or improvements in existing ones, significantly reducing product development cycles. Moreover, generative AI can broaden the market potential of semiconductor devices by creating custom ICs for specific tasks, offering superior performance, lower power consumption, and reduced costs compared to general-purpose ICs. 

In conclusion, while the semiconductor manufacturing industry faces substantial challenges, the integration of generative AI promises to be a transformative force. It offers innovative solutions that can optimise chip design, enhance defect detection, and unlock new horizons of efficiency, quality, and innovation. 


Shyam Nandan Upadhyay

Shyam is a tech journalist with expertise in policy and politics, and exhibits a fervent interest in scrutinising the convergence of AI and analytics in society. In his leisure time, he indulges in anime binges and mountain hikes.