Trained on Chips, GenAI is Reshaping the Chip Industry

GenAI capabilities can be applied to every facet of semiconductor manufacturing, from design and fabrication to testing and packaging.

Bill Dally, NVIDIA’s chief scientist, recently introduced ‘ChipNeMo’, a custom LLM developed by the company’s engineers, at the International Conference on Computer-Aided Design, an event for electronic design automation (EDA).

ChipNeMo is trained on the company’s internal data to help generate and optimise software used in semiconductor design. It was built by customising large language models with NVIDIA NeMo, a cloud-native framework for developers to create and deploy generative AI models with billions of parameters.

“This effort marks an important first step in applying LLMs to the complex work of designing semiconductors,” said Dally. “It shows how even highly specialised fields can use their internal data to train useful generative AI models.”

After evaluating possible use cases of ChipNeMo, the research team at NVIDIA decided to start with three: a chatbot, a code generator, and an analysis tool. Of these, the analysis tool — automating the time-consuming task of keeping bug descriptions up to date — has garnered the most positive feedback.

The prototype chatbot responds to questions regarding GPU architecture and design, helping engineers quickly locate technical documents. The code generator, which currently produces snippets of about 10-20 lines of code in specialised chip-design languages, will be integrated with existing tools, providing a valuable assistant for ongoing designs.

The research paper by NVIDIA explains how the team gathered design data and employed it to craft a specialised generative AI model, a process that can be adapted to any industry. 

They started with a foundational model and customised it using NVIDIA NeMo, a framework included in the NVIDIA AI Enterprise software platform for building, customising, and deploying generative AI models. The chosen NeMo model boasts 43 billion parameters and was trained on over a trillion tokens.

The model was refined through two training rounds, with the first using approximately 24 billion tokens of internal design data and the second incorporating about 130,000 conversations and design examples.

This research represents just one of several instances where generative AI is making its mark in the semiconductor industry. Sharing their valuable experience, NVIDIA Research director and the paper’s lead author Mark Ren underscored the importance of customisation in LLMs.

Custom ChipNeMo models, with as few as 13 billion parameters, outperformed even much larger general-purpose LLMs in certain chip-design tasks. However, Ren emphasises the importance of careful data collection and cleaning, as well as staying updated with the latest tools to streamline the work.

The increasing complexity of chip design, driven by the relentless march toward smaller transistors, is also straining engineering resources: the industry faces a daunting 4x increase in workload while grappling with a talent gap of 10%-20%. According to Synopsys, the average number of transistors per chip has increased by a staggering 1,000 times since 2000.
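To put that 1,000x figure in perspective, a quick back-of-the-envelope check shows what it implies as an annual growth rate (the 23-year span from 2000 is an assumption for illustration):

```python
# Compound annual growth rate (CAGR) implied by a 1,000x increase in
# transistors per chip. The 1,000x figure is from Synopsys (per the
# article); the 23-year window is an illustrative assumption.

def cagr(growth_factor: float, years: int) -> float:
    """Annual growth rate implied by a total growth factor over `years`."""
    return growth_factor ** (1 / years) - 1

rate = cagr(1000, 23)
print(f"Implied annual growth: {rate:.1%}")  # roughly 35% per year
```

A steady ~35% yearly increase in transistor count, compounded for two decades, is the workload curve design teams are being asked to keep up with.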

Optimising Cost 

Rising costs have been another prominent issue in semiconductor manufacturing. The industry has witnessed a significant increase in the cost of designing and producing semiconductor chips. Factors such as growing design complexity, shrinking feature sizes, a surging number of masks, higher equipment costs, and stringent quality requirements have collectively contributed to this surge.

McKinsey reports that the average cost per transistor has soared by 50% since 2013.

Generative AI can be a game-changer in semiconductor manufacturing by applying its capabilities across every facet of the process, from design and fabrication to testing and packaging. “I believe, over time large language models will help all the processes across the board,” said Ren.

Generative AI can also optimise chip design through reinforcement learning, particularly for component placement, a task known as floorplanning. For instance, Google’s floorplanning algorithm uses deep reinforcement learning to achieve remarkable improvements in power consumption, wire length, and congestion, reducing product development life cycle time significantly.
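To make the objective concrete, here is a deliberately tiny sketch of what floorplanners optimise: placing blocks on a grid so total wirelength across the netlist is minimal. The four-block netlist is made up, and brute force stands in for the learned search — real designs have millions of cells, which is exactly why Google turned to reinforcement learning rather than exhaustive enumeration:

```python
from itertools import permutations

# Toy floorplanning objective: assign blocks to grid slots so the total
# Manhattan wirelength over all nets is minimised. Block names, the
# netlist, and the 2x2 grid are hypothetical; this is a brute-force
# illustration, not Google's deep-RL method.

BLOCKS = ("cpu", "cache", "dram_ctrl", "io")
NETS = [("cpu", "cache"), ("cpu", "dram_ctrl"), ("dram_ctrl", "io")]
SLOTS = [(0, 0), (0, 1), (1, 0), (1, 1)]  # legal positions on a 2x2 grid

def wirelength(placement: dict) -> int:
    """Total Manhattan distance over all nets for a block->slot mapping."""
    return sum(abs(placement[a][0] - placement[b][0]) +
               abs(placement[a][1] - placement[b][1]) for a, b in NETS)

def best_placement() -> dict:
    candidates = (dict(zip(BLOCKS, slots)) for slots in permutations(SLOTS))
    return min(candidates, key=wirelength)

best = best_placement()
print(wirelength(best))  # -> 3: every connected pair ends up adjacent
```

An RL agent learns a placement policy that drives this same cost (plus congestion and power terms) down without enumerating the combinatorially huge space.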

In the competitive landscape of semiconductor manufacturing, achieving shorter time-to-market cycles is paramount. However, as chip design becomes more intricate, the time available for development and delivery significantly decreases. McKinsey indicates that the average time to market for chips has shrunk by 25% because of outstanding demand.

Companies like Synopsys, an American EDA company headquartered in California, offer a full-stack AI-driven EDA suite that has helped improve efficiency and shorten development cycles for major semiconductor companies such as Samsung Electronics and STMicroelectronics, bringing down costs in turn.

Shankar Krishnamoorthy, GM of Synopsys’ EDA Group, illustrated how AI-driven EDA tools can optimise the design of low-power chips, citing figures from its EDA solutions that demonstrate energy and power savings of up to 15%.

Synopsys is in direct competition with Cadence Design Systems, which is making big moves to add AI to chip design software. However, experts believe Synopsys currently holds a clear lead.

Additional Opportunities in the Chip Manufacturing Pipeline

In addition to overcoming these challenges, generative AI presents a range of opportunities for the semiconductor manufacturing industry. 

Generative AI can also play a pivotal role in improving defect detection. The quality of chips depends on a multitude of factors, including material properties, process parameters, environmental conditions, and human error. However, as feature sizes shrink below 10 nanometers, defect rates increase exponentially due to factors such as quantum effects, variability, and noise.

McKinsey highlights that the average defect density has surged by 10 times since 2013. By utilising unsupervised learning, generative AI can detect defects in semiconductor chips without requiring labelled data or prior knowledge. This technology can enhance defect detection accuracy by up to 30%, surpassing traditional methods.
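The core idea behind unsupervised defect detection can be sketched in a few lines: flag measurements that deviate strongly from the batch distribution, with no labelled examples at all. The wafer-test readings below are made up, and production systems use far richer models (autoencoders, density estimation) over image and sensor data, but the principle is the same:

```python
import statistics

# Minimal unsupervised anomaly detection: flag readings far from the
# batch's median, scaled by the median absolute deviation (MAD). No
# labelled "defective" examples are needed. Readings are hypothetical.

readings = [1.02, 0.98, 1.01, 0.99, 1.00, 1.03, 0.97, 1.74, 1.01, 0.25]

def flag_outliers(values, threshold=3.0):
    """Return indices whose robust z-score (vs. median/MAD) exceeds threshold."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    # 0.6745 makes the MAD comparable to a standard deviation
    return [i for i, v in enumerate(values)
            if abs(v - med) * 0.6745 / mad > threshold]

print(flag_outliers(readings))  # -> [7, 9]: the two anomalous dies
```

Median and MAD are used instead of mean and standard deviation so the defective readings themselves cannot skew the baseline they are judged against.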

GenAI can also lead to the discovery of new materials or the optimisation of existing ones for better performance and reduced costs. It can also drive the creation of new products or improvements in existing ones, significantly reducing product development cycles. Moreover, generative AI can broaden the market potential of semiconductor devices by creating custom ICs for specific tasks, offering superior performance, lower power consumption, and reduced costs compared to general-purpose ICs. 

In conclusion, while the semiconductor manufacturing industry faces substantial challenges, the integration of generative AI promises to be a transformative force. It offers innovative solutions that can optimise chip design, enhance defect detection, and unlock new horizons of efficiency, quality, and innovation. 

Shyam Nandan Upadhyay
Shyam is a tech journalist with expertise in policy and politics, and exhibits a fervent interest in scrutinising the convergence of AI and analytics in society. In his leisure time, he indulges in anime binges and mountain hikes.
