
MIT Researchers Make New Chips That Work On Light


“The new architecture could be used in convolutional neural networks and recurrent neural networks.”

As AI systems grow more capable, they will likely require processing capacity beyond what ordinary computing technology can provide. MIT spinout Lightelligence is building next-generation computing hardware to meet this challenge.

In contrast to standard electronic architectures, the optical chips Lightelligence creates promise higher speed, lower power consumption and lower latency. In 2017, Lightelligence co-founder and CEO Yichen Shen presented, in his paper “Deep learning with coherent nanophotonic circuits”, a new neural network architecture based on the unique advantages of optics. Demonstrated on a programmable nanophotonic processor, it offered a two-orders-of-magnitude speed boost and a three-orders-of-magnitude gain in power efficiency over state-of-the-art hardware for learning tasks.

According to the paper, artificial neural networks take their main principle from the computational network models found in the nervous system. The proposed architecture can perform the matrix multiplications and nonlinear activations at the heart of many artificial neural network techniques, including convolutional neural networks and recurrent neural networks.
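The core idea can be sketched in ordinary NumPy. In Shen’s scheme, each weight matrix is factored by singular value decomposition into two unitary matrices (realisable as meshes of Mach-Zehnder interferometers) and a diagonal scaling stage, so light propagating through the mesh performs the matrix multiplication; a nonlinearity is then applied. The function and variable names below are illustrative, and this is only a software simulation of the principle, not the chip’s actual implementation.

```python
import numpy as np

def onn_layer(weights, x):
    """Simulate one layer of an optical neural network (ONN).

    A weight matrix M is factored as M = U @ diag(s) @ Vh, where U and
    Vh are unitary (implementable as interferometer meshes) and diag(s)
    is an optical attenuation/amplification stage, as described in
    "Deep learning with coherent nanophotonic circuits".
    """
    U, s, Vh = np.linalg.svd(weights)
    # Light passing through the three stages computes the product M @ x.
    z = U @ (s * (Vh @ x))
    # Nonlinear activation (ReLU here, purely for illustration).
    return np.maximum(z, 0.0)

# Toy usage: a 4x4 weight matrix acting on one input vector.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))
x = rng.standard_normal(4)
print(np.allclose(onn_layer(W, x), np.maximum(W @ x, 0.0)))  # True
```

The appeal of the decomposition is that the unitary stages involve no switching transistors at all: the multiplication happens as the light propagates, which is where the claimed speed and power advantages come from.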

Lightelligence uses the same fabrication platform as regular semiconductor chips, but in a revolutionary way: it designs light-powered computing components that could become the hardware powering the AI revolution. Its optical processors offer orders-of-magnitude better performance than standard architectures. To carry out arithmetic, electronic circuits must integrate tens, perhaps hundreds, of logic gates, and the chip’s transistors must switch on and off over several clock cycles. Each time a logic gate switches, heat is generated and power is consumed. This is not the case with Lightelligence’s chips. Shen’s optical computing chips consume significantly less power than their electron-powered counterparts and therefore generate very little heat. Furthermore, because their optical neural network (ONN) computes through light propagation alone, it could potentially allow the network to be trained directly on the photonic chip, with higher forward-propagation speed and better power efficiency.

Lightelligence’s CEO doesn’t intend to replace the electronic computing industry in its entirety; rather, he aims to accelerate certain linear algebra operations, enabling fast, power-efficient execution of tasks like those found in artificial neural networks.

Lightelligence’s competitive advantage 

(Source: Lightelligence)

Shen and his colleagues are not alone in the emerging field of optical computing. Recently, another MIT spinout, Lightmatter, announced an additional $80 million in Series B funding. Lightmatter’s technology is built on patented silicon photonics that manipulates coherent light within a chip to perform calculations very fast at extremely low power, and its CEO, Nick Harris, is a contributing author of “Deep learning with coherent nanophotonic circuits”. But Shen and his colleagues have crucial advantages over the competition. In 2017, Shen teamed up with Soljačić and two other MIT alumni to start Lightelligence. The company’s vice president of photonics, Dr Huaiyu Meng, holds a doctorate in electrical engineering, and business administration major Spencer Powers rounds out the founding team. They not only invented the technology at the institute but are also the first firm to develop a complete optical hardware solution. Irrespective of the competition, Shen is confident in Lightelligence’s innovation potential. To date, Lightelligence has raised over $40 million, and the team is currently building the world’s most extensive integrated photonic system.

Much of AI computing takes place in the cloud, in data centres run by major players such as Amazon and Microsoft. Computationally intensive AI algorithms take up a huge chunk of data-centre capacity, and every year millions of dollars’ worth of electricity is burned by thousands of servers running continuously. Lightelligence’s servers, by contrast, consume far less power at a significantly lower cost. By both cutting costs and significantly increasing computational capability, its AI chips make Lightelligence a lucrative startup.


Ritika Sagar

Ritika Sagar is currently pursuing PDG in Journalism from St. Xavier's, Mumbai. She is a journalist in the making who spends her time playing video games and analyzing the developments in the tech world.