MIT Researchers Make New Chips That Work On Light

“The new architecture could be used in convolutional neural networks and recurrent neural networks.”

As AI systems grow more capable, they will likely demand processing capacity beyond what conventional computing technology can deliver. MIT spinout Lightelligence is building next-generation computing hardware to meet this challenge.

In contrast to standard electronic architectures, the optical chips Lightelligence builds promise higher speed, lower power consumption and lower latency. In 2017, Lightelligence co-founder and CEO Yichen Shen presented, in the paper “Deep learning with coherent nanophotonic circuits”, a new neural network architecture built on the unique advantages of optics. It offered a two-orders-of-magnitude speed boost and a three-orders-of-magnitude gain in power efficiency over cutting-edge conventional systems for learning tasks, and was demonstrated on a programmable nanophotonic processor.

According to the paper, artificial neural networks are computational network models inspired by signal processing in the nervous system. The proposed architecture can perform the matrix multiplications and nonlinear activations at the heart of common neural network techniques, such as convolutional neural networks and recurrent neural networks.
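To make that concrete, a dense neural-network layer is simply a matrix multiplication followed by a nonlinear activation, and it is this pair of operations that the photonic architecture is meant to carry out in the optical domain. The sketch below is a plain software illustration of that computation in NumPy, with arbitrary layer sizes; it is not Lightelligence code or any photonic API.

import numpy as np

def dense_layer(x, W, b):
    """One dense layer: a matrix multiplication plus a nonlinear activation.

    This is the workload the optical architecture targets: the matrix-vector
    product would be performed by light propagating through the photonic
    circuit, with the nonlinearity applied to the result.
    """
    z = W @ x + b            # linear part: matrix-vector multiplication
    return np.maximum(z, 0)  # nonlinear activation (ReLU, chosen for illustration)

# Toy example: a 4-dimensional input passed through a layer with 3 outputs.
rng = np.random.default_rng(0)
x = rng.normal(size=4)
W = rng.normal(size=(3, 4))
b = np.zeros(3)
print(dense_layer(x, W, b))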

Lightelligence uses the same fabrication platform as conventional semiconductor chips, but in a fundamentally different way: it designs light-powered computing components, which could become the hardware that powers the AI revolution, with optical processors that outperform standard architectures by orders of magnitude. To carry out arithmetic, electronic circuits must combine tens, perhaps hundreds, of logic gates, and the chip’s transistors must switch on and off over several clock cycles; each time a gate switches, heat is generated and power is consumed. This is not the case with Lightelligence chips. Shen’s optical computing chips draw significantly less power than their electron-powered counterparts and therefore generate very little heat. Furthermore, their optical neural network (ONN) could potentially allow a network to be trained directly on the photonic chip, exploiting the higher forward-propagation speed and power efficiency of simply letting light propagate.
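The 2017 paper realises the matrix multiplication optically by factoring each weight matrix with a singular value decomposition: the two unitary factors map onto meshes of Mach-Zehnder interferometers, and the diagonal factor onto per-channel attenuation or amplification. The sketch below only checks that decomposition numerically in NumPy; it illustrates the mathematics, not the photonic hardware itself.

import numpy as np

# Any real weight matrix W can be written as W = U @ diag(s) @ Vh (SVD).
# In the coherent nanophotonic architecture, U and Vh correspond to
# programmable interferometer meshes and diag(s) to per-channel
# attenuation/amplification, so a matrix multiply becomes light passing
# through three optical stages in sequence.
rng = np.random.default_rng(1)
W = rng.normal(size=(4, 4))

U, s, Vh = np.linalg.svd(W)
x = rng.normal(size=4)

direct = W @ x
staged = U @ (np.diag(s) @ (Vh @ x))  # the three "optical" stages applied in order

print(np.allclose(direct, staged))  # True: both routes give the same result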

Lightelligence’s CEO does not intend to replace the electronic computing industry in its entirety. Rather, he aims to accelerate certain linear algebra operations so that workloads like those found in artificial neural networks run quickly and power-efficiently.
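In practice, that division of labour might look like the sketch below: the heavy matrix products are handed to an accelerator while the host keeps the nonlinearities and control flow. The optical_matmul function here is purely hypothetical, a NumPy stand-in used to illustrate the idea, not a Lightelligence API.

import numpy as np

def optical_matmul(W, x):
    """Hypothetical stand-in for an optical linear-algebra accelerator.

    Here it is just NumPy; the point is that only the matrix product is
    offloaded, while everything else stays on conventional electronics.
    """
    return W @ x

def forward(x, layers):
    """Run a small multi-layer network, offloading each linear step."""
    for W, b in layers:
        z = optical_matmul(W, x) + b  # linear algebra: candidate for optics
        x = np.tanh(z)                # nonlinearity: handled electronically here
    return x

rng = np.random.default_rng(2)
layers = [(rng.normal(size=(8, 4)), np.zeros(8)),
          (rng.normal(size=(2, 8)), np.zeros(2))]
print(forward(rng.normal(size=4), layers))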

Lightelligence’s competitive advantage 

In the emerging field of optical computing, Shen and his colleagues are not alone. Another MIT spinout, Lightmatter, recently announced an additional $80 million in Series B funding. Lightmatter’s technology is built on patented silicon photonics that manipulates coherent light within a chip to carry out calculations very fast at extremely low power, and its CEO, Nick Harris, is one of the contributing authors of “Deep learning with coherent nanophotonic circuits”. But Shen and his colleagues have crucial advantages over the competition. In 2017, Shen teamed up with Marin Soljačić and two other MIT alumni to start Lightelligence; the founding team includes vice president of photonics Dr Huaiyu Meng, who holds a doctorate in electrical engineering, and business administration major Spencer Powers. They not only invented the technology at the institute; they are also the first firm to have developed a complete optical hardware solution. Irrespective of the competition, Shen is confident in Lightelligence’s innovation potential. To date, Lightelligence has raised over $40 million, and the team is currently working on building the world’s most extensive integrated photonic system.

Most AI computing takes place in the cloud, where data centres run by the likes of Amazon and Microsoft play a major role. Computationally intensive AI algorithms take up a huge chunk of data centre capacity, and every year thousands of continuously running servers burn through millions of dollars’ worth of electricity. Lightelligence’s servers, by contrast, consume far less power at a significantly lower cost. Their AI chips not only reduce costs but also significantly increase computational capability, making Lightelligence a lucrative startup.

Ritika Sagar
Ritika Sagar is currently pursuing PDG in Journalism from St. Xavier's, Mumbai. She is a journalist in the making who spends her time playing video games and analyzing the developments in the tech world.
