MIT Researchers Develop New Silicon Chip To Boost Applications In AR/VR, 5G & Gaming

GRAND chip can effectively decode any redundancy code up to 128 bits in length, with only about a microsecond of latency.

Researchers at MIT, Maynooth University, and Boston University recently created the first silicon chip implementing a universal decoding algorithm called GRAND (Guessing Random Additive Noise Decoding), which can accurately decode any code, irrespective of its structure.

GRAND eliminates the need for multiple, computationally complex decoders. The resulting efficiency gains could benefit augmented and virtual reality (AR/VR), 5G networks, gaming, and connected devices that process a high volume of data with minimal delay.



Backed by the Battelle Memorial Institute and Science Foundation Ireland, the research is being presented at the European Solid-State Device Research and Circuits Conference (ESSCIRC-ESSDERC) held this week.

How Noise Hampers Data Transfer

Every piece of information that travels over the internet, from paragraphs in an email to 3D graphics in a virtual reality environment, can be altered by the noise it encounters along the way, say, electromagnetic interference from a microwave or Bluetooth device. Typically, the data are coded so that when they arrive at their destination, a decoding algorithm can undo the negative effects of those disturbances and retrieve the original data.

Traditionally, most error-correcting codes and decoding algorithms have been designed together. As a result, each code had a structure corresponding to a distinct, highly complex decoding algorithm, which often required dedicated hardware. GRAND removes the need for these multiple, code-specific hardware components.

To understand how GRAND works, think of these codes as redundant hashes (strings of 1s and 0s) appended to the end of the original data. The rules for creating that hash are stored in a codebook.
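As a toy illustration of such a codebook, consider a single even-parity bit appended to each data word. This scheme is an assumption for illustration only; real codes (CRCs, polar codes, and so on) are far richer:

```python
from itertools import product

def encode(data_bits):
    """Append one even-parity bit: here the 'hash' is a single redundant bit
    chosen so the total number of 1s in the codeword is even."""
    parity = sum(data_bits) % 2
    return data_bits + [parity]

# The codebook is the set of all codewords valid under this rule.
codebook = {tuple(encode(list(bits))) for bits in product((0, 1), repeat=3)}

print(encode([1, 0, 1]))         # [1, 0, 1, 0]
print((1, 0, 1, 0) in codebook)  # True: valid codeword
print((1, 0, 1, 1) in codebook)  # False: a flipped bit breaks parity
```

A receiver that finds a word outside the codebook knows noise has corrupted it; the codebook is what makes that check possible.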

As the encoded data travel over the network, they are affected by signal-disrupting noise, often generated by electronic devices. When the noisy coded data arrive at their destination, the decoding algorithm consults its codebook and uses the structure of the hash to guess what the stored information is.

The way GRAND works is that it guesses the noise affecting the message and uses the noise pattern to deduce the original information. It generates a series of noise sequences in the order they are likely to occur, subtracts each from the received data, and checks whether the resulting codeword is in the codebook.

This becomes possible because the noise has a particular structure that allows the algorithm to guess what it might be, even though the noise appears random. Muriel Médard, a researcher at MIT, said it is similar to troubleshooting. 

Giving an example of a mechanic shop, she said, “If someone brings their car into the shop, the mechanic does not start by mapping the entire car to blueprints. Instead, they start by asking — what is the most likely thing to go wrong? Maybe it just needs gas. If that does not work, what’s next? Maybe the battery is dead.” 
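The guess-subtract-check loop described above can be sketched in Python. The repetition codebook and the `max_weight` cutoff below are illustrative assumptions, not details of the actual chip; for binary data, "subtracting" a noise guess is an XOR:

```python
from itertools import combinations

# Toy codebook: a 3-bit repetition code. GRAND itself is code-agnostic
# and would work with any codebook.
codebook = {(0, 0, 0), (1, 1, 1)}

def grand_decode(received, codebook, max_weight=2):
    """Guess noise patterns from most likely (fewest flipped bits) to least
    likely, XOR each guess off the received word, and stop at the first
    result that appears in the codebook."""
    n = len(received)
    for weight in range(max_weight + 1):          # 0 flips, then 1, 2, ...
        for positions in combinations(range(n), weight):
            candidate = list(received)
            for i in positions:
                candidate[i] ^= 1                 # undo the guessed bit flip
            if tuple(candidate) in codebook:
                return tuple(candidate), positions  # decoded word, noise guess
    return None, None  # give up beyond max_weight flips

# A single bit error: (1, 1, 1) was sent, noise flipped the middle bit.
decoded, noise = grand_decode((1, 0, 1), codebook)
print(decoded, noise)  # (1, 1, 1) (1,)
```

Trying fewer flips first is what encodes the "most likely failure first" intuition from the mechanic analogy.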

Inside the GRAND Chip

The GRAND chip uses a three-tiered structure: the first stage tries the simplest possible solutions, while the two subsequent stages handle longer, more complex noise patterns. Each stage operates independently, which increases the system's throughput and saves power.

The device is also designed to switch seamlessly between two codebooks: while one is used to crack codewords, the other loads a new codebook, and decoding then switches over without any downtime or delay.
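This is essentially a double-buffering (ping-pong) scheme. A minimal software analogue is sketched below; the class and method names are hypothetical, and the chip's actual buffering mechanism is not detailed in the article:

```python
class DualCodebook:
    """Two codebook slots: decode against the active one while the idle
    slot is refilled in the background, then swap with no downtime."""

    def __init__(self, initial):
        self.active = initial   # codebook currently used for decoding
        self.idle = None        # slot being loaded in the background

    def load(self, new_codebook):
        """Load a new codebook into the idle slot; decoding continues."""
        self.idle = new_codebook

    def swap(self):
        """Switch decoding over to the freshly loaded codebook."""
        self.active, self.idle = self.idle, self.active

books = DualCodebook({(0, 0), (1, 1)})
books.load({(0, 0, 0), (1, 1, 1)})  # background load
books.swap()                        # seamless switchover
print(books.active)                 # {(0, 0, 0), (1, 1, 1)}
```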

In experiments, the researchers found that the GRAND chip could effectively decode any redundancy code up to 128 bits in length, with only about a microsecond of latency.

The Journey 

Previously, MIT researchers had demonstrated the algorithm's success; with their latest work, they have shown the effectiveness and efficiency of GRAND in hardware. Médard said that developing hardware for the novel decoding algorithm required the researchers to first set aside their preconceptions.

She said they could have reused approaches that already existed, but the team decided to rethink every single aspect from scratch. "It was a journey of reconsideration," added Médard.

What’s next?

As GRAND uses codebooks for verification, the researchers believe that the chip not only works with legacy codes but can be used with codes that haven’t even been introduced yet. 

For instance, in 5G implementation, telecom providers and regulators have struggled to find common ground on which codes should be used in the new network, and regulators have often defaulted to traditional codes for different 5G scenarios. Leveraging GRAND could help eliminate the need for such rigid standardisation in the future, said Médard.

Furthermore, the researchers believe that their chip could open a new wave of innovation in coding. "I am hoping this will recast the discussion, so it is not so standards-oriented, enabling people to use codes that already exist and create new codes," she added.

In the coming months, the researchers plan to tackle soft detection, in which the received data are less precise, with a new version of the GRAND chip. They also plan to test the chip's ability to crack longer, more complex codes and to further adjust the structure of the silicon chip to improve its energy efficiency.


Amit Raja Naik
Amit Raja Naik is a seasoned technology journalist who covers everything from data science to machine learning and artificial intelligence for Analytics India Magazine, where he examines the trends, challenges, ideas, and transformations across the industry.
