Spiking Neural Networks: Why Are They Trending?

TTFS is a time-coding technique in which a neuron's activity is encoded in its firing delay: the stronger the activation, the earlier the spike.

Researchers at Heidelberg University and the University of Bern have recently developed a fast and energy-efficient technique for computing on spiking neuromorphic substrates. The technique is a rigorous adaptation of a time-to-first-spike (TTFS) coding scheme together with a matching learning rule for certain networks of artificial neurons. Julian Goeltz, one of the leading researchers at Heidelberg University, and his colleagues set out to build a mathematical framework for achieving deep learning in spiking neural networks (SNNs) via temporal coding. First, let us understand what SNNs are and how they work.

Spiking Neural Networks

SNNs have emerged as the next generation of neural networks owing to their higher energy efficiency, which stems from their event-driven, integrate-and-fire mode of operation. While SNNs can significantly improve the energy efficiency of artificial neural networks, deep SNNs have not been widely deployed because scalable training procedures have been lacking.
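To make the integrate-and-fire, event-based behaviour concrete, below is a minimal sketch of a leaky integrate-and-fire (LIF) neuron in Python. The time constant, threshold, and input values are illustrative assumptions rather than parameters from the work discussed here.

```python
import numpy as np

def lif_neuron(input_current, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Simulate a single LIF neuron; return the membrane trace and spike times."""
    v = 0.0
    v_trace, spike_times = [], []
    for step, i_in in enumerate(input_current):
        # Leaky integration: the membrane potential decays toward rest
        # while integrating the incoming current.
        v += dt * (-v / tau + i_in)
        if v >= v_thresh:                  # Event: threshold crossed.
            spike_times.append(step * dt)  # Emit a spike (the "event").
            v = v_reset                    # Reset the membrane potential.
        v_trace.append(v)
    return np.array(v_trace), spike_times

# A constant input current produces a regular spike train; a stronger
# current drives the neuron to threshold (and hence to spike) sooner.
_, spikes = lif_neuron(np.full(100, 0.1))
print(spikes)
```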

Application: SNNs can be used for the same tasks as ANNs. They can also model the central nervous system of living animals, such as an insect seeking food in an unknown environment.

Next, let us explore what TTFS is and how it enhances deep learning. Various coding schemes have been developed to describe how information is encoded in spikes.

Coding Schemes 

A neural code refers to the neural representation of information contained in a pattern of spikes. Neural coding is the technique of representing information via spike trains, and it covers both encoding and decoding. In image tasks, for example, input pixels are converted into spikes that are delivered to excitatory neurons. Rate coding, phase coding, burst coding, and TTFS coding are four common neural coding methods; a combined code sketch of all four follows the list below.

  • Rate coding

Rate coding encodes information in the firing rate, which makes it simple to implement and robust to faults. However, because rate coding cannot exploit the temporal information contained in spike trains, it generates a huge number of spikes, resulting in high energy consumption and a long inference delay.

  • Phase coding

Phase coding converts temporal information into spike patterns using a global oscillator, and it has been shown to drastically reduce the number of spikes in deep SNNs. However, when the input varies dynamically and unpredictably, as the inputs to hidden layers in deep SNNs do, this efficiency cannot be guaranteed.

  • Burst coding

Burst coding seeks to circumvent this limitation by producing bursts of spikes separated by a short inter-spike interval. Burst spikes can convey more information more rapidly and precisely. Although burst coding considerably reduces the number of spikes and improves overall performance, it still falls short of the intended latency and efficiency.

  • TTFS coding

To address these problems, deep SNNs use TTFS coding. TTFS is a time-coding technique in which a neuron's activity is encoded in its firing delay: the more strongly a neuron is activated, the earlier it fires. During inference, a TTFS-coded neuron emits a single spike and conveys information through that spike's timing. It is worth noting that once a neuron generates a spike, it cannot generate further spikes as long as a suitably long refractory period is applied.
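To make the four schemes concrete, here is a minimal Python sketch that encodes a single normalized pixel intensity under each of them. The oscillator period, burst rule, and latency formula are illustrative assumptions for this sketch, not the exact encoders used in the research discussed above.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 100            # simulation window in time steps
intensity = 0.8    # normalized input pixel in [0, 1]

# Rate coding: spike probability per step is proportional to intensity
# (Poisson-like), so stronger inputs produce more spikes.
rate_spikes = np.flatnonzero(rng.random(T) < intensity)

# Phase coding: spikes are aligned to a global oscillator; here the
# intensity selects the phase within each oscillation period.
period = 10
phase = int(round((1.0 - intensity) * (period - 1)))
phase_spikes = np.arange(phase, T, period)

# Burst coding: a short burst whose length and inter-spike interval (ISI)
# depend on intensity; stronger inputs give longer bursts with shorter ISIs.
n_burst = max(1, int(round(intensity * 5)))
isi = max(1, int(round((1.0 - intensity) * 5)) + 1)
burst_spikes = np.arange(0, n_burst * isi, isi)

# TTFS coding: exactly one spike whose latency decreases with intensity,
# i.e. more strongly driven neurons fire earlier.
ttfs_spikes = np.array([int(round((1.0 - intensity) * (T - 1)))])

for name, s in [("rate", rate_spikes), ("phase", phase_spikes),
                ("burst", burst_spikes), ("ttfs", ttfs_spikes)]:
    print(f"{name:>5}: {len(s)} spike(s) at t = {s.tolist()}")
```

Running the sketch shows the trade-off directly: rate and phase coding emit many spikes for a single input value, burst coding emits a few, and TTFS emits exactly one.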

According to recent research, TTFS coding is the most accurate of these schemes and requires far fewer spikes and synaptic operations (SOPs) during training and inference. Specifically, TTFS coding needs 4x lower processing latency and 3.5x fewer SOPs than rate coding for training and inference, because it uses only the first spike and its exact timing to encode information. Thus, the authors conclude that TTFS coding outperforms rate, phase, and the other coding schemes.
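As a rough illustration of why single-spike coding saves synaptic operations, the back-of-the-envelope sketch below multiplies spikes per neuron by synaptic fan-out. The layer size, fan-out, and per-neuron spike counts are made-up assumptions; the 4x and 3.5x figures above come from the cited research, not from this calculation.

```python
# Each spike triggers one synaptic operation (SOP) per outgoing synapse,
# so fewer spikes per neuron directly means fewer SOPs.
n_neurons = 784          # e.g. one neuron per input pixel (assumed)
fan_out = 400            # outgoing synapses per neuron (assumed)
spikes_rate = 20         # assumed average spikes per neuron under rate coding
spikes_ttfs = 1          # TTFS: at most one spike per neuron

sops_rate = n_neurons * spikes_rate * fan_out
sops_ttfs = n_neurons * spikes_ttfs * fan_out
print(f"rate coding : {sops_rate:,} SOPs")
print(f"TTFS coding : {sops_ttfs:,} SOPs ({sops_rate / sops_ttfs:.0f}x fewer)")
```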

Dr. Nivash Jeevanandam
Nivash holds a doctorate in information technology and has been a research associate at a university and a development engineer in the IT industry. Data science and machine learning excite him.
