Spiking Neural Networks: Why Are They Trending?

TTFS is a temporal coding scheme in which a neuron's activity is encoded in its firing delay: the stronger the activation, the earlier the spike.

Researchers at Heidelberg University and the University of Bern have recently developed a fast and energy-efficient technique for computing on spiking neuromorphic substrates. The technique combines a rigorous adaptation of a time-to-first-spike (TTFS) coding scheme with a matching learning rule for networks of artificial neurons. Julian Goeltz, one of the leading researchers at Heidelberg University, and his colleagues set out to build a mathematical framework for deep learning in spiking neural networks (SNNs) via temporal coding. First, let us understand what SNNs are and how they work.

Spiking Neural Networks

SNNs have emerged as the next generation of neural networks owing to their higher energy efficiency, which stems from their integrate-and-fire dynamics and event-based operation. While SNNs can significantly improve on the energy efficiency of artificial neural networks (ANNs), deep SNNs have not been widely deployed in applications because scalable training procedures are still lacking.
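The integrate-and-fire behaviour mentioned above can be sketched in a few lines. The following is a minimal leaky integrate-and-fire (LIF) neuron, not taken from the paper; the function name and parameter values (`tau`, `v_thresh`, etc.) are illustrative choices, not a specific library's API.

```python
def lif_neuron(input_current, dt=1.0, tau=20.0, v_rest=0.0,
               v_thresh=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron; return spike times."""
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(input_current):
        # Leaky integration: the membrane potential decays toward rest
        # while accumulating the input current.
        v += dt / tau * (v_rest - v) + i_in
        if v >= v_thresh:              # event: emit a spike and reset
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

# A constant drive produces regular firing; zero drive produces no spikes,
# which is the event-based property that saves energy.
spikes = lif_neuron([0.3] * 50)
```

The neuron only does work (emits a spike) when its potential crosses threshold; silent inputs cost nothing, which is the source of the energy-efficiency claim.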

Application: SNNs can be used for the same tasks as ANNs. They can also model the central nervous system of living animals, such as an insect seeking food in an unknown environment.



Next, let us explore what TTFS is and how it contributes to deep learning. Various coding schemes have been developed to describe how information is encoded.

Coding Schemes 

A neural code refers to the neural representation of information within a pattern of spikes. “Neural coding” is the technique of representing information via spike trains, covering both encoding and decoding. In image tasks, input pixels are converted into spikes that are delivered to excitatory neurons. Rate coding, phase coding, burst coding, and TTFS coding are four common neural coding methods.

  • Rate coding

By encoding information in the firing rate, rate coding is simple to implement and robust to faults. However, because it cannot exploit the temporal information contained in spike trains, it requires a huge number of spikes, resulting in high energy consumption and long inference latency.
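A minimal sketch of rate coding, assuming a normalized pixel intensity in [0, 1] encoded as a Bernoulli spike train (the function name and parameters are illustrative): the brighter the pixel, the more spikes per window, which is exactly the spike-count cost described above.

```python
import random

def rate_encode(pixel, n_steps=100, max_rate=0.5, seed=0):
    """Encode a normalized intensity (0..1) as a stochastic spike train:
    higher intensity -> higher firing probability at each time step."""
    rng = random.Random(seed)          # fixed seed for reproducibility
    p = pixel * max_rate               # per-step spike probability
    return [1 if rng.random() < p else 0 for _ in range(n_steps)]

bright = sum(rate_encode(1.0))   # many spikes for a bright pixel
dark = sum(rate_encode(0.1))     # few spikes for a dim pixel
```

Decoding simply counts spikes in the window, so fine-grained values need long windows and many spikes.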

  • Phase coding

Phase coding converts temporal information to spike patterns using a global oscillator and has been shown to drastically reduce the number of spikes in deep SNNs. However, its efficiency cannot be guaranteed when the input varies dynamically and unpredictably, as the activity of hidden layers in deep SNNs does.
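One common variant of phase coding assigns each phase slot of the oscillator a binary weight, so an 8-bit intensity needs at most 8 spikes per period. The sketch below follows that weighted-phase idea; the function names are hypothetical.

```python
def phase_encode(pixel, bits=8):
    """Spread the bits of an 8-bit intensity across one oscillation
    period: the spike at phase slot k carries weight 2**-(k+1)."""
    return [(pixel >> (bits - 1 - k)) & 1 for k in range(bits)]

def phase_decode(spikes):
    # Each phase slot contributes its binary weight, recovering the
    # intensity scaled to [0, 1).
    return sum(s * 2 ** -(k + 1) for k, s in enumerate(spikes))

val = phase_decode(phase_encode(200))   # recovers 200/256 = 0.78125
```

The decoder only works because encoder and decoder share the oscillator's phase reference; if the input (or a hidden layer's activity) drifts relative to that reference, the weights no longer line up, which is the fragility noted above.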

  • Burst coding

Burst coding seeks to circumvent this limitation by producing burst spikes separated by a short inter-spike interval. Burst spikes can convey more information more rapidly and precisely. Although burst coding considerably reduces the number of spikes and improves overall performance, it still falls short of the intended latency and efficiency.
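A rough sketch of the idea, under the assumption (not from the source) that stronger activations map to larger bursts with shorter inter-spike intervals; the mapping below is an arbitrary illustrative choice.

```python
def burst_encode(activation, max_spikes=5, t_window=10.0):
    """Burst coding sketch: stronger activation -> more spikes packed
    into a shorter inter-spike interval (ISI) within the time window."""
    n = max(1, round(activation * max_spikes))      # burst size
    # ISI shrinks as activation grows: strong inputs burst rapidly.
    isi = t_window / (n + 1) * (1.0 - 0.5 * activation)
    return [k * isi for k in range(n)]              # spike times

strong = burst_encode(1.0)   # several closely spaced spikes
weak = burst_encode(0.2)     # a single, lone spike
```

Both burst size and ISI carry information, which is why a burst conveys more than a lone spike at the same latency.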

  • TTFS coding

To address these problems, deep SNNs use TTFS coding. TTFS is a temporal coding scheme in which a neuron's activity is encoded in its firing delay: the stronger the activation, the earlier the spike. During inference, neurons under TTFS coding emit a single spike and convey information through its timing. It is worth noting that once a neuron generates a spike, it cannot fire again while a sufficiently long refractory period is applied.
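The single-spike property above can be sketched as follows — a minimal, hypothetical linear mapping from intensity to first-spike time (the real scheme in the paper is derived from neuron dynamics, not this formula):

```python
def ttfs_encode(pixel, t_max=100.0):
    """TTFS sketch: one spike per neuron; stronger input fires earlier.
    Zero intensity maps to 'never fires' (None) -- no spike, no cost."""
    if pixel <= 0:
        return None
    return t_max * (1.0 - pixel)       # pixel in (0, 1]; 1.0 -> t = 0

def ttfs_decode(t, t_max=100.0):
    # Invert the linear mapping; a silent neuron decodes to zero.
    return 0.0 if t is None else 1.0 - t / t_max

early = ttfs_encode(0.9)   # bright pixel spikes early
late = ttfs_encode(0.1)    # dim pixel spikes late
```

Each neuron fires at most once per stimulus, so the spike count (and hence the energy and latency) is bounded by the number of neurons rather than by the coding precision.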

According to recent research, TTFS coding is the most accurate, and it requires far fewer spikes and synaptic operations (SOPs) during training and inference. Compared with rate coding, TTFS coding needs roughly 4x less processing latency and 3.5x fewer SOPs for training and inference, because it uses only the first spike and its exact timing to carry information. Thus, the authors conclude that TTFS coding outperforms rate, phase, and other coding schemes.


Dr. Nivash Jeevanandam
Nivash holds a doctorate in information technology and has been a research associate at a university and a development engineer in the IT industry. Data science and machine learning excite him.
