
Shrinking Your TimeStep: Towards Low-Latency Neuromorphic Object Recognition with Spiking Neural Network

Ding, Yongqi; Zuo, Lin; Jing, Mengmeng; He, Pei; Xiao, Yongjun
Abstract

Neuromorphic object recognition with spiking neural networks (SNNs) is the cornerstone of low-power neuromorphic computing. However, existing SNNs suffer from significant latency, utilizing 10 to 40 timesteps or more to recognize neuromorphic objects. At low latencies, the performance of existing SNNs is drastically degraded. In this work, we propose the Shrinking SNN (SSNN) to achieve low-latency neuromorphic object recognition without reducing performance. Concretely, we alleviate the temporal redundancy in SNNs by dividing SNNs into multiple stages with progressively shrinking timesteps, which significantly reduces the inference latency. During timestep shrinkage, the temporal transformer smoothly transforms the temporal scale and preserves the information maximally. Moreover, we add multiple early classifiers to the SNN during training to mitigate the mismatch between the surrogate gradient and the true gradient, as well as gradient vanishing/exploding, thus eliminating the performance degradation at low latency. Extensive experiments on the neuromorphic datasets CIFAR10-DVS, N-Caltech101, and DVS-Gesture have revealed that SSNN is able to improve the baseline accuracy by 6.55% to 21.41%. With only 5 average timesteps and without any data augmentation, SSNN achieves an accuracy of 73.63% on CIFAR10-DVS. This work presents a heterogeneous temporal scale SNN and provides valuable insights into the development of high-performance, low-latency SNNs.
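The core idea of progressively shrinking timesteps across stages can be illustrated with a toy sketch. Note the specifics below are illustrative assumptions, not the paper's implementation: a simple leaky integrate-and-fire (LIF) layer stands in for each SNN stage, and average-pooling over the time axis stands in for the temporal transformer that reduces the temporal scale between stages.

```python
import numpy as np

def lif_stage(x, threshold=1.0, decay=0.5):
    """Toy LIF layer over the time dimension (stand-in for one SNN stage).
    x: array of shape (T, features). Returns binary spikes of the same shape."""
    T, F = x.shape
    v = np.zeros(F)                    # membrane potential
    spikes = np.zeros_like(x)
    for t in range(T):
        v = decay * v + x[t]           # leaky integration of input current
        fired = v >= threshold
        spikes[t] = fired.astype(float)
        v = np.where(fired, 0.0, v)    # hard reset after a spike
    return spikes

def shrink_time(x, out_T):
    """Stand-in for the temporal transformer: average-pool the time axis
    from T timesteps down to out_T timesteps (assumed mechanism)."""
    groups = np.array_split(np.arange(x.shape[0]), out_T)
    return np.stack([x[g].mean(axis=0) for g in groups])

# Three stages with progressively shrinking timesteps, e.g. 8 -> 4 -> 2,
# so later (and typically wider) stages run far fewer forward passes.
rng = np.random.default_rng(0)
x = rng.random((8, 16))                # (T=8 timesteps, 16 features)
s1 = lif_stage(x)                      # stage 1 runs at T=8
s2 = lif_stage(shrink_time(s1, 4))     # stage 2 runs at T=4
s3 = lif_stage(shrink_time(s2, 2))     # stage 3 runs at T=2
print(s3.shape)
```

Because each successive stage processes fewer timesteps, the average timestep count over the whole network (and hence inference latency) drops well below that of a fixed-timestep SNN, which is the effect the abstract quantifies with its "5 average timesteps" result.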
