Online Training Through Time for Spiking Neural Networks

Spiking neural networks (SNNs) are promising brain-inspired, energy-efficient models. Recent progress in training methods has enabled successful deep SNNs on large-scale tasks with low latency. In particular, backpropagation through time (BPTT) with surrogate gradients (SG) is widely used to achieve high performance with a very small number of time steps. However, this comes at the cost of large memory consumption during training, a lack of theoretical clarity for optimization, and inconsistency with the online nature of biological learning rules and of learning on neuromorphic hardware. Other works connect the spike representations of SNNs with equivalent artificial neural network formulations and train SNNs with gradients from the equivalent mappings to ensure descent directions, but they fail to achieve low latency and are likewise not online. In this work, we propose online training through time (OTTT) for SNNs, which is derived from BPTT to enable forward-in-time learning by tracking presynaptic activities and leveraging instantaneous losses and gradients. Meanwhile, we theoretically analyze and prove that the gradients of OTTT provide a descent direction similar to that of gradients based on spike representations, under both feedforward and recurrent conditions. OTTT requires only constant training memory, agnostic to the number of time steps, avoiding the significant memory cost of BPTT in GPU training. Furthermore, the update rule of OTTT takes the form of three-factor Hebbian learning, which could pave a path toward online on-chip learning. With OTTT, the two mainstream supervised SNN training methods, BPTT with SG and spike representation-based training, are connected for the first time, and in a biologically plausible form. Experiments on CIFAR-10, CIFAR-100, ImageNet, and CIFAR10-DVS demonstrate the superior performance of our method on large-scale static and neuromorphic datasets with a small number of time steps.
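To make the update rule concrete, here is a minimal sketch in our own notation, assuming leaky integrate-and-fire neurons (the symbols $\lambda$, $s^{l}[t]$, $\hat{a}^{l}[t]$, and $g^{l+1}[t]$ are illustrative, not taken verbatim from the paper body). Each layer $l$ tracks an eligibility trace of its presynaptic spikes $s^{l}[t]$,
\[
\hat{a}^{l}[t] = \lambda\, \hat{a}^{l}[t-1] + s^{l}[t],
\]
where $\lambda$ is the leaky factor of the neuron, and the weights $W^{l}$ are updated at each time step from the instantaneous loss $L[t]$,
\[
\nabla_{W^{l}} L[t] = g^{l+1}[t]\, \bigl(\hat{a}^{l}[t]\bigr)^{\top},
\]
where $g^{l+1}[t]$ is the instantaneous error signal propagated spatially (not through time) with surrogate derivatives. Only the trace and the current state are kept in memory, so the cost is constant in the number of time steps; and the product of a presynaptic trace, a postsynaptic surrogate factor (inside $g^{l+1}[t]$), and a top-down error signal is the three-factor Hebbian form referred to above.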