Spiking neural networks (SNNs) are brain-inspired models built on biological neural mechanisms and widely regarded as the third generation of neural network models, offering low power consumption, strong real-time performance, and high biological interpretability. However, most SNNs adopt the fixed structures of artificial neural networks (ANNs) and therefore fail to fully exploit the ability of SNNs to represent spatiotemporal features through spike sequences. Inspired by the plasticity of biological synapses, which grow, are pruned, and regenerate, we design and implement a structurally efficient spiking neural network model. We propose an adaptive learning approach that learns weight parameters and network connections simultaneously. The approach builds prunable and growable structures at several granularities, including individual weights, convolutional kernels, and layers, so that the network can adapt to different task types and data characteristics. By exploiting neural excitation-inhibition mechanisms to dynamically adjust the network's connectivity, we enable a hybrid of local and global learning. In addition, we introduce SNN learning algorithms based on knowledge distillation, in which the hidden knowledge of a teacher network drives high-performance learning in the SNN and strengthens its feature extraction with prior knowledge and experience. By combining biological inspiration with the computational advantages of spiking neural networks, this work opens new possibilities for the development and application of SNNs.
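To make the approach more concrete, the PyTorch-style sketch below illustrates two of the ingredients described above: a kernel-granularity connection gate that is pruned and regrown jointly with the weights (here via a straight-through estimator), and a knowledge distillation loss in which a pretrained teacher's softened outputs guide the spiking student. This is a minimal sketch under our own simplifying assumptions, not the implementation reported in the paper; the class and parameter names (GatedSpikingConv, prune_th, distillation_loss, and so on) are hypothetical, and LIF dynamics with a rectangular surrogate gradient stand in for whatever neuron model and structural-learning rule are actually used.

```python
# Minimal illustrative sketch (hypothetical names, not the authors' released code):
# (1) per-kernel connection gates learned jointly with the weights, allowing pruning
#     and regrowth of kernels, and (2) a standard knowledge-distillation loss.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, rectangular surrogate gradient in the backward pass."""
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0.0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        return grad_out * (v.abs() < 0.5).float()


class GatedSpikingConv(nn.Module):
    """Convolution with a learnable gate per output kernel, followed by LIF dynamics.

    Kernels whose gate magnitude falls below `prune_th` are masked out (pruned) in the
    forward pass; the straight-through estimator keeps passing gradients to masked gates,
    so a pruned kernel can cross the threshold again and reactivate (regrow).
    """
    def __init__(self, in_ch, out_ch, prune_th=0.05, tau=2.0):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, 3, padding=1)
        self.gate = nn.Parameter(torch.ones(out_ch))      # one gate per kernel
        self.prune_th = prune_th
        self.decay = 1.0 - 1.0 / tau                      # leak factor of the membrane

    def forward(self, x_seq):                             # x_seq: (T, B, C, H, W)
        mask = (self.gate.abs() > self.prune_th).float()
        hard = self.gate * mask
        g = self.gate + (hard - self.gate).detach()       # hard gate forward, full gradient backward
        v, out = 0.0, []
        for x in x_seq:                                   # iterate over time steps
            i = self.conv(x) * g.view(1, -1, 1, 1)        # pruned kernels contribute nothing
            v = self.decay * v + i                        # leaky membrane integration
            s = SurrogateSpike.apply(v - 1.0)             # spike at threshold 1.0
            v = v * (1.0 - s)                             # hard reset after a spike
            out.append(s)
        return torch.stack(out)                           # (T, B, out_ch, H, W)


def distillation_loss(student_logits, teacher_logits, target, T=4.0, alpha=0.5):
    """Cross-entropy on the labels plus KL divergence to the teacher's softened outputs."""
    ce = F.cross_entropy(student_logits, target)
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(teacher_logits / T, dim=1),
                  reduction="batchmean") * T * T
    return (1.0 - alpha) * ce + alpha * kd


if __name__ == "__main__":
    # Toy usage: constant-current encoding over T steps, rate-decoded logits, frozen teacher.
    T, B = 4, 8
    images = torch.rand(B, 3, 32, 32)
    x_seq = images.unsqueeze(0).repeat(T, 1, 1, 1, 1)
    layer = GatedSpikingConv(3, 16)
    head = nn.Linear(16 * 32 * 32, 10)
    spikes = layer(x_seq)
    student_logits = head(spikes.mean(0).flatten(1))       # average spike rate -> logits
    teacher_logits = torch.randn(B, 10)                    # stand-in for a pretrained teacher
    loss = distillation_loss(student_logits, teacher_logits, torch.randint(0, 10, (B,)))
    loss.backward()
```

In this toy setting the input is simply repeated over T time steps and spike counts are averaged into logits; the straight-through gate lets pruned kernels keep receiving gradients, so connections can reactivate when they become useful again, loosely mirroring the grow-prune-regenerate cycle described above.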
Qi Xu received his B.Eng. degree from the College of Computer Science and Technology, Zhejiang University of Technology, in 2015, and his Ph.D. degree from the College of Computer Science and Technology, Zhejiang University, in 2021. Since 2021, he has been an associate professor at the School of Computer Science and Technology, Dalian University. He was granted an honorary visiting fellowship at the Centre for Systems Neuroscience, University of Leicester, U.K., in 2019. His research interests include brain-inspired computing, neuromorphic computing, neural computation, computational neuroscience, and cyborg intelligence.