Bi-Spectrum Distillation: Addressing Spectral Mismatch in ANN-SNN Knowledge Transfer
DOI: https://doi.org/10.1609/aaai.v40i34.40085

Abstract
Knowledge distillation from Artificial Neural Networks (ANNs) to Spiking Neural Networks (SNNs) is a prominent training paradigm. However, its efficacy is fundamentally limited by a spectral mismatch: SNNs, with their intrinsic low-pass filtering characteristics, struggle to learn high-frequency details from their ANN teachers, creating a bottleneck in knowledge transfer at both the feature and logit levels. To address this, we propose Bi-Spectrum Distillation (BSD), a novel framework that mitigates the mismatch from two complementary perspectives. First, at the feature level, our Spectral Residual Distillation (SRD) enhances the student SNN's features with a parameter-efficient, learnable filter that adaptively compensates for the loss of high-frequency information, transforming the student's output to better match the teacher's spectrally rich target. Second, at the logit level, our Spectral Semantic Distillation (SSD) improves fine-grained classification by distilling high-frequency components from teacher-ordered logits. Extensive experiments on CIFAR-10/100, ImageNet, and CIFAR10-DVS demonstrate that BSD achieves new state-of-the-art performance across both CNN- and Transformer-based SNNs, validating its effectiveness and broad applicability.
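To make the spectral-mismatch framing concrete, the following is a minimal NumPy sketch of the underlying idea: splitting a feature map into low- and high-frequency components with an FFT, then boosting the high-frequency residual before matching it against a teacher. The circular hard cutoff, the `radius_frac` value, and the scalar `gain` are all illustrative assumptions, not the actual parameter-efficient learnable filter used in SRD.

```python
import numpy as np

def split_spectrum(x, radius_frac=0.25):
    """Split a 2-D feature map into low- and high-frequency parts via FFT.

    Illustrative only: the hard circular mask and radius_frac are
    assumptions, not BSD's design. By construction, low + high == x.
    """
    h, w = x.shape
    F = np.fft.fftshift(np.fft.fft2(x))          # centre the zero frequency
    yy, xx = np.mgrid[0:h, 0:w]
    dist = np.hypot(yy - h / 2, xx - w / 2)      # distance from spectrum centre
    mask = dist <= radius_frac * min(h, w)       # pass-band for low frequencies
    low = np.fft.ifft2(np.fft.ifftshift(F * mask)).real
    high = np.fft.ifft2(np.fft.ifftshift(F * (~mask))).real
    return low, high

def compensate(student_feat, gain=1.0):
    """Toy stand-in for a learnable spectral filter: amplify the student's
    high-frequency residual by a scalar gain before distillation."""
    low, high = split_spectrum(student_feat)
    return low + gain * high
```

With `gain > 1`, the student's high-frequency content is emphasized, mimicking (in a crude scalar form) how an adaptive filter could compensate for the SNN's low-pass behaviour when matching the teacher's features.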
Published
2026-03-14
How to Cite
Zhang, Y., Sun, Y., Yao, W., Deng, Y., & Li, H. (2026). Bi-Spectrum Distillation: Addressing Spectral Mismatch in ANN-SNN Knowledge Transfer. Proceedings of the AAAI Conference on Artificial Intelligence, 40(34), 28546-28554. https://doi.org/10.1609/aaai.v40i34.40085
Section
AAAI Technical Track on Machine Learning XI