SmartIdx: Reducing Communication Cost in Federated Learning by Exploiting the CNNs Structures
Keywords: Data Mining & Knowledge Management (DMKM)
Abstract
The Top-k sparsification method is popular and powerful for reducing the communication cost in Federated Learning (FL). However, according to our experimental observation, it spends most of the total communication cost on the indices of the selected parameters (i.e., their position information), which is inefficient for FL training. To solve this problem, we propose an FL compression algorithm for convolutional neural networks (CNNs), called SmartIdx, which extends the traditional Top-k largest-variation selection strategy to convolution-kernel-based selection, reducing the proportion of indices in the overall communication cost and thus achieving a high compression ratio. The basic idea of SmartIdx is to improve the 1:1 proportion between the values and indices of the parameters to n:1 by regarding the convolution kernel as the basic unit of parameter selection, which can potentially deliver more information to the parameter server under limited network traffic. To this end, a set of rules is designed for judging which kernels should be selected, and the corresponding packaging strategies are also proposed to further improve the compression ratio. Experiments on mainstream CNNs and datasets show that our proposed SmartIdx achieves a 2.5×–69.2× higher compression ratio than state-of-the-art FL compression algorithms without degrading model performance.
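The value-to-index proportion argument can be illustrated with a small sketch (not the paper's implementation; the payload accounting below is a simplified assumption that counts one "unit" per transmitted value or index): element-wise Top-k sends one index per value (1:1), while kernel-wise selection sends one index per n-element kernel (n:1).

```python
import numpy as np

def topk_payload(weights, k):
    """Element-wise Top-k: each of the k selected values needs its own index,
    so the payload is k values + k indices (1:1 proportion)."""
    flat = weights.ravel()
    idx = np.argsort(np.abs(flat))[-k:]      # k largest-magnitude elements
    values = flat[idx]
    return values.size + idx.size            # k values + k indices

def kernelwise_payload(weights, k):
    """Kernel-wise selection (the SmartIdx idea, sketched): one index covers a
    whole n-element kernel, so the proportion becomes n:1."""
    out_ch, in_ch, kh, kw = weights.shape
    kernels = weights.reshape(out_ch * in_ch, kh * kw)
    norms = np.linalg.norm(kernels, axis=1)  # a simple selection criterion
    idx = np.argsort(norms)[-k:]             # k kernels with the largest norm
    values = kernels[idx]                    # k * n values in total
    return values.size + idx.size            # k*n values + only k indices

rng = np.random.default_rng(0)
w = rng.standard_normal((64, 32, 3, 3))      # a hypothetical 3x3 conv layer

k = 100
print(topk_payload(w, k))        # 200 units: 100 values + 100 indices (50% index overhead)
print(kernelwise_payload(w, k))  # 1000 units: 900 values + 100 indices (10% index overhead)
```

For the same number of indices transmitted, the kernel-wise scheme delivers nine times as many parameter values here, which is the intuition behind the improved compression ratio.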
How to Cite
Wu, D., Zou, X., Zhang, S., Jin, H., Xia, W., & Fang, B. (2022). SmartIdx: Reducing Communication Cost in Federated Learning by Exploiting the CNNs Structures. Proceedings of the AAAI Conference on Artificial Intelligence, 36(4), 4254-4262. https://doi.org/10.1609/aaai.v36i4.20345
AAAI Technical Track on Data Mining and Knowledge Management