AFS: An Attention-Based Mechanism for Supervised Feature Selection


  • Ning Gui Central South University
  • Danni Ge Zhejiang Sci-Tech University
  • Ziyin Hu Zhejiang Sci-Tech University



As an effective data preprocessing step, feature selection has proven useful for preparing high-dimensional data for many machine learning tasks. The proliferation of high-dimensional, high-volume big data, however, poses major challenges for existing feature-selection techniques, e.g., computational complexity and stability on noisy data. This paper introduces a novel neural-network-based feature selection architecture, dubbed Attention-based Feature Selection (AFS). AFS consists of two detachable modules: an attention module for feature weight generation and a learning module for problem modeling. The attention module formulates the correlation between each feature and the supervision target as a binary classification problem, supported by a shallow attention net for each feature. Feature weights are generated from the distribution of the respective feature-selection patterns, adjusted by backpropagation during training. The detachable structure allows existing off-the-shelf models to be reused directly, which greatly reduces training time, training-data demands, and expertise requirements. A hybrid initialization method is also introduced to boost selection accuracy for datasets without enough samples for feature weight generation. Experimental results show that AFS achieves the best accuracy and stability in comparison with several state-of-the-art feature selection algorithms on MNIST, noisy MNIST, and several small-sample datasets.
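The abstract's two-module design can be sketched in code. The following is a minimal, hypothetical PyTorch illustration, not the authors' implementation: each feature gets a small attention net whose two-way softmax (select vs. discard) yields that feature's weight, the weighted features feed a detachable learning module (here a plain linear classifier as a stand-in), and all layer sizes and names are assumptions for illustration only.

```python
import torch
import torch.nn as nn

class AttentionModule(nn.Module):
    """Shallow per-feature attention nets (simplified sketch).

    For each feature, a small net maps the input to two logits
    ("discard" vs. "select"); the softmax probability of "select"
    becomes that feature's weight, trained end to end by backprop.
    """
    def __init__(self, n_features, hidden=16):
        super().__init__()
        self.nets = nn.ModuleList(
            nn.Sequential(nn.Linear(n_features, hidden),
                          nn.Tanh(),
                          nn.Linear(hidden, 2))
            for _ in range(n_features)
        )

    def forward(self, x):
        # One weight per feature: P(select | x), shape (batch, n_features)
        weights = torch.stack(
            [torch.softmax(net(x), dim=-1)[:, 1] for net in self.nets],
            dim=1)
        return x * weights, weights  # reweighted features + weights

class AFS(nn.Module):
    """Attention module + detachable learning module (stand-in: linear classifier)."""
    def __init__(self, n_features, n_classes):
        super().__init__()
        self.attention = AttentionModule(n_features)
        self.learner = nn.Linear(n_features, n_classes)

    def forward(self, x):
        xw, weights = self.attention(x)
        return self.learner(xw), weights

model = AFS(n_features=8, n_classes=3)
logits, w = model(torch.randn(4, 8))
print(logits.shape, w.shape)  # torch.Size([4, 3]) torch.Size([4, 8])
```

After training, the learned weights `w` can be averaged over the data to rank features, and the attention module can be detached so any off-the-shelf model consumes only the top-ranked features.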




How to Cite

Gui, N., Ge, D., & Hu, Z. (2019). AFS: An Attention-Based Mechanism for Supervised Feature Selection. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 3705-3713.



AAAI Technical Track: Machine Learning