Error-Based Knockoffs Inference for Controlled Feature Selection

Authors

  • Xuebin Zhao Huazhong Agricultural University
  • Hong Chen Huazhong Agricultural University
  • Yingjie Wang Huazhong Agricultural University
  • Weifu Li Huazhong Agricultural University
  • Tieliang Gong Xi'an Jiaotong University
  • Yulong Wang Huazhong Agricultural University
  • Feng Zheng Southern University of Science and Technology

DOI:

https://doi.org/10.1609/aaai.v36i8.20905

Keywords:

Machine Learning (ML)

Abstract

Recently, the scheme of model-X knockoffs was proposed as a promising solution to controlled feature selection in high-dimensional, finite-sample settings. However, the model-X knockoffs procedure depends heavily on coefficient-based feature importance and concerns only the control of the false discovery rate (FDR). To further improve its adaptivity and flexibility, in this paper we propose an error-based knockoffs inference method that integrates knockoff features, error-based feature importance statistics, and a stepdown procedure. The proposed inference procedure does not require specifying a regression model and handles feature selection with theoretical guarantees on controlling the false discovery proportion (FDP), the FDR, or the k-familywise error rate (k-FWER). Empirical evaluations demonstrate the competitive performance of our approach on both simulated and real data.
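The abstract summarizes the pipeline at a high level: construct knockoff copies of the features, compute an error-based importance statistic for each feature, and select features by a rule that controls an error rate. The sketch below is a minimal illustration of this general idea, not the authors' exact procedure: it assumes independent Gaussian features (so fresh Gaussian noise serves as a valid knockoff copy), uses a permutation-based increase in prediction error as the importance statistic, and applies the standard knockoff+ threshold for FDR control rather than the stepdown procedure described in the paper.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Simulated data: independent Gaussian features, only the first 5 are relevant.
n, p = 500, 20
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = 2.0
y = X @ beta + rng.normal(size=n)

# For i.i.d. N(0, 1) columns, fresh Gaussian noise is a valid knockoff copy
# (a simplifying assumption; general designs need a proper knockoff sampler).
X_ko = rng.normal(size=(n, p))
X_aug = np.hstack([X, X_ko])          # augmented design [X, X_knockoff]

# Any regression model can be used -- the procedure is model-agnostic.
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_aug, y)
base_err = mean_squared_error(y, model.predict(X_aug))

def error_importance(col):
    """Increase in prediction error when one column is permuted."""
    X_perm = X_aug.copy()
    X_perm[:, col] = rng.permutation(X_perm[:, col])
    return mean_squared_error(y, model.predict(X_perm)) - base_err

# Antisymmetric statistic: original importance minus knockoff importance,
# so null features yield sign-symmetric W_j.
Z = np.array([error_importance(j) for j in range(p)])
Z_ko = np.array([error_importance(p + j) for j in range(p)])
W = Z - Z_ko

# Knockoff+ threshold at nominal FDR level q (standard knockoff filter,
# used here in place of the paper's stepdown procedure).
q = 0.2
threshold = np.inf
for t in np.sort(np.abs(W[W != 0])):
    fdp_hat = (1 + np.sum(W <= -t)) / max(np.sum(W >= t), 1)
    if fdp_hat <= q:
        threshold = t
        break
selected = np.where(W >= threshold)[0]
print("Selected features:", selected.tolist())

With the signal placed on the first five features, the selected set should concentrate on those indices; swapping in a different regression model changes only the fitted predictor, not the selection logic.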

Published

2022-06-28

How to Cite

Zhao, X., Chen, H., Wang, Y., Li, W., Gong, T., Wang, Y., & Zheng, F. (2022). Error-Based Knockoffs Inference for Controlled Feature Selection. Proceedings of the AAAI Conference on Artificial Intelligence, 36(8), 9190-9198. https://doi.org/10.1609/aaai.v36i8.20905

Issue

Vol. 36 No. 8 (2022)

Section

AAAI Technical Track on Machine Learning III