Regularized Modal Regression on Markov-Dependent Observations: A Theoretical Assessment

Authors

  • Tieliang Gong, Xi'an Jiaotong University
  • Yuxin Dong, Xi'an Jiaotong University
  • Hong Chen, Huazhong Agricultural University
  • Wei Feng, Xi'an Jiaotong University
  • Bo Dong, Xi'an Jiaotong University
  • Chen Li, Xi'an Jiaotong University

DOI:

https://doi.org/10.1609/aaai.v36i6.20627

Keywords:

Machine Learning (ML)

Abstract

Modal regression, a widely used regression protocol, has been extensively investigated in the statistics and machine learning communities due to its robustness to outliers and heavy-tailed noise. Understanding the theoretical behavior of modal regression is fundamental in learning theory. Despite significant progress in characterizing its statistical properties, most results rest on the assumption that samples are independent and identically distributed (i.i.d.), which is too restrictive for real-world applications. This paper studies the statistical properties of regularized modal regression (RMR) under an important dependence structure: Markov dependence. Specifically, we establish an upper bound for the RMR estimator under moderate conditions and give an explicit learning rate. Our results show that Markov dependence affects the generalization error by discounting the effective sample size by a multiplicative factor depending on the spectral gap of the underlying Markov chain. This result sheds new light on the theoretical underpinnings of robust regression.
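The abstract's main message is that Markov dependence discounts the effective sample size by a factor tied to the spectral gap of the chain. The following minimal sketch illustrates this idea on a two-state Markov chain; the exact form of the discount in the paper may differ, and the proxy `n_eff = gamma * n` here is an assumption for illustration only.

```python
import numpy as np

# Transition matrix of a simple two-state Markov chain (illustrative choice).
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])

# Eigenvalues of P, sorted by absolute value: the largest is 1, and the
# absolute spectral gap is 1 minus the second-largest absolute eigenvalue.
eigvals = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
gamma = 1.0 - eigvals[1]

# Hypothetical "effective" sample size: the nominal sample size n is
# discounted by a multiplicative factor depending on the spectral gap.
n = 1000
n_eff = gamma * n

print(f"spectral gap: {gamma:.2f}, effective sample size: {n_eff:.0f}")
```

For this chain the second eigenvalue is 0.8, so the spectral gap is 0.2 and only a fifth of the nominal sample size counts toward the generalization bound under this proxy; a faster-mixing chain (larger gap) would be discounted less.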

Published

2022-06-28

How to Cite

Gong, T., Dong, Y., Chen, H., Feng, W., Dong, B., & Li, C. (2022). Regularized Modal Regression on Markov-Dependent Observations: A Theoretical Assessment. Proceedings of the AAAI Conference on Artificial Intelligence, 36(6), 6721-6728. https://doi.org/10.1609/aaai.v36i6.20627

Section

AAAI Technical Track on Machine Learning I