Optimal Attack against Autoregressive Models by Manipulating the Environment

Authors

  • Yiding Chen, University of Wisconsin-Madison
  • Xiaojin Zhu, University of Wisconsin-Madison

DOI:

https://doi.org/10.1609/aaai.v34i04.5760

Abstract

We describe an optimal adversarial attack formulation against autoregressive time series forecasting using the Linear Quadratic Regulator (LQR). In this threat model, the environment evolves according to a dynamical system; an autoregressive model observes the current environment state and predicts its future values; an attacker has the ability to modify the environment state in order to manipulate future autoregressive forecasts. The attacker's goal is to force the autoregressive forecasts to track a target trajectory while minimizing its attack expenditure. In the white-box setting, where the attacker knows the environment and forecast models, we present the optimal attack using LQR for linear models and Model Predictive Control (MPC) for nonlinear models. In the black-box setting, we combine system identification and MPC. Experiments demonstrate the effectiveness of our attacks.
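
To make the white-box linear case concrete: with linear dynamics, the attack reduces to a finite-horizon quadratic tracking problem, which is exactly what LQR solves. The sketch below is a minimal illustration, not the paper's implementation. It assumes dynamics x_{t+1} = A x_t + B u_t (u_t being the attacker's additive perturbation) and a cost penalizing deviation of the state from the attacker's target trajectory plus attack effort; for a finite horizon this can be solved in batch form by least squares, which is equivalent to the LQR dynamic-programming solution. The function name lqr_tracking_attack, the toy AR(2) matrices, and the cost weights are all illustrative assumptions.

```python
import numpy as np

def lqr_tracking_attack(A, B, x0, targets, Q, R):
    """Finite-horizon quadratic tracking attack, solved in batch form.

    Assumed (illustrative) model: x_{t+1} = A x_t + B u_t, where u_t is the
    attacker's additive perturbation of the environment state.
    Assumed cost: sum_t (x_t - r_t)' Q (x_t - r_t) + u_t' R u_t, i.e. drive
    the state toward the target trajectory r_t while penalizing attack effort.
    """
    T = len(targets)                       # attack horizon
    n, m = B.shape
    # Stack the dynamics: X = G x0 + H U, with X = [x_1; ...; x_T].
    G = np.zeros((T * n, n))
    H = np.zeros((T * n, T * m))
    Ak = np.eye(n)
    for t in range(T):
        Ak = A @ Ak                        # A^{t+1}
        G[t*n:(t+1)*n, :] = Ak
        for k in range(t + 1):
            H[t*n:(t+1)*n, k*m:(k+1)*m] = np.linalg.matrix_power(A, t - k) @ B
    r = np.concatenate(targets)
    # Weighted least squares: minimize ||Qs (G x0 + H U - r)||^2 + ||Rs U||^2.
    Qs = np.kron(np.eye(T), np.linalg.cholesky(Q).T)
    Rs = np.kron(np.eye(T), np.linalg.cholesky(R).T)
    coef = np.vstack([Qs @ H, Rs])
    rhs = np.concatenate([Qs @ (r - G @ x0), np.zeros(T * m)])
    U, *_ = np.linalg.lstsq(coef, rhs, rcond=None)
    return U.reshape(T, m)                 # optimal attack sequence u_0..u_{T-1}

# Toy example: an AR(2) forecaster in companion form, where the attacker
# perturbs the newest observation and tries to drag the forecast toward 2.
A = np.array([[0.6, 0.3], [1.0, 0.0]])
B = np.array([[1.0], [0.0]])
x0 = np.array([1.0, 0.8])
targets = [np.array([2.0, 2.0])] * 10
Q, R = np.eye(2), 0.1 * np.eye(1)
print(lqr_tracking_attack(A, B, x0, targets, Q, R).round(3))
```

In the nonlinear or black-box settings mentioned in the abstract, the same kind of objective would instead be re-solved repeatedly over a receding horizon (MPC), with the dynamics estimated by system identification when they are unknown.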

Published

2020-04-03

How to Cite

Chen, Y., & Zhu, X. (2020). Optimal Attack against Autoregressive Models by Manipulating the Environment. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 3545-3552. https://doi.org/10.1609/aaai.v34i04.5760

Section

AAAI Technical Track: Machine Learning