Efficient Verification of ReLU-Based Neural Networks via Dependency Analysis

Authors

  • Elena Botoeva, Imperial College London
  • Panagiotis Kouvaros, Imperial College London
  • Jan Kronqvist, Imperial College London
  • Alessio Lomuscio, Imperial College London
  • Ruth Misener, Imperial College London

DOI:

https://doi.org/10.1609/aaai.v34i04.5729

Abstract

We introduce an efficient method for the verification of ReLU-based feed-forward neural networks. We derive an automated procedure that exploits dependency relations between the ReLU nodes, thereby pruning the search tree that needs to be considered by MILP-based formulations of the verification problem. We augment the resulting algorithm with methods for input domain splitting and symbolic interval propagation. We present Venus, the resulting verification toolkit, and evaluate it on the ACAS collision avoidance networks and models trained on the MNIST and CIFAR-10 datasets. The experimental results indicate considerable gains over present state-of-the-art tools.
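To make the bound-propagation ingredient concrete, the sketch below shows plain (concrete) interval bound propagation through a ReLU network, a simpler cousin of the symbolic interval propagation the abstract mentions. Pre-activation bounds classify each ReLU as stably active, stably inactive, or unstable; only the unstable nodes need binary variables in a MILP encoding, which is what dependency-based pruning further reduces. This is an illustrative assumption about the mechanism, not the Venus implementation; the function name and toy network are made up for the example.

```python
import numpy as np

def interval_bounds(weights, biases, input_lb, input_ub):
    """Propagate interval bounds through a ReLU network (illustrative sketch).

    weights/biases: per-layer parameters of the affine maps x -> W @ x + b.
    Returns, for each layer, the pre-activation lower/upper bounds. A ReLU is
    stably active if its lower bound is >= 0, stably inactive if its upper
    bound is <= 0, and unstable otherwise.
    """
    lb = np.asarray(input_lb, dtype=float)
    ub = np.asarray(input_ub, dtype=float)
    layer_bounds = []
    for W, b in zip(weights, biases):
        W_pos, W_neg = np.maximum(W, 0.0), np.minimum(W, 0.0)
        pre_lb = W_pos @ lb + W_neg @ ub + b   # smallest attainable pre-activation
        pre_ub = W_pos @ ub + W_neg @ lb + b   # largest attainable pre-activation
        layer_bounds.append((pre_lb, pre_ub))
        # Post-activation bounds feed the next layer.
        lb, ub = np.maximum(pre_lb, 0.0), np.maximum(pre_ub, 0.0)
    return layer_bounds

# Toy 2-2-1 network over the input box [-1, 1]^2 (hypothetical example).
W1 = np.array([[1.0, -1.0], [0.5, 2.0]]); b1 = np.array([0.0, 3.0])
W2 = np.array([[1.0, 1.0]]);              b2 = np.array([0.0])
bounds = interval_bounds([W1, W2], [b1, b2], [-1, -1], [1, 1])
for i, (lo, hi) in enumerate(bounds[:-1]):
    stable = int(np.sum((lo >= 0) | (hi <= 0)))
    print(f"hidden layer {i}: {stable} of {lo.size} ReLUs stable")
out_lo, out_hi = bounds[-1]
print(f"output in [{out_lo[0]:.2f}, {out_hi[0]:.2f}]")
```

On this toy network, one of the two hidden ReLUs is stably active, so a MILP formulation would need a binary variable only for the remaining unstable node; dependency analysis between such unstable nodes is what the paper exploits to prune the search tree further.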

Published

2020-04-03

How to Cite

Botoeva, E., Kouvaros, P., Kronqvist, J., Lomuscio, A., & Misener, R. (2020). Efficient Verification of ReLU-Based Neural Networks via Dependency Analysis. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 3291-3299. https://doi.org/10.1609/aaai.v34i04.5729

Section

AAAI Technical Track: Machine Learning