TY  - JOUR
AU  - Lan, Jianglin
AU  - Zheng, Yang
AU  - Lomuscio, Alessio
PY  - 2022/06/28
Y2  - 2024/03/28
TI  - Tight Neural Network Verification via Semidefinite Relaxations and Linear Reformulations
JF  - Proceedings of the AAAI Conference on Artificial Intelligence
JA  - AAAI
VL  - 36
IS  - 7
SE  - AAAI Technical Track on Machine Learning II
DO  - 10.1609/aaai.v36i7.20689
UR  - https://ojs.aaai.org/index.php/AAAI/article/view/20689
SP  - 7272-7280
AB  - We present a novel semidefinite programming (SDP) relaxation that enables tight and efficient verification of neural networks. The tightness is achieved by combining SDP relaxations with valid linear cuts, constructed by using the reformulation-linearisation technique (RLT). The computational efficiency results from a layerwise SDP formulation and an iterative algorithm for incrementally adding RLT-generated linear cuts to the verification formulation. The layer RLT-SDP relaxation here presented is shown to produce the tightest SDP relaxation for ReLU neural networks available in the literature. We report experimental results based on MNIST neural networks showing that the method outperforms the state-of-the-art methods while maintaining acceptable computational overheads. For networks of approximately 10k nodes (1k, respectively), the proposed method achieved an improvement in the ratio of certified robustness cases from 0% to 82% (from 35% to 70%, respectively).
ER  - 