Fully-Connected Tensor Network Decomposition and Its Application to Higher-Order Tensor Completion

Authors

  • Yu-Bang Zheng School of Mathematical Sciences, University of Electronic Science and Technology of China
  • Ting-Zhu Huang School of Mathematical Sciences, University of Electronic Science and Technology of China
  • Xi-Le Zhao School of Mathematical Sciences, University of Electronic Science and Technology of China
  • Qibin Zhao Tensor Learning Team, RIKEN Center for Advanced Intelligence Project (AIP); School of Automation, Guangdong University of Technology
  • Tai-Xiang Jiang School of Economic Information Engineering, Southwestern University of Finance and Economics

DOI:

https://doi.org/10.1609/aaai.v35i12.17321

Keywords:

Matrix & Tensor Methods, Applications

Abstract

The popular tensor train (TT) and tensor ring (TR) decompositions have achieved promising results in science and engineering. However, TT and TR decompositions only establish an operation between two adjacent factors and are highly sensitive to the permutation of tensor modes, leading to an inadequate and inflexible representation. In this paper, we propose a generalized tensor decomposition, which decomposes an Nth-order tensor into a set of Nth-order factors and establishes an operation between any two factors. Since it can be graphically interpreted as a fully-connected network, we name it the fully-connected tensor network (FCTN) decomposition. The advantages of the FCTN decomposition lie in its outstanding capability to adequately characterize the intrinsic correlations between any two modes of a tensor and its essential invariance under mode transposition. Furthermore, we apply the FCTN decomposition to one representative task, i.e., tensor completion, and develop an efficient solving algorithm based on proximal alternating minimization. Theoretically, we prove the convergence of the developed algorithm, i.e., the sequence it generates globally converges to a critical point. Experimental results substantiate that the proposed method compares favorably with state-of-the-art methods based on other tensor decompositions.
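To make the structure described above concrete, the following is a minimal sketch (our own illustration, not the authors' released code) of how a 3rd-order tensor is reconstructed from three 3rd-order FCTN factors, where every pair of factors shares exactly one rank index. The factor shapes, rank values, and function name are assumed for illustration only.

import numpy as np

def fctn_reconstruct_3d(G1, G2, G3):
    """Contract three FCTN factors into a full I1 x I2 x I3 tensor.

    Factor shapes (illustrative): G1 is I1 x R12 x R13,
    G2 is R12 x I2 x R23, G3 is R13 x R23 x I3.
    Every pair of factors shares one rank index:
    G1-G2 share r12, G1-G3 share r13, G2-G3 share r23.
    """
    # index letters: a=i1, b=i2, c=i3, p=r12, q=r13, s=r23
    return np.einsum('apq,pbs,qsc->abc', G1, G2, G3)

# Tiny usage example with random factors and assumed ranks
I1, I2, I3 = 4, 5, 6
R12, R13, R23 = 2, 3, 2
G1 = np.random.rand(I1, R12, R13)
G2 = np.random.rand(R12, I2, R23)
G3 = np.random.rand(R13, R23, I3)
X = fctn_reconstruct_3d(G1, G2, G3)
print(X.shape)  # (4, 5, 6)

Unlike TT or TR, where only adjacent factors are contracted, here each factor carries a rank mode connecting it to every other factor, which is what makes the decomposition invariant to how the tensor modes are ordered.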

Published

2021-05-18

How to Cite

Zheng, Y.-B., Huang, T.-Z., Zhao, X.-L., Zhao, Q., & Jiang, T.-X. (2021). Fully-Connected Tensor Network Decomposition and Its Application to Higher-Order Tensor Completion. Proceedings of the AAAI Conference on Artificial Intelligence, 35(12), 11071-11078. https://doi.org/10.1609/aaai.v35i12.17321

Section

AAAI Technical Track on Machine Learning V