Fully-Connected Tensor Network Decomposition and Its Application to Higher-Order Tensor Completion
Keywords: Matrix & Tensor Methods, Applications
Abstract
The popular tensor train (TT) and tensor ring (TR) decompositions have achieved promising results in science and engineering. However, TT and TR decompositions establish an operation only between two adjacent factors and are highly sensitive to the permutation of tensor modes, leading to an inadequate and inflexible representation. In this paper, we propose a generalized tensor decomposition, which decomposes an Nth-order tensor into a set of Nth-order factors and establishes an operation between any two factors. Since it can be graphically interpreted as a fully-connected network, we name it the fully-connected tensor network (FCTN) decomposition. The strengths of the FCTN decomposition lie in its capability to adequately characterize the intrinsic correlations between any two modes of a tensor and its essential invariance under transposition. Furthermore, we apply the FCTN decomposition to a representative task, tensor completion, and develop an efficient solving algorithm based on proximal alternating minimization. Theoretically, we prove the convergence of the developed algorithm, i.e., the sequence it generates globally converges to a critical point. Experimental results substantiate that the proposed method compares favorably to state-of-the-art methods based on other tensor decompositions.
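To make the structure of the decomposition concrete, the following is a minimal sketch (not the authors' code) of reconstructing a 3rd-order tensor from an FCTN decomposition. For N = 3, each of the three factors is itself 3rd-order, and every pair of factors shares one bond dimension (here the hypothetical ranks r12, r13, r23); the reconstruction contracts all shared bonds, which `numpy.einsum` expresses directly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Mode sizes of the target 3rd-order tensor and illustrative FCTN ranks.
I1, I2, I3 = 4, 5, 6
r12, r13, r23 = 2, 3, 2

# Each factor is 3rd-order: one "physical" mode plus one bond to every other factor.
G1 = rng.standard_normal((I1, r12, r13))   # modes (i, a, b)
G2 = rng.standard_normal((r12, I2, r23))   # modes (a, j, c)
G3 = rng.standard_normal((r13, r23, I3))   # modes (b, c, k)

# Contract every shared bond (a, b, c); each pair of factors is connected,
# hence the "fully-connected" network.
T = np.einsum('iab,ajc,bck->ijk', G1, G2, G3)
print(T.shape)  # (4, 5, 6)
```

Note that for N = 3 the fully-connected network coincides with a tensor ring; the distinction between FCTN and TT/TR only appears for N >= 4, where FCTN adds bonds between non-adjacent factors.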
How to Cite
Zheng, Y.-B., Huang, T.-Z., Zhao, X.-L., Zhao, Q., & Jiang, T.-X. (2021). Fully-Connected Tensor Network Decomposition and Its Application to Higher-Order Tensor Completion. Proceedings of the AAAI Conference on Artificial Intelligence, 35(12), 11071-11078. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/17321
AAAI Technical Track on Machine Learning V