TY - JOUR
AU - Suh, Yoon-Je
AU - Kim, Byung Hyung
PY - 2021/05/18
Y2 - 2022/01/27
TI - Riemannian Embedding Banks for Common Spatial Patterns with EEG-based SPD Neural Networks
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 35
IS - 1
SE - AAAI Technical Track on Cognitive Modeling and Cognitive Systems
UR - https://ojs.aaai.org/index.php/AAAI/article/view/16168
SP - 854-862
AB - Modeling non-linear data as symmetric positive definite (SPD) matrices on Riemannian manifolds has attracted much attention for various classification tasks. In the context of deep learning, SPD matrix-based Riemannian networks have been shown to be a promising solution for classifying electroencephalogram (EEG) signals, capturing the Riemannian geometry within their structured 2D feature representation. However, existing approaches usually learn spatial-temporal structures in an embedding space for all available EEG signals, and their optimization procedures rely on computationally expensive iterations. Furthermore, these approaches often struggle to encode all of the various types of relationships into a single distance metric, resulting in a loss of generality. To address the above limitations, we propose a Riemannian Embedding Banks (REB) method, which divides the problem of common spatial patterns learning in an entire embedding space into K subproblems and builds one model for each subproblem, to be combined with SPD neural networks. By leveraging the concept of the "separate to learn" strategy on a Riemannian manifold, REB divides the data and the embedding space into K non-overlapping subsets and learns K separate distance metrics in a Riemannian geometric space instead of the vector space. Then, the learned K non-overlapping subsets are grouped into neurons in the SPD neural network's embedding layer. Experimental results on public EEG datasets demonstrate the superiority of the proposed approach for learning common spatial patterns of EEG signals despite their non-stationary nature, increasing the convergence speed while maintaining generalization.
ER -