Semi-IIN: Semi-Supervised Intra-Inter Modal Interaction Learning Network for Multimodal Sentiment Analysis
DOI:
https://doi.org/10.1609/aaai.v39i2.32131
Abstract
Although multimodal sentiment analysis is a fertile research area that merits further investigation, current approaches incur high annotation costs and suffer from label ambiguity, which hinders the acquisition of high-quality labeled data. Furthermore, selecting the right interactions is essential, because the relative importance of intra- and inter-modal interactions can differ from sample to sample. To this end, we propose Semi-IIN, a Semi-supervised Intra-inter modal Interaction learning Network for multimodal sentiment analysis. Semi-IIN integrates masked attention and gating mechanisms, enabling effective dynamic selection after independently capturing intra- and inter-modal interactive information. Combined with a self-training approach, Semi-IIN fully exploits the knowledge learned from unlabeled data. Experimental results on two public datasets, MOSI and MOSEI, demonstrate the effectiveness of Semi-IIN, establishing a new state-of-the-art on several metrics.
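The gated dynamic selection described above can be illustrated with a minimal sketch. This is not the paper's implementation; it only shows, under assumed shapes and a hypothetical `gated_fusion` helper, how a learned sigmoid gate could weight an intra-modal feature vector against an inter-modal one per dimension:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(h_intra, h_inter, W, b):
    """Convexly combine intra- and inter-modal features via a learned gate.

    Hypothetical sketch: the gate g in (0, 1) is computed from the
    concatenated features, so each dimension of the output interpolates
    between the two interaction types.
    """
    g = sigmoid(np.concatenate([h_intra, h_inter]) @ W + b)  # per-dimension gate
    return g * h_intra + (1.0 - g) * h_inter

d = 8  # assumed feature dimensionality
h_intra = rng.standard_normal(d)          # intra-modal interaction features
h_inter = rng.standard_normal(d)          # inter-modal interaction features
W = rng.standard_normal((2 * d, d)) * 0.1  # gate weights (would be learned)
b = np.zeros(d)                            # gate bias

fused = gated_fusion(h_intra, h_inter, W, b)
print(fused.shape)  # (8,)
```

Because the gate produces a per-dimension convex combination, each output component lies between the corresponding intra- and inter-modal values, which is what allows the network to emphasize whichever interaction type matters for a given sample.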
Published
2025-04-11
How to Cite
Lin, J., Wang, Y., Xu, Y., & Liu, Q. (2025). Semi-IIN: Semi-Supervised Intra-Inter Modal Interaction Learning Network for Multimodal Sentiment Analysis. Proceedings of the AAAI Conference on Artificial Intelligence, 39(2), 1411–1419. https://doi.org/10.1609/aaai.v39i2.32131
Section
AAAI Technical Track on Cognitive Modeling & Cognitive Systems