Bridging Towers of Multi-task Learning with a Gating Mechanism for Aspect-based Sentiment Analysis and Sequential Metaphor Identification
Keywords: Text Classification & Sentiment Analysis, Lexical & Frame Semantics, Semantic Parsing
Abstract
Multi-task learning (MTL) has been widely applied in Natural Language Processing. A major task and its associated auxiliary tasks share the same encoder; hence, an MTL encoder can learn abstract information shared between the major and auxiliary tasks. Task-specific towers are then employed on top of the shared encoder to learn task-specific information. Previous works demonstrated that exchanging information between task-specific towers yields extra gains; this is known as soft-parameter-sharing MTL. In this paper, we propose a novel gating mechanism for bridging MTL towers. Our method is evaluated on aspect-based sentiment analysis and sequential metaphor identification tasks. The experiments demonstrate that our method yields better performance than the baselines on both tasks. Using the same Transformer backbone, we also compare our gating mechanism with other information-transfer mechanisms, e.g., cross-stitch, attention, and vanilla gating. The experiments show that our method surpasses these baselines as well.
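The tower-bridging idea in the abstract can be sketched in a few lines. The snippet below is a minimal NumPy illustration of a generic learned gate that fuses the hidden states of two task-specific towers; the function name `gated_bridge`, the parameters `W_g`/`b_g`, and the exact fusion form are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_bridge(h_a, h_b, W_g, b_g):
    """Fuse tower B's hidden state into tower A's via a learned gate.

    The gate g (one value per dimension, in (0, 1)) controls how much
    cross-task information flows between the two towers. This is a
    generic gating sketch, not the paper's exact mechanism.
    """
    g = sigmoid(np.concatenate([h_a, h_b]) @ W_g + b_g)
    return g * h_a + (1.0 - g) * h_b  # elementwise convex combination

rng = np.random.default_rng(0)
d = 4
h_a = rng.standard_normal(d)            # hidden state from task-A tower
h_b = rng.standard_normal(d)            # hidden state from task-B tower
W_g = rng.standard_normal((2 * d, d)) * 0.1  # hypothetical gate weights
b_g = np.zeros(d)

fused = gated_bridge(h_a, h_b, W_g, b_g)
print(fused.shape)  # (4,)
```

Because the gate output lies in (0, 1), each fused dimension is an elementwise interpolation between the two towers' states, which is what lets the model modulate, rather than fully copy, cross-task information.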
How to Cite
Mao, R., & Li, X. (2021). Bridging Towers of Multi-task Learning with a Gating Mechanism for Aspect-based Sentiment Analysis and Sequential Metaphor Identification. Proceedings of the AAAI Conference on Artificial Intelligence, 35(15), 13534-13542. https://doi.org/10.1609/aaai.v35i15.17596
AAAI Technical Track on Speech and Natural Language Processing II