TY - JOUR
AU - Peng, Xiaochang
AU - Gildea, Daniel
AU - Satta, Giorgio
PY - 2018/04/26
Y2 - 2024/03/29
TI - AMR Parsing With Cache Transition Systems
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 32
IS - 1
SE - Main Track: NLP and Knowledge Representation
DO - 10.1609/aaai.v32i1.11922
UR - https://ojs.aaai.org/index.php/AAAI/article/view/11922
SP - 
AB - In this paper, we present a transition system that generalizes transition-based dependency parsing techniques to generate AMR graphs rather than tree structures. In addition to a buffer and a stack, we use a fixed-size cache, and allow the system to build arcs to any vertices present in the cache at the same time. The size of the cache provides a parameter that can trade off between the complexity of the graphs that can be built and the ease of predicting actions during parsing. Our results show that a cache transition system can cover almost all AMR graphs with a small cache size, and our end-to-end system achieves competitive results in comparison with other transition-based approaches for AMR parsing.
ER - 