Error Diversity Matters: An Error-Resistant Ensemble Method for Unsupervised Dependency Parsing

Authors

  • Behzad Shayegh — Dept. of Computing Science, Alberta Machine Intelligence Institute (Amii), University of Alberta
  • Hobie H.-B. Lee — Dept. of Computing Science, Alberta Machine Intelligence Institute (Amii), University of Alberta
  • Xiaodan Zhu — Dept. of Electrical and Computer Engineering & Ingenuity Labs Research Institute, Queen’s University
  • Jackie Chi Kit Cheung — Quebec Artificial Intelligence Institute (Mila), McGill University; Canada CIFAR AI Chair
  • Lili Mou — Dept. of Computing Science, Alberta Machine Intelligence Institute (Amii), University of Alberta; Canada CIFAR AI Chair

DOI:

https://doi.org/10.1609/aaai.v39i23.34697

Abstract

We address unsupervised dependency parsing by building an ensemble of diverse existing models through post hoc aggregation of their output dependency parse structures. We observe that these ensembles often suffer from low robustness against weak ensemble components due to error accumulation. To tackle this problem, we propose an efficient ensemble-selection approach that considers error diversity and avoids error accumulation. Results demonstrate that our approach outperforms each individual model as well as previous ensemble techniques. Additionally, our experiments show that the proposed ensemble-selection method significantly enhances the performance and robustness of our ensemble, surpassing previously proposed strategies, which do not account for error diversity.

Published

2025-04-11

How to Cite

Shayegh, B., Lee, H. H.-B., Zhu, X., Cheung, J. C. K., & Mou, L. (2025). Error Diversity Matters: An Error-Resistant Ensemble Method for Unsupervised Dependency Parsing. Proceedings of the AAAI Conference on Artificial Intelligence, 39(23), 25119–25127. https://doi.org/10.1609/aaai.v39i23.34697

Section

AAAI Technical Track on Natural Language Processing II