A Word Embedding and a Josa Vector for Korean Unsupervised Semantic Role Induction
DOI:
https://doi.org/10.1609/aaai.v30i1.9923
Abstract
We propose an unsupervised semantic role labeling method for Korean, an agglutinative language whose complex suffix structure conveys much of the syntax. First, we construct an argument embedding and then develop an indicator vector for the suffix, such as a Josa. We then construct an argument tuple by concatenating these two vectors. Role induction is performed by clustering the argument tuples. This method achieves up to a 70.16% F1-score and 75.85% accuracy.
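The pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' code: the Josa inventory, the embedding dimensions, and the use of plain k-means for clustering are all assumptions made for the example.

```python
import numpy as np

# Hypothetical Josa inventory; a real system would take this from a
# morphological analyzer's tag set.
JOSAS = ["이/가", "을/를", "에", "에서", "로"]

def josa_vector(josa):
    """One-hot indicator vector for the Josa attached to an argument."""
    v = np.zeros(len(JOSAS))
    v[JOSAS.index(josa)] = 1.0
    return v

def argument_tuple(word_embedding, josa):
    """Concatenate the argument's word embedding with its Josa vector."""
    return np.concatenate([word_embedding, josa_vector(josa)])

def kmeans(X, k, iters=20, seed=0):
    """Tiny k-means; role induction clusters the argument tuples."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Toy usage with random 50-dimensional "embeddings" (illustrative only):
rng = np.random.default_rng(1)
emb = {"학생": rng.normal(size=50), "학교": rng.normal(size=50)}
X = np.stack([argument_tuple(emb["학생"], "이/가"),
              argument_tuple(emb["학교"], "에서")])
labels = kmeans(X, k=2)
```

Each induced cluster is then treated as one semantic role; the quality figures above come from comparing these clusters against gold role labels.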
Published
2016-03-05
How to Cite
Nam, K.-M., & Kim, Y.-S. (2016). A Word Embedding and a Josa Vector for Korean Unsupervised Semantic Role Induction. Proceedings of the AAAI Conference on Artificial Intelligence, 30(1). https://doi.org/10.1609/aaai.v30i1.9923
Issue
Section
Student Abstracts and Posters