Entity-Agnostic Representation Learning for Parameter-Efficient Knowledge Graph Embedding
Keywords: DMKM: Linked Open Data, Knowledge Graphs & KB Completion, DMKM: Semantic Web
Abstract
We propose an entity-agnostic representation learning method to address the inefficient parameter storage costs incurred by embedding knowledge graphs. Conventional knowledge graph embedding methods map elements of a knowledge graph, including entities and relations, into continuous vector spaces by assigning each one or more specific embeddings (i.e., vector representations). Thus the number of embedding parameters grows linearly with the size of the knowledge graph. In our proposed model, Entity-Agnostic Representation Learning (EARL), we learn embeddings for only a small set of entities, which we refer to as reserved entities. To obtain embeddings for the full set of entities, we encode their distinguishable information from their connected relations, their k-nearest reserved entities, and their multi-hop neighbors. We learn universal, entity-agnostic encoders that transform this distinguishable information into entity embeddings. This approach gives EARL a static, efficient, and smaller parameter count than conventional knowledge graph embedding methods. Experimental results show that EARL uses fewer parameters and performs better on link prediction tasks than baselines, reflecting its parameter efficiency.
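The core idea above can be sketched in a few lines: only reserved entities and relations get learned embedding tables, while every other entity's embedding is computed on the fly by a shared encoder from its connected relations and nearest reserved entities. This is a minimal illustrative sketch, not the paper's actual architecture; the mean-pooling, the single linear layer, and all names (`encode_entity`, `reserved_emb`, etc.) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

num_reserved = 100   # only this small set of entities has learned embeddings (assumption)
num_relations = 50
dim = 32

# Learned parameter tables: reserved entities and relations only.
# Parameter count is independent of the total number of entities.
reserved_emb = rng.normal(size=(num_reserved, dim))
relation_emb = rng.normal(size=(num_relations, dim))

# A shared, entity-agnostic encoder (a single linear map here for simplicity).
W = rng.normal(size=(2 * dim, dim)) / np.sqrt(2 * dim)

def encode_entity(connected_relations, k_nearest_reserved):
    """Compute an embedding for an arbitrary entity from its
    connected relations and its k-nearest reserved entities."""
    rel_feat = relation_emb[connected_relations].mean(axis=0)
    res_feat = reserved_emb[k_nearest_reserved].mean(axis=0)
    return np.tanh(np.concatenate([rel_feat, res_feat]) @ W)

# Any entity, seen or unseen, is encoded with the same shared parameters.
e = encode_entity(connected_relations=[3, 7, 12], k_nearest_reserved=[5, 41, 88])
print(e.shape)  # (32,)
```

Because the encoder is shared across all entities, the stored parameters stay fixed as the entity set grows, which is the source of the parameter efficiency the abstract describes.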
How to Cite
Chen, M., Zhang, W., Yao, Z., Zhu, Y., Gao, Y., Pan, J. Z., & Chen, H. (2023). Entity-Agnostic Representation Learning for Parameter-Efficient Knowledge Graph Embedding. Proceedings of the AAAI Conference on Artificial Intelligence, 37(4), 4182-4190. https://doi.org/10.1609/aaai.v37i4.25535
AAAI Technical Track on Data Mining and Knowledge Management