ParamE: Regarding Neural Network Parameters as Relation Embeddings for Knowledge Graph Completion


  • Feihu Che Chinese Academy of Sciences
  • Dawei Zhang Institute of Automation, Chinese Academy of Sciences
  • Jianhua Tao Chinese Academy of Sciences
  • Mingyue Niu Chinese Academy of Sciences
  • Bocheng Zhao Chinese Academy of Sciences



We study the task of learning entity and relation embeddings in knowledge graphs for predicting missing links. Previous translational models for link prediction exploit translational properties but lack sufficient expressiveness, while the convolutional neural network based model ConvE takes advantage of the strong nonlinear fitting ability of neural networks but overlooks translational properties. In this paper, we propose a new knowledge graph embedding model called ParamE that combines both advantages. In ParamE, head entity embeddings, relation embeddings, and tail entity embeddings are regarded as the input, parameters, and output of a neural network, respectively. Since network parameters are effective in converting input to output, taking neural network parameters as relation embeddings makes ParamE both more expressive and translational. In addition, the entity and relation embeddings in ParamE come from feature space and parameter space respectively, which is consistent with the intuition that entities and relations should be mapped into two different spaces. We evaluate ParamE on the standard FB15k-237 and WN18RR datasets, and experiments show that ParamE significantly outperforms existing state-of-the-art models such as ConvE, SACN, RotatE, and D4-STE/Gumbel.
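The core idea above can be sketched in a few lines of NumPy. This is a minimal illustration under assumed dimensions and an assumed one-layer network and dot-product score, not the paper's actual architecture or hyperparameters: a relation embedding is reshaped into the weights and bias of a small network, the head entity embedding is fed through it, and the output is compared to the tail entity embedding.

```python
import numpy as np

d = 4  # entity embedding dimension (assumption for illustration)

def score(head, relation_params, tail):
    """Score a triple (head, relation, tail) in the ParamE spirit.

    The relation embedding lives in parameter space: we reinterpret it
    as the weight matrix and bias of a one-layer network (an assumed,
    simplified stand-in for the paper's networks).
    """
    W = relation_params[: d * d].reshape(d, d)
    b = relation_params[d * d:]
    predicted_tail = np.tanh(head @ W + b)  # network output for this relation
    # Higher dot product => the triple is judged more plausible.
    return float(predicted_tail @ tail)

rng = np.random.default_rng(0)
head = rng.standard_normal(d)                # entity: feature space
relation = rng.standard_normal(d * d + d)    # relation: parameter space
tail = rng.standard_normal(d)

s = score(head, relation, tail)
print(f"triple score: {s:.4f}")
```

Note how each relation carries its own transformation of the head entity, which is what makes a parameter-space relation embedding more expressive than a fixed translation vector.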




How to Cite

Che, F., Zhang, D., Tao, J., Niu, M., & Zhao, B. (2020). ParamE: Regarding Neural Network Parameters as Relation Embeddings for Knowledge Graph Completion. Proceedings of the AAAI Conference on Artificial Intelligence, 34(03), 2774-2781.



AAAI Technical Track: Knowledge Representation and Reasoning