LATTE: Latent Type Modeling for Biomedical Entity Linking


  • Ming Zhu Virginia Tech
  • Busra Celikkaya AWS AI
  • Parminder Bhatia AWS AI
  • Chandan K. Reddy Virginia Tech



Entity linking is the task of linking mentions of named entities in natural language text to entities in a curated knowledge base. This is of significant importance in the biomedical domain, where it can be used to semantically annotate a large volume of clinical records and biomedical literature with standardized concepts described in an ontology such as the Unified Medical Language System (UMLS). We observe that with precise type information, entity disambiguation becomes a straightforward task. However, fine-grained type information is usually not available in the biomedical domain. Thus, we propose LATTE, a LATent Type Entity linking model that improves entity linking by modeling latent fine-grained type information about mentions and entities. Unlike previous methods that perform entity linking directly between mentions and entities, LATTE jointly performs entity disambiguation and latent fine-grained type learning, without direct supervision. We evaluate our model on two biomedical datasets: MedMentions, a large-scale public dataset annotated with UMLS concepts, and a de-identified corpus of dictated doctor's notes annotated with ICD concepts. Extensive experimental evaluation shows that our model achieves significant performance improvements over several state-of-the-art techniques.
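To make the abstract's key observation concrete — that precise fine-grained type information makes disambiguation nearly trivial — here is a minimal toy sketch. This is not the LATTE model itself; the function, candidate list, and type labels are illustrative assumptions, showing only how a type filter can collapse an ambiguous candidate set to a single plausible entity.

```python
def disambiguate(mention, mention_type, candidates):
    """Toy disambiguation: filter candidates by fine-grained type,
    then rank the survivors by token overlap with the mention.

    candidates: list of (entity_name, entity_type) pairs.
    This is an illustration of the observation in the abstract,
    not the paper's actual model.
    """
    typed = [(n, t) for n, t in candidates if t == mention_type]
    pool = typed or candidates  # fall back if no candidate matches the type

    def overlap(name):
        return len(set(name.lower().split()) & set(mention.lower().split()))

    return max(pool, key=lambda c: overlap(c[0]))[0]


# "cold" is ambiguous on its own; a precise type such as
# "Disease or Syndrome" (a UMLS semantic type) resolves it.
candidates = [
    ("Cold Temperature", "Natural Phenomenon"),
    ("Common Cold", "Disease or Syndrome"),
    ("Cold Sensation", "Physiologic Function"),
]
print(disambiguate("cold", "Disease or Syndrome", candidates))  # → Common Cold
```

Without the type signal, all three candidates tie on surface similarity; with it, only one survives the filter, which is precisely why LATTE tries to learn such latent types when they are not annotated.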




How to Cite

Zhu, M., Celikkaya, B., Bhatia, P., & Reddy, C. K. (2020). LATTE: Latent Type Modeling for Biomedical Entity Linking. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 9757-9764.



AAAI Technical Track: Natural Language Processing