Semantic Representation Using Explicit Concept Space Models

Authors

  • Walid Shalaby University of North Carolina at Charlotte
  • Wlodek Zadrozny University of North Carolina at Charlotte

DOI:

https://doi.org/10.1609/aaai.v31i1.11097

Keywords:

Semantic relatedness, concept space models, bag-of-concepts, association rule mining

Abstract

Explicit concept space models have proven effective for text representation in many natural language and text mining applications. The idea is to embed textual structures into a semantic space of concepts that captures the main topics of those structures. Despite their wide applicability, existing models suffer from shortcomings such as sparsity and reliance on Wikipedia as the sole knowledge source from which concepts are extracted. In this paper we highlight some of these limitations. We also describe Mined Semantic Analysis (MSA), a novel concept space model that employs unsupervised learning to uncover implicit relations between concepts. MSA leverages the discovered concept-concept associations to enrich the semantic representations. We evaluate MSA's performance on benchmark data sets for measuring lexical semantic relatedness. Empirical results show superior performance of MSA compared to prior state-of-the-art methods.
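To make the abstract's idea concrete, the following is a minimal hedged sketch (not the paper's actual MSA implementation) of a bag-of-concepts representation: terms are mapped to sparse weighted concept vectors, enriched with mined concept-concept associations, and compared with cosine similarity. All concept names, weights, and the `alpha` scaling parameter here are illustrative assumptions.

```python
from math import sqrt

# Toy concept vectors (concept -> weight); a real model would derive these
# from a knowledge source such as Wikipedia. Values are hypothetical.
car = {"Automobile": 0.9, "Engine": 0.4}
truck = {"Vehicle": 0.8, "Engine": 0.5}

# Hypothetical mined associations: concept -> {related concept: strength}.
associations = {"Automobile": {"Vehicle": 0.7}}

def enrich(vec, assoc, alpha=0.5):
    """Add associated concepts to a vector, scaled by alpha (illustrative)."""
    out = dict(vec)
    for concept, w in vec.items():
        for rel, strength in assoc.get(concept, {}).items():
            out[rel] = out.get(rel, 0.0) + alpha * w * strength
    return out

def cosine(a, b):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(w * b.get(c, 0.0) for c, w in a.items())
    na = sqrt(sum(w * w for w in a.values()))
    nb = sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

base = cosine(car, truck)
enriched = cosine(enrich(car, associations), enrich(truck, associations))
assert enriched > base  # association enrichment raises measured relatedness
```

In this toy example the two terms initially overlap only on the "Engine" concept; adding the mined "Automobile"→"Vehicle" association creates additional overlap, which is the enrichment effect the abstract describes.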

Published

2017-02-12

How to Cite

Shalaby, W., & Zadrozny, W. (2017). Semantic Representation Using Explicit Concept Space Models. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.11097