A Multi-Task Learning Approach for Improving Product Title Compression with User Search Log Data

Authors

  • Jingang Wang iDST, Alibaba Group
  • Junfeng Tian East China Normal University
  • Long Qiu Onehome (Beijing) Network Technology Co. Ltd.
  • Sheng Li iDST, Alibaba Group
  • Jun Lang iDST, Alibaba Group
  • Luo Si iDST, Alibaba Group
  • Man Lan East China Normal University

DOI:

https://doi.org/10.1609/aaai.v32i1.11264

Keywords:

Sentence Summarization, Product Title Compression, Multi-task Learning

Abstract

Obtaining effective compression of lengthy product titles is a challenging and practical research problem in E-commerce. It is particularly important as more and more users browse mobile E-commerce apps, while merchants pad original product titles with redundant terms for search engine optimization. Traditional text summarization approaches often incur high preprocessing costs and do not capture conversion rate, a key concern in E-commerce. This paper proposes a novel multi-task learning approach that improves product title compression with user search log data. In particular, a pointer-network-based sequence-to-sequence model with an attention mechanism serves as the extractive title-compression method, while an attentive encoder-decoder model generates user search queries. The encoding parameters (i.e., the semantic embeddings of original titles) are shared between the two tasks, and the attention distributions are jointly optimized. Extensive experiments on both human-annotated data and an online deployment demonstrate the advantages of the proposed approach in both compression quality and online business value.
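
The abstract's setup can be illustrated with a minimal, self-contained sketch: a shared encoder produces one set of title encodings, a pointer-style attention over those encodings selects title words (extractive compression), and a separate attention drives a query-generation step, with a divergence term standing in for the joint optimization of the two attention distributions. All names, dimensions, and the toy encoder below are illustrative assumptions, not the authors' implementation.

```python
import math
import random

random.seed(0)

D = 8  # toy embedding dimension (illustrative)

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def rand_vec():
    return [random.gauss(0, 1) for _ in range(D)]

# Shared encoder parameters: one embedding table used by both tasks
# (a stand-in for the shared semantic embedding of original titles).
emb = {w: rand_vec() for w in range(50)}

title = [3, 17, 5, 29, 11]          # toy product-title token ids
enc = [emb[w] for w in title]       # shared encodings, reused below

def attention(states, query):
    """Dot-product attention distribution over encoder states."""
    return softmax([dot(h, query) for h in states])

# Task 1 (extractive compression): pointer attention over source words;
# here we keep words whose attention weight is above the uniform average.
ptr_attn = attention(enc, rand_vec())
kept = [w for w, p in zip(title, ptr_attn) if p > 1.0 / len(title)]

# Task 2 (query generation): a decoder step attends over the same encodings.
gen_attn = attention(enc, rand_vec())

# Joint-optimization idea: the paper optimizes the two attention
# distributions jointly; a KL term measures how much they disagree.
agreement_kl = sum(p * math.log(p / q) for p, q in zip(ptr_attn, gen_attn))

print(len(kept), round(agreement_kl, 4))
```

The point of the sketch is the parameter sharing: both attention distributions are computed over the same `enc`, so gradients from the query-generation task would also shape the title encodings used for compression.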

Published

2018-04-25

How to Cite

Wang, J., Tian, J., Qiu, L., Li, S., Lang, J., Si, L., & Lan, M. (2018). A Multi-Task Learning Approach for Improving Product Title Compression with User Search Log Data. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.11264