Improving Search Engine Efficiency through Contextual Factor Selection

Authors

  • Anxiang Zeng, Nanyang Technological University
  • Han Yu, Nanyang Technological University
  • Qing Da, Alibaba Group
  • Yusen Zhan, Alibaba Group
  • Yang Yu, Nanjing University
  • Jingren Zhou, Alibaba Group
  • Chunyan Miao, Nanyang Technological University

DOI:

https://doi.org/10.1609/aimag.v42i2.15099

Abstract

Learning to rank (LTR) is an important artificial intelligence (AI) approach supporting the operation of many search engines. In large-scale search systems, ranking results are continually improved by introducing more factors for LTR to consider. However, the more factors that are considered, the more computational resources are required, which in turn increases system response latency. Removing redundant factors can therefore significantly improve search engine efficiency. In this paper, we report on our experience incorporating our Contextual Factor Selection (CFS) deep reinforcement learning approach into the Taobao e-commerce platform, which optimizes the selection of factors based on the context of each search query, maintaining search result quality while significantly reducing latency. Online deployment on Taobao.com demonstrated that CFS reduces average search latency under everyday use scenarios by more than 40% compared to the previous approach, with comparable search result quality. Under peak usage during the Singles' Day Shopping Festival (November 11th) in 2017, CFS reduced average search latency by 20% compared to the previous approach.
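
To make the idea concrete, the sketch below shows a toy contextual factor-selection policy trained with REINFORCE: a linear policy maps query-context features to a Bernoulli on/off mask over ranking factors, and the reward trades a toy quality signal against a per-factor latency cost. This is a minimal illustration only, not the authors' implementation; all names, dimensions, and the reward definition (QUERY_DIM, N_FACTORS, LATENCY_WEIGHT, the hidden usefulness matrix U) are hypothetical, whereas the production CFS system uses a deep reinforcement learning policy inside Taobao's ranking pipeline.

```python
# Toy sketch of contextual factor selection (NOT the paper's implementation).
# A linear policy maps query-context features to per-factor selection
# probabilities; REINFORCE maximizes (toy quality) - (latency cost).
import numpy as np

QUERY_DIM = 8          # hypothetical size of the query-context feature vector
N_FACTORS = 20         # hypothetical number of candidate ranking factors
LATENCY_WEIGHT = 1.0   # hypothetical per-factor compute cost traded against quality

rng = np.random.default_rng(0)
W = np.zeros((QUERY_DIM, N_FACTORS))          # policy parameters
U = rng.normal(size=(QUERY_DIM, N_FACTORS))   # hidden "usefulness" of factors (toy environment)


def select_factors(context):
    """Sample a binary on/off mask over ranking factors for one query."""
    probs = 1.0 / (1.0 + np.exp(-context @ W))            # sigmoid per factor
    mask = (rng.random(N_FACTORS) < probs).astype(float)
    return mask, probs


def reward(context, mask):
    """Toy reward: quality contributed by selected factors minus latency cost."""
    quality = float(mask @ np.maximum(context @ U, 0.0))
    latency_cost = LATENCY_WEIGHT * mask.sum()
    return quality - latency_cost


def train(steps=2000, lr=0.05):
    """REINFORCE with a running-average baseline."""
    global W
    baseline = 0.0
    for _ in range(steps):
        context = rng.normal(size=QUERY_DIM)
        mask, probs = select_factors(context)
        r = reward(context, mask)
        baseline = 0.9 * baseline + 0.1 * r
        # Gradient of log-probability for a factorized Bernoulli policy:
        # d log pi / d W = outer(context, mask - probs)
        W += lr * (r - baseline) * np.outer(context, mask - probs)


if __name__ == "__main__":
    train()
    ctx = rng.normal(size=QUERY_DIM)
    mask, _ = select_factors(ctx)
    print(f"selected {int(mask.sum())} of {N_FACTORS} factors for this query")
```

In the deployed system the same principle applies at much larger scale: the policy observes the context of each incoming query and switches off ranking factors whose expected contribution to result quality does not justify their computational cost.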

Published

2021-10-20

How to Cite

Zeng, A., Yu, H., Da, Q., Zhan, Y., Yu, Y., Zhou, J., & Miao, C. (2021). Improving Search Engine Efficiency through Contextual Factor Selection. AI Magazine, 42(2), 50-58. https://doi.org/10.1609/aimag.v42i2.15099

Issue

Vol. 42 No. 2 (2021)

Section

Special Topic Articles