SegFormer: A Topic Segmentation Model with Controllable Range of Attention

Authors

  • Haitao Bai, Xi'an Jiaotong University
  • Pinghui Wang, Xi'an Jiaotong University
  • Ruofei Zhang, Xi'an Jiaotong University
  • Zhou Su, Xi'an Jiaotong University

DOI:

https://doi.org/10.1609/aaai.v37i11.26477

Keywords:

SNLP: Applications, SNLP: Information Extraction, SNLP: Language Models, SNLP: Machine Translation & Multilinguality, SNLP: Sentence-Level Semantics and Textual Inference, SNLP: Summarization, SNLP: Text Classification, SNLP: Text Mining

Abstract

Topic segmentation aims to reveal the latent structure of a document and divide it into multiple parts. However, current neural solutions are limited in how they model sentence context and represent candidate boundary features, which leads to inefficient sentence context encoding and interference from noisy information. In this paper, we design a new text segmentation model, SegFormer, with unidirectional attention blocks to better model sentence representations. To alleviate interference from noisy information, SegFormer uses a novel additional context aggregator and a topic classification loss to guide the model to aggregate information within an appropriate range. In addition, SegFormer applies an iterative prediction algorithm to progressively search for optimal boundaries. We evaluate SegFormer's generalization ability, multilingual ability, and practical applicability on multiple challenging real-world datasets. Experiments show that our model significantly improves performance on the WIKI-SECTION benchmark by 7.5% over several strong baselines. Applying SegFormer to a real-world dataset to separate normal and advertisement segments in product marketing essays also yields superior performance compared with other cutting-edge models.
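The idea of a controllable range of attention can be illustrated with a small sketch. The snippet below is illustrative only: the function name, tensor shapes, and the choice of a causal-style mask are assumptions made for exposition, not the authors' implementation. It shows one plausible way a unidirectional sentence-level attention block could restrict each sentence to aggregate context from preceding sentences only.

# Illustrative sketch only: one plausible "unidirectional" sentence-level attention,
# where each sentence attends to itself and the sentences before it.
# Names, shapes, and the masking choice are assumptions, not the paper's code.
import torch
import torch.nn.functional as F

def unidirectional_attention(sent_reprs: torch.Tensor) -> torch.Tensor:
    """sent_reprs: (num_sentences, hidden_dim) sentence embeddings."""
    n, d = sent_reprs.shape
    scores = sent_reprs @ sent_reprs.T / d ** 0.5            # (n, n) scaled similarity scores
    mask = torch.triu(torch.ones(n, n, dtype=torch.bool), diagonal=1)
    scores = scores.masked_fill(mask, float("-inf"))          # block attention to later sentences
    weights = F.softmax(scores, dim=-1)                       # row-wise attention weights
    return weights @ sent_reprs                               # context-aggregated representations

if __name__ == "__main__":
    reprs = torch.randn(5, 16)                                # 5 sentences, 16-dim embeddings
    out = unidirectional_attention(reprs)
    print(out.shape)                                          # torch.Size([5, 16])

A bidirectional variant would use the mirrored (lower-triangular) mask, and combining the two directions is one way to control the range of context each candidate boundary sees.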

Published

2023-06-26

How to Cite

Bai, H., Wang, P., Zhang, R., & Su, Z. (2023). SegFormer: A Topic Segmentation Model with Controllable Range of Attention. Proceedings of the AAAI Conference on Artificial Intelligence, 37(11), 12545-12552. https://doi.org/10.1609/aaai.v37i11.26477

Section

AAAI Technical Track on Speech & Natural Language Processing