LogFormer: A Pre-train and Tuning Pipeline for Log Anomaly Detection

Authors

  • Hongcheng Guo State Key Lab of Software Development Environment, Beihang University, Beijing, China
  • Jian Yang State Key Lab of Software Development Environment, Beihang University, Beijing, China
  • Jiaheng Liu State Key Lab of Software Development Environment, Beihang University, Beijing, China
  • Jiaqi Bai State Key Lab of Software Development Environment, Beihang University, Beijing, China
  • Boyang Wang State Key Lab of Software Development Environment, Beihang University, Beijing, China
  • Zhoujun Li State Key Lab of Software Development Environment, Beihang University, Beijing, China
  • Tieqiao Zheng Cloudwise Research, Beijing, China
  • Bo Zhang Cloudwise Research, Beijing, China
  • Junran Peng State Key Lab of Software Development Environment, Beihang University, Beijing, China
  • Qi Tian Huawei, Beijing, China

DOI:

https://doi.org/10.1609/aaai.v38i1.27764

Keywords:

APP: Software Engineering, DMKM: Anomaly/Outlier Detection

Abstract

Log anomaly detection is a key component in the field of artificial intelligence for IT operations (AIOps). Since log data arise from many different domains, retraining the whole network for each unseen domain is inefficient in real industrial scenarios. However, previous deep models focus merely on extracting the semantics of log sequences within a single domain, leading to poor generalization on multi-domain logs. To alleviate this issue, we propose a unified Transformer-based framework for Log anomaly detection (LogFormer) to improve generalization across different domains, where we establish a two-stage process comprising a pre-training stage and an adapter-based tuning stage. Specifically, our model is first pre-trained on the source domain to obtain shared semantic knowledge of log data. Then, we transfer such knowledge to the target domain via shared parameters. In addition, a Log-Attention module is proposed to supplement the information discarded by log parsing. The proposed method is evaluated on three public datasets and one real-world dataset. Experimental results on multiple benchmarks demonstrate the effectiveness of LogFormer with fewer trainable parameters and lower training costs.
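The pre-train-then-tune pipeline in the abstract follows the standard adapter recipe: train a shared Transformer encoder on the source domain, then freeze it and fit only small bottleneck adapters (plus a classifier head) on the target domain. Below is a minimal PyTorch sketch of that idea; all module names, layer sizes, and the mean-pooled sequence classifier are illustrative assumptions, not the authors' released code or their Log-Attention module.

```python
# Sketch of adapter-based tuning for a log-sequence anomaly classifier.
# Assumptions: bottleneck adapters after each encoder layer; binary
# (normal/anomalous) sequence-level classification via mean pooling.
import torch
import torch.nn as nn


class Adapter(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, residual."""
    def __init__(self, d_model: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(d_model, bottleneck)
        self.up = nn.Linear(bottleneck, d_model)
        self.act = nn.ReLU()

    def forward(self, x):
        return x + self.up(self.act(self.down(x)))


class LogEncoderWithAdapters(nn.Module):
    def __init__(self, d_model=256, n_heads=4, n_layers=2, n_classes=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.adapters = nn.ModuleList(Adapter(d_model) for _ in range(n_layers))
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):
        # Interleave (frozen) encoder layers with trainable adapters.
        for layer, adapter in zip(self.encoder.layers, self.adapters):
            x = adapter(layer(x))
        return self.head(x.mean(dim=1))  # sequence-level anomaly logits


model = LogEncoderWithAdapters()
# Stage 1 (source-domain pre-training) would update all parameters.
# Stage 2 (target-domain tuning): freeze the shared encoder and train
# only the adapters and the classification head.
for p in model.encoder.parameters():
    p.requires_grad = False

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable params: {trainable}/{total}")
```

Freezing the encoder is what yields the "fewer trainable parameters and lower training costs" claimed in the abstract: only the bottleneck adapters and the head are updated per target domain, while the shared parameters carry the source-domain knowledge across.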

Published

2024-03-25

How to Cite

Guo, H., Yang, J., Liu, J., Bai, J., Wang, B., Li, Z., Zheng, T., Zhang, B., Peng, J., & Tian, Q. (2024). LogFormer: A Pre-train and Tuning Pipeline for Log Anomaly Detection. Proceedings of the AAAI Conference on Artificial Intelligence, 38(1), 135-143. https://doi.org/10.1609/aaai.v38i1.27764

Section

AAAI Technical Track on Application Domains