Diaformer: Automatic Diagnosis via Symptoms Sequence Generation


  • Junying Chen Harbin Institute of Technology, Shenzhen
  • Dongfang Li Harbin Institute of Technology, Shenzhen
  • Qingcai Chen Harbin Institute of Technology, Shenzhen; Peng Cheng Laboratory
  • Wenxiu Zhou Harbin Institute of Technology, Shenzhen
  • Xin Liu Peng Cheng Laboratory




Domain(s) Of Application (APP), Speech & Natural Language Processing (SNLP)


Automatic diagnosis has attracted increasing attention but remains challenging due to the need for multi-step reasoning. Recent works usually address it with reinforcement learning methods; however, these methods show low efficiency and require task-specific reward functions. Since the conversation between doctor and patient allows doctors to probe for symptoms and make diagnoses, the diagnosis process can naturally be seen as the generation of a sequence of symptoms and diagnoses. Inspired by this, we reformulate automatic diagnosis as a symptoms Sequence Generation (SG) task and propose a simple but effective Transformer-based automatic Diagnosis model (Diaformer). We first design a symptom attention framework to learn the generation of symptom inquiries and the disease diagnosis. To alleviate the discrepancy between sequential generation and the unordered nature of implicit symptoms, we further design three orderless training mechanisms. Experiments on three public datasets show that our model outperforms baselines on disease diagnosis by 1%, 6%, and 11.5% with the highest training efficiency. Detailed analysis of symptom inquiry prediction demonstrates the potential of applying symptoms sequence generation to automatic diagnosis.
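The core idea of casting diagnosis as symptom sequence generation, and of orderless training in particular, can be sketched in a few lines. This is a minimal illustration under our own assumptions (a toy vocabulary and a permutation-invariant step loss), not the paper's actual architecture or training objective:

```python
import math

# Hypothetical toy vocabulary for illustration only.
SYMPTOMS = ["fever", "cough", "headache", "fatigue"]
DISEASES = ["flu", "cold"]


def orderless_targets(generated, implicit):
    """Orderless training idea: because a patient's implicit symptoms form
    an unordered set, at every generation step ANY implicit symptom that has
    not yet been generated is an equally valid next token."""
    return {s for s in implicit if s not in generated}


def step_loss(next_token_probs, generated, implicit):
    """Permutation-invariant step loss (an illustrative variant): sum the
    model's probability mass over all currently valid next symptoms, rather
    than penalizing deviation from one fixed gold ordering."""
    valid = orderless_targets(generated, implicit)
    p = sum(next_token_probs.get(s, 0.0) for s in valid)
    return -math.log(max(p, 1e-9))


# Example: "fever" was already generated; "cough" remains a valid target.
probs = {"cough": 0.5, "headache": 0.2, "flu": 0.3}
loss = step_loss(probs, generated=["fever"], implicit={"fever", "cough"})
```

Here a model that spreads probability over any still-missing implicit symptom is not penalized for generating them in a different order than the recorded dialogue, which is the discrepancy the paper's orderless mechanisms target.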




How to Cite

Chen, J., Li, D., Chen, Q., Zhou, W., & Liu, X. (2022). Diaformer: Automatic Diagnosis via Symptoms Sequence Generation. Proceedings of the AAAI Conference on Artificial Intelligence, 36(4), 4432-4440. https://doi.org/10.1609/aaai.v36i4.20365



AAAI Technical Track on Domain(s) Of Application