Weakly Supervised Neuro-Symbolic Module Networks for Numerical Reasoning over Text


  • Amrita Saha, Salesforce Research Asia
  • Shafiq Joty, Salesforce Research Asia; Nanyang Technological University
  • Steven C.H. Hoi, Salesforce Research Asia




Speech & Natural Language Processing (SNLP)


Neural Module Networks (NMNs) have been quite successful in incorporating explicit reasoning as learnable modules in various question answering tasks, including the most generic form of numerical reasoning over text in Machine Reading Comprehension (MRC). However, to achieve this, contemporary NMN models require strong supervision in the form of specialized program annotations, derived from the QA pairs through heuristic parsing and exhaustive computation of all possible discrete operations on discrete arguments. Consequently, they fail to generalize to more open-ended settings without such supervision. Hence, we propose the Weakly Supervised Neuro-Symbolic Module Network (WNSMN), trained with answers as the sole supervision for numerical reasoning based MRC. WNSMN learns to execute a noisy heuristic program, obtained from the dependency parse of the query, as discrete actions over both neural and symbolic reasoning modules, and is trained end-to-end in a reinforcement learning framework with a discrete reward from answer matching. On the subset of DROP with numerical answers, WNSMN outperforms NMN by 32% and the reasoning-free generative language model GenBERT by 8% in exact-match accuracy under comparable weakly supervised settings. This showcases the effectiveness of modular networks that can handle explicit discrete reasoning over noisy programs in an end-to-end manner.
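The weak-supervision idea in the abstract — learning a latent discrete operation with only the final answer as reward — can be illustrated with a toy REINFORCE sketch. This is not the paper's implementation: the operation set, tabular policy, and training loop below are all hypothetical simplifications of the RL setup the abstract describes.

```python
import math
import random

# Hypothetical toy setup: the latent "program" is a single discrete choice
# of an arithmetic operation over two numbers extracted from the passage.
OPS = {
    "add": lambda a, b: a + b,
    "diff": lambda a, b: a - b,
    "max": max,
}
op_names = list(OPS)

# Tabular softmax policy over operations (stand-in for a neural policy).
logits = {op: 0.0 for op in op_names}

def softmax(lg):
    z = max(lg.values())
    exps = {k: math.exp(v - z) for k, v in lg.items()}
    s = sum(exps.values())
    return {k: e / s for k, e in exps.items()}

def reinforce_step(a, b, gold, lr=0.5):
    """Sample an operation, score it by exact answer match, update the policy."""
    probs = softmax(logits)
    op = random.choices(op_names, weights=[probs[o] for o in op_names])[0]
    # Discrete reward from answer matching: 1 iff the executed result equals gold.
    reward = 1.0 if OPS[op](a, b) == gold else 0.0
    # REINFORCE gradient for a softmax policy: (indicator - prob) * reward.
    for o in op_names:
        grad = ((1.0 if o == op else 0.0) - probs[o]) * reward
        logits[o] += lr * grad
    return reward

random.seed(0)
# Weak supervision: the learner only sees (numbers, gold answer) pairs;
# the correct operation ("diff", since 7 - 3 = 4) is never revealed.
for _ in range(300):
    reinforce_step(7, 3, 4)

final_probs = softmax(logits)
best = max(final_probs, key=final_probs.get)
```

After training, the policy concentrates on the subtraction operation even though no operation label was ever provided — the same principle, scaled up to noisy multi-step programs and neural modules, underlies WNSMN's end-to-end training.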




How to Cite

Saha, A., Joty, S., & Hoi, S. C. (2022). Weakly Supervised Neuro-Symbolic Module Networks for Numerical Reasoning over Text. Proceedings of the AAAI Conference on Artificial Intelligence, 36(10), 11238-11247. https://doi.org/10.1609/aaai.v36i10.21374
