MetaSymNet: A Tree-like Symbol Network with Adaptive Architecture and Activation Functions

Authors

  • Yanjie Li (Institute of Semiconductors, Chinese Academy of Sciences; School of Electronic, Electrical, and Communication Engineering, University of Chinese Academy of Sciences; Zhongguancun Academy)
  • Weijun Li (Institute of Semiconductors, Chinese Academy of Sciences; School of Electronic, Electrical, and Communication Engineering, University of Chinese Academy of Sciences; Zhongguancun Academy; School of Integrated Circuits, University of Chinese Academy of Sciences)
  • Lina Yu (Institute of Semiconductors, Chinese Academy of Sciences)
  • Min Wu (Institute of Semiconductors, Chinese Academy of Sciences)
  • Jingyi Liu (Institute of Semiconductors, Chinese Academy of Sciences)
  • Shu Wei (Institute of Semiconductors, Chinese Academy of Sciences; School of Electronic, Electrical, and Communication Engineering, University of Chinese Academy of Sciences)
  • Yusong Deng (Institute of Semiconductors, Chinese Academy of Sciences; School of Electronic, Electrical, and Communication Engineering, University of Chinese Academy of Sciences)
  • Meilan Hao (Institute of Semiconductors, Chinese Academy of Sciences)

DOI:

https://doi.org/10.1609/aaai.v39i25.34915

Abstract

Mathematical formulas are the language of communication between humans and nature. Discovering latent formulas from observed data is an important challenge in artificial intelligence, commonly known as symbolic regression (SR). Current mainstream SR algorithms treat SR as a combinatorial optimization problem and use Genetic Programming (GP) or Reinforcement Learning (RL) to solve it. These methods perform well on simple problems but poorly on even slightly more complex tasks. In addition, this class of algorithms ignores an important property: in SR tasks, symbols have explicit numerical meaning. Can we take full advantage of this property and solve the SR problem with more efficient numerical optimization methods? The Extrapolation and Learning Equation (EQL) approach replaces the activation functions in a neural network with basic symbols and sparsifies the connections to derive a simplified expression from a large network. However, EQL's fixed network structure cannot adapt to the complexity of different tasks, often resulting in redundancy or insufficiency, which limits its effectiveness. Based on this analysis, we propose MetaSymNet, a tree-like network that employs the PANGU meta-function as its activation function. The PANGU meta-function can evolve into various candidate functions during training, and the network structure is adaptively adjusted according to the task at hand. The symbolic network then evolves into a concise, interpretable mathematical expression. To evaluate the performance of MetaSymNet against five baseline algorithms, we conducted experiments across more than ten datasets, including SRBench. The experimental results show that MetaSymNet achieves strong results on various evaluation metrics.
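The abstract does not spell out how the PANGU meta-function is parameterized, but one common way to realize a trainable activation that "evolves into" a discrete symbolic operator is a softmax-weighted mixture over a fixed candidate set, whose weights are tuned by numerical optimization and then discretized. The sketch below illustrates that general idea; the candidate set, the `temperature` parameter, and the discretization step are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

# Hypothetical candidate operators a meta-activation could select among.
CANDIDATES = [np.sin, np.cos, np.exp, lambda x: x, lambda x: x * x]


def meta_activation(x, logits, temperature=1.0):
    """Softmax-weighted blend of candidate functions.

    The `logits` are trainable; lowering `temperature` sharpens the
    softmax so the blend collapses toward a single symbolic operator.
    """
    w = np.exp(logits / temperature)
    w = w / w.sum()
    return sum(wi * f(x) for wi, f in zip(w, CANDIDATES))


def discretize(logits):
    """After training, keep only the dominant candidate, yielding a
    concise, interpretable symbolic operator for the final expression."""
    return CANDIDATES[int(np.argmax(logits))]


x = np.linspace(-1.0, 1.0, 5)
logits = np.array([4.0, 0.0, 0.0, 0.0, 0.0])  # strongly favors sin
blended = meta_activation(x, logits, temperature=0.5)
chosen = discretize(logits)
```

Because the blend stays differentiable until discretization, the operator choice can be trained with the same gradient-based numerical optimizers used for the network's continuous constants.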

Published

2025-04-11

How to Cite

Li, Y., Li, W., Yu, L., Wu, M., Liu, J., Wei, S., … Hao, M. (2025). MetaSymNet: A Tree-like Symbol Network with Adaptive Architecture and Activation Functions. Proceedings of the AAAI Conference on Artificial Intelligence, 39(25), 27081–27089. https://doi.org/10.1609/aaai.v39i25.34915

Section

AAAI Technical Track on Search and Optimization