Automata Cascades: Expressivity and Sample Complexity


  • Alessandro Ronca, Sapienza University of Rome
  • Nadezda Alexandrovna Knorozova, RelationalAI; University of Zurich
  • Giuseppe De Giacomo, University of Oxford; Sapienza University of Rome



Keywords: ML: Learning Theory, KRR: Computational Complexity of Reasoning, KRR: Geometric, Spatial, and Temporal Reasoning, ML: Classification and Regression, ML: Reinforcement Learning Theory, ML: Time-Series/Data Streams


Every automaton can be decomposed into a cascade of basic prime automata. This is the Prime Decomposition Theorem by Krohn and Rhodes. Guided by this theory, we propose automata cascades as a structured, modular way to describe automata as complex systems made of many components, each implementing a specific functionality. Any automaton can serve as a component; using specific components allows for fine-grained control over the expressivity of the resulting class of automata, and using prime automata as components yields specific expressivity guarantees. Moreover, specifying automata as cascades allows the sample complexity of automata to be described in terms of their components. We show that the sample complexity is linear in the number of components and in the maximum complexity of a single component, modulo logarithmic factors. This opens the possibility of learning automata that represent large dynamical systems consisting of many parts interacting with each other. It stands in sharp contrast with the established understanding of the sample complexity of automata, described in terms of the overall number of states and input letters, which implies that one can only learn automata whose number of states is linear in the amount of data available. Our results show instead that one can learn automata with a number of states exponential in the amount of data available.
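To make the cascade idea concrete, here is a minimal Python sketch, not taken from the paper: each component automaton has its own state and a transition function that reads the input letter together with the states of the components before it in the cascade. The class names, the synchronous update convention (each component sees the pre-update states of earlier components), and the two example components are illustrative assumptions, not the paper's formal definitions.

```python
class Component:
    """One layer of a cascade: a state plus a transition function
    delta(state, letter, earlier_states) -> new state."""
    def __init__(self, init_state, delta):
        self.state = init_state
        self.delta = delta

class Cascade:
    """Components update in order; component i reads the states that
    components 1..i-1 held before this step (one possible convention)."""
    def __init__(self, components):
        self.components = components

    def step(self, letter):
        earlier = []
        for c in self.components:
            prev = c.state
            c.state = c.delta(c.state, letter, tuple(earlier))
            earlier.append(prev)

    def run(self, word):
        for letter in word:
            self.step(letter)
        return tuple(c.state for c in self.components)

# Two hypothetical components over the alphabet {'a', 'b'}: the first
# tracks the parity of 'a's seen; the second remembers whether a 'b'
# ever arrived while that parity was odd.
parity = Component(0, lambda s, x, e: (s + (x == 'a')) % 2)
flag = Component(False, lambda s, x, e: s or (x == 'b' and e[0] == 1))

print(Cascade([parity, flag]).run("ab"))  # (1, True)
```

The modularity claim of the abstract shows up here directly: each component is a small automaton with its own functionality, while the joint state space of the cascade grows as the product of the component state spaces.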




How to Cite

Ronca, A., Knorozova, N. A., & De Giacomo, G. (2023). Automata Cascades: Expressivity and Sample Complexity. Proceedings of the AAAI Conference on Artificial Intelligence, 37(8), 9588-9595.



AAAI Technical Track on Machine Learning III