NeuralArTS: Structuring Neural Architecture Search with Type Theory (Student Abstract)

Authors

  • Robert Wu, University of Toronto & ML Collective
  • Nayan Saxena, University of Toronto & ML Collective
  • Rohan Jain, University of Toronto & ML Collective

DOI:

https://doi.org/10.1609/aaai.v36i11.21679

Keywords:

Deep Learning, Optimization, Neural Architecture Search, Automated Machine Learning, Type Theory

Abstract

Neural Architecture Search (NAS) algorithms automate the task of finding optimal deep learning architectures given an initial search space of possible operations. Developing these search spaces is usually a manual affair, and pre-optimized search spaces are more efficient than searching from scratch. In this paper we present a new framework called Neural Architecture Type System (NeuralArTS) that categorizes the infinite set of network operations in a structured type system. We further demonstrate how NeuralArTS can be applied to convolutional layers and propose several future directions.
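To illustrate the idea of typing network operations, here is a minimal hypothetical sketch (not taken from the paper): a convolutional layer is typed by the output size it produces for a given input size, via the standard convolution output formula, and two layers that share a type are interchangeable candidates in the search space. The function and parameter names are illustrative assumptions.

```python
# Hypothetical sketch: type a convolution by the output shape it induces,
# using the standard formula
#   out = floor((in + 2*pad - dilation*(kernel - 1) - 1) / stride) + 1
# Two layers are "type-equivalent" for input size n if they produce the
# same output size, so a NAS algorithm could swap one for the other.

def conv_out(n, kernel, stride=1, pad=0, dilation=1):
    """Output length of a 1-D convolution over an input of length n."""
    return (n + 2 * pad - dilation * (kernel - 1) - 1) // stride + 1

def same_type(n, a, b):
    """Check whether two conv configs (kernel, stride, pad, dilation)
    map inputs of length n to outputs of the same length."""
    return conv_out(n, *a) == conv_out(n, *b)

# A 3x3 conv with padding 1 and a 5x5 conv with padding 2 both preserve
# the input size, so they share a type:
print(same_type(32, (3, 1, 1, 1), (5, 1, 2, 1)))  # True
```

Under this view, the (infinite) set of operations is partitioned into equivalence classes by input/output behavior, which is the kind of structure a type system makes explicit.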

Published

2022-06-28

How to Cite

Wu, R., Saxena, N., & Jain, R. (2022). NeuralArTS: Structuring Neural Architecture Search with Type Theory (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 36(11), 13085-13086. https://doi.org/10.1609/aaai.v36i11.21679