OpEvo: An Evolutionary Method for Tensor Operator Optimization

Authors

  • Xiaotian Gao, Microsoft Research Asia
  • Wei Cui, Microsoft Research Asia
  • Lintao Zhang, Microsoft Research Asia
  • Mao Yang, Microsoft Research Asia

Keywords

Evolutionary Computation, Hyperparameter Tuning / Algorithm Configuration

Abstract

The training and inference efficiency of deep neural networks relies heavily on the performance of tensor operators on hardware platforms. Manually optimizing tensor operators has limitations in supporting new operators or hardware platforms, so automatically optimizing the device code configurations of tensor operators is becoming increasingly attractive. However, current methods for tensor operator optimization usually suffer from poor sample efficiency due to the combinatorial search space. In this work, we propose a novel evolutionary method, OpEvo, which efficiently explores the search spaces of tensor operators by introducing a topology-aware mutation operation based on a q-random walk to leverage the topological structures over the search spaces. Our comprehensive experimental results show that, compared with state-of-the-art (SOTA) methods, OpEvo can find the best configuration with the lowest variance and the least effort in terms of number of trials and wall-clock time. All code for this work is available online.
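To make the topology-aware mutation idea concrete, here is a minimal sketch of a q-random-walk-style mutation over one discrete tuning parameter (e.g. an ordered list of tile-size candidates). It assumes the parameter's values form an ordered chain topology and that the walk length follows a geometric-like distribution controlled by q, so small q favors local moves while larger q allows longer jumps; this is a hypothetical illustration, not the authors' implementation.

```python
import random

def q_random_walk_mutate(index, num_values, q=0.5, rng=random):
    """Mutate a discrete parameter by a random walk over its ordered
    value topology.

    The walk length k is drawn so that P(k) is proportional to q**k:
    small q keeps mutations local, larger q explores farther.
    Hypothetical sketch, not the OpEvo paper's actual operator.
    """
    # Draw the walk length: keep stepping while a biased coin lands heads.
    k = 1
    while rng.random() < q:
        k += 1
    # Pick a direction along the ordered values and take k steps.
    new_index = index + rng.choice((-1, 1)) * k
    # Reflect a walk that falls below zero; clamp at the upper end.
    return max(0, min(num_values - 1, abs(new_index)))
```

A population-based tuner could apply this operator independently to each configuration dimension of a parent candidate, then measure the mutated configuration on hardware and keep the faster one.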

Published

2021-05-18

How to Cite

Gao, X., Cui, W., Zhang, L., & Yang, M. (2021). OpEvo: An Evolutionary Method for Tensor Operator Optimization. Proceedings of the AAAI Conference on Artificial Intelligence, 35(14), 12320-12327. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/17462

Section

AAAI Technical Track on Search and Optimization