Learning Adaptive Hidden Layers for Mobile Gesture Recognition

Authors

  • Ting-Kuei Hu, Academia Sinica
  • Yen-Yu Lin, Academia Sinica
  • Pi-Cheng Hsiu, Academia Sinica

DOI:

https://doi.org/10.1609/aaai.v32i1.12279

Keywords:

deep learning, gesture recognition, adaptive hidden layer

Abstract

This paper addresses two obstacles hindering advances in accurate gesture recognition on mobile devices. First, gesture recognition performance is highly dependent on feature selection, but optimal features typically vary from gesture to gesture. Second, diverse user behaviors and mobile environments result in extremely large intra-class variations. We tackle these issues by introducing a new network layer, called an adaptive hidden layer (AHL), to generalize a hidden layer in deep neural networks and dynamically generate an activation map conditioned on the input. To this end, an AHL is composed of multiple neuron groups and an extra selector. The former compiles multi-modal features captured by mobile sensors, while the latter adaptively picks a plausible group for each input sample. The AHL is end-to-end trainable and can generalize an arbitrary subset of hidden layers. Through a series of AHLs, the great expressive power from exponentially many forward paths allows us to choose proper multi-modal features in a sample-specific fashion and resolve the problems caused by the unfavorable variations in mobile gesture recognition. The proposed approach is evaluated on a benchmark for gesture recognition and a newly collected dataset. Superior performance demonstrates its effectiveness.
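To make the mechanism described in the abstract concrete, the following is a minimal sketch of how an adaptive hidden layer could be realized: several parallel neuron groups plus a selector that weights them per input sample. This is an illustrative assumption, not the authors' implementation; the module name, the use of soft (softmax-weighted) selection for differentiability, and all hyperparameters are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveHiddenLayer(nn.Module):
    """Sketch of an AHL: multiple neuron groups and a selector that
    adaptively combines them for each input sample (assumed design)."""
    def __init__(self, in_dim, out_dim, num_groups=4):
        super().__init__()
        # Each "neuron group" is one candidate hidden transformation.
        self.groups = nn.ModuleList(
            [nn.Linear(in_dim, out_dim) for _ in range(num_groups)]
        )
        # The selector scores the groups from the same input.
        self.selector = nn.Linear(in_dim, num_groups)

    def forward(self, x):
        # Soft selection keeps the layer end-to-end trainable;
        # a hard arg-max pick could be used at inference time.
        weights = F.softmax(self.selector(x), dim=-1)                       # (batch, num_groups)
        outputs = torch.stack([F.relu(g(x)) for g in self.groups], dim=1)  # (batch, num_groups, out_dim)
        return (weights.unsqueeze(-1) * outputs).sum(dim=1)                # (batch, out_dim)

if __name__ == "__main__":
    # Stacking several AHLs yields exponentially many forward paths,
    # chosen in a sample-specific fashion.
    net = nn.Sequential(AdaptiveHiddenLayer(64, 128), AdaptiveHiddenLayer(128, 10))
    print(net(torch.randn(8, 64)).shape)  # torch.Size([8, 10])
```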

Published

2018-04-27

How to Cite

Hu, T.-K., Lin, Y.-Y., & Hsiu, P.-C. (2018). Learning Adaptive Hidden Layers for Mobile Gesture Recognition. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.12279