AJILE Movement Prediction: Multimodal Deep Learning for Natural Human Neural Recordings and Video

Authors

  • Nancy Wang, University of Washington
  • Ali Farhadi, University of Washington
  • Rajesh Rao, University of Washington
  • Bingni Brunton, University of Washington

DOI:

https://doi.org/10.1609/aaai.v32i1.11889

Keywords:

deep learning, neuroscience, multimodal, naturalistic

Abstract

Developing useful interfaces between brains and machines is a grand challenge of neuroengineering. An effective interface must not only interpret neural signals but also predict a human's intention to perform an action in the near future; prediction is even more challenging outside well-controlled laboratory experiments. This paper describes our approach to detecting and predicting natural human arm movements before they occur, a key challenge in brain-computer interfacing that has never before been attempted. We introduce the novel Annotated Joints in Long-term ECoG (AJILE) dataset; AJILE includes automatically annotated poses of 7 upper-body joints for four human subjects over 670 total hours (more than 72 million frames), along with the corresponding simultaneously acquired intracranial neural recordings. The size and scope of AJILE greatly exceed those of all previous datasets combining movements and electrocorticography (ECoG), making it possible to take a deep learning approach to movement prediction. We propose a multimodal model that combines deep convolutional neural networks (CNNs) with long short-term memory (LSTM) blocks, leveraging both the ECoG and video modalities. We demonstrate that our models detect movements and predict future movements up to 800 ms before movement initiation. Further, our multimodal movement prediction models are resilient to simulated ablation of input neural signals. We believe a multimodal approach to natural neural decoding that takes context into account is critical to advancing bioelectronic technologies and human neuroscience.
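To make the abstract's architecture concrete, the sketch below shows one way a multimodal CNN+LSTM movement predictor can be wired up in PyTorch: per-window CNN features from each modality (a 1-D convolution over ECoG channels, a 2-D convolution over video frames) are concatenated and fed to an LSTM, whose final state drives a move-vs-rest classifier. This is an illustrative sketch only; all layer sizes, window lengths, and the two-class head are assumptions, not the authors' published architecture.

```python
import torch
import torch.nn as nn

class MultimodalMovementPredictor(nn.Module):
    """Sketch of a CNN+LSTM fusing ECoG and video (assumed sizes)."""

    def __init__(self, n_electrodes=64, ecog_feat=32, video_feat=32, hidden=64):
        super().__init__()
        # ECoG branch: 1-D convolution over time within each window
        self.ecog_cnn = nn.Sequential(
            nn.Conv1d(n_electrodes, ecog_feat, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # Video branch: 2-D convolution over each RGB frame
        self.video_cnn = nn.Sequential(
            nn.Conv2d(3, video_feat, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # LSTM over the sequence of fused per-window features
        self.lstm = nn.LSTM(ecog_feat + video_feat, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)  # move vs. rest logits

    def forward(self, ecog, video):
        # ecog: (batch, seq, electrodes, samples); video: (batch, seq, 3, H, W)
        b, s = ecog.shape[:2]
        e = self.ecog_cnn(ecog.flatten(0, 1)).flatten(1)    # (b*s, ecog_feat)
        v = self.video_cnn(video.flatten(0, 1)).flatten(1)  # (b*s, video_feat)
        fused = torch.cat([e, v], dim=1).view(b, s, -1)     # (b, s, feat)
        out, _ = self.lstm(fused)
        return self.head(out[:, -1])  # classify from the last time step

model = MultimodalMovementPredictor()
logits = model(torch.randn(2, 4, 64, 100), torch.randn(2, 4, 3, 32, 32))
print(logits.shape)  # torch.Size([2, 2])
```

Late fusion of per-modality features, as here, also makes the simulated-ablation experiment straightforward: zeroing one branch's input leaves the other branch's contribution intact.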

Published

2018-04-26

How to Cite

Wang, N., Farhadi, A., Rao, R., & Brunton, B. (2018). AJILE Movement Prediction: Multimodal Deep Learning for Natural Human Neural Recordings and Video. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.11889

Section

Main Track: Machine Learning Applications