Gaze-Based Interaction Adaptation for People with Involuntary Head Movements (Student Abstract)

Authors

  • Cindy Tong (Nanyang Technological University, Singapore; The Chinese University of Hong Kong, Hong Kong)
  • Rosanna Chan (The Chinese University of Hong Kong, Hong Kong; Centre for Perceptual and Interactive Intelligence, Hong Kong)

DOI:

https://doi.org/10.1609/aaai.v38i21.30519

Keywords:

Computer Vision, Human-Computer Interaction, Applications Of AI

Abstract

Gaze estimation is an important research area in computer vision and machine learning. Eye tracking and gaze-based interaction have made assistive technology (AT) more accessible to people with physical limitations. However, a non-negligible proportion of existing AT users, including those with dyskinetic cerebral palsy (CP) or severe intellectual disabilities (ID), have difficulty using eye trackers because of their involuntary body movements. In this paper, we propose an adaptation method based on head movement prediction and fixation smoothing to stabilize our target users' gaze points on the screen and improve their user experience (UX) in gaze-based interaction. Our empirical evaluation shows that our method significantly shortens users' selection time and increases their selection accuracy.
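The abstract does not detail how fixation smoothing is performed. As a rough, hedged illustration of what such a step can look like, the sketch below applies a simple exponential moving average to raw on-screen gaze coordinates; the `GazeSmoother` class and its `alpha` parameter are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: a simple exponential-moving-average (EMA) smoother
# for on-screen gaze points. The class and parameter names are hypothetical
# and are not taken from the paper.

class GazeSmoother:
    def __init__(self, alpha=0.2):
        # alpha in (0, 1]: smaller values smooth more aggressively,
        # at the cost of added lag in following intentional gaze shifts.
        self.alpha = alpha
        self.x = None
        self.y = None

    def update(self, raw_x, raw_y):
        """Blend a new raw gaze sample with the previous smoothed estimate."""
        if self.x is None:
            self.x, self.y = raw_x, raw_y
        else:
            self.x = self.alpha * raw_x + (1 - self.alpha) * self.x
            self.y = self.alpha * raw_y + (1 - self.alpha) * self.y
        return self.x, self.y


# Example: smooth a short stream of noisy gaze samples (pixel coordinates).
smoother = GazeSmoother(alpha=0.3)
for sample in [(512, 300), (540, 318), (498, 295), (525, 310)]:
    print(smoother.update(*sample))
```

A lower `alpha` suppresses more of the jitter introduced by involuntary head movements but delays the cursor's response, so in practice such a filter would need to be tuned (or made adaptive) per user.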

Published

2024-03-24

How to Cite

Tong, C., & Chan, R. (2024). Gaze-Based Interaction Adaptation for People with Involuntary Head Movements (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 38(21), 23669-23670. https://doi.org/10.1609/aaai.v38i21.30519