Visual Gait Alignment for Sensorless Prostheses: Toward an Interpretable Digital Twin Framework

Authors

  • Jingyang Cui, The University of Alabama
  • Fei Hu, The University of Alabama
  • Greg Berkeley, The University of Alabama
  • Weiqiang Lyu, The University of Alabama
  • Xiangrong Shen, The University of Alabama

DOI

https://doi.org/10.1609/aaaiss.v7i1.36922

Abstract

A safe and interpretable visual method for prosthetic alignment assessment is proposed, suitable for sensorless scenarios such as home rehabilitation and telemedicine. The method collects human skeletal data with a depth camera and extracts motion-difference characteristics of the left and right legs through gait symmetry analysis. Three clearly structured evaluation indicators are designed, namely the difference in joint range of motion, the difference in swing-phase duration, and angular-trajectory similarity, and are combined into an interpretable alignment scoring function. The system is designed as a front-end module of a digital twin system: the scoring results intuitively reflect differences in wearing status, facilitating real-time evaluation and adjustment of prosthetic alignment quality. Preliminary experiments have verified the stability and practicality of the method under visual recognition conditions, laying the foundation for personalized prosthetic optimization based on digital twins.
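The abstract names three symmetry indicators that feed an interpretable scoring function but does not give its form. The following is a minimal sketch of how such a score could be composed; the normalization choices, the equal default weights, and the use of Pearson correlation for trajectory similarity are all assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def alignment_score(left_angles, right_angles,
                    left_swing_s, right_swing_s,
                    weights=(1/3, 1/3, 1/3)):
    """Hypothetical gait-symmetry alignment score in [0, 1].

    Combines three left/right difference indicators (assumed forms):
    joint range-of-motion difference, swing-phase duration difference,
    and angular-trajectory dissimilarity. 1.0 = perfectly symmetric.
    """
    left_angles = np.asarray(left_angles, dtype=float)
    right_angles = np.asarray(right_angles, dtype=float)

    # 1) Joint range-of-motion difference, normalized by the larger ROM.
    rom_l = left_angles.max() - left_angles.min()
    rom_r = right_angles.max() - right_angles.min()
    d_rom = abs(rom_l - rom_r) / max(rom_l, rom_r, 1e-9)

    # 2) Swing-phase duration difference, normalized the same way.
    d_swing = abs(left_swing_s - right_swing_s) / max(left_swing_s,
                                                      right_swing_s, 1e-9)

    # 3) Trajectory similarity via Pearson correlation of the two
    #    joint-angle curves, mapped from [-1, 1] to a [0, 1] dissimilarity.
    r = np.corrcoef(left_angles, right_angles)[0, 1]
    d_traj = (1.0 - r) / 2.0

    w_rom, w_swing, w_traj = weights
    # Each term is interpretable on its own, so a low score can be
    # traced back to the indicator that caused it.
    return 1.0 - (w_rom * d_rom + w_swing * d_swing + w_traj * d_traj)
```

Because each indicator is a bounded, separately inspectable term, the weighted sum stays interpretable: a clinician can see whether a low score comes from restricted range of motion, a timing asymmetry, or a shape mismatch between the two angle trajectories.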

Published

2025-11-23

How to Cite

Cui, J., Hu, F., Berkeley, G., Lyu, W., & Shen, X. (2025). Visual Gait Alignment for Sensorless Prostheses: Toward an Interpretable Digital Twin Framework. Proceedings of the AAAI Symposium Series, 7(1), 488-495. https://doi.org/10.1609/aaaiss.v7i1.36922

Section

Safe, Ethical, Certified, Uncertainty-aware, Robust, and Explainable AI for Health (SECURE-AI4H)