Understanding the Daily Lives of Older Adults: Integrating Multi-modal Personal Health Tracking Data through Visualization and Large Language Models
DOI:
https://doi.org/10.1609/aaaiss.v4i1.31790
Abstract
Understanding the daily lives and routines of older adults is crucial to facilitating aging in place. Ubiquitous computing technologies such as smartphones and wearables, which are easy to deploy and scale, have become a popular means of collecting comprehensive, longitudinal data across demographics. Despite their popularity, several challenges persist when targeting the older adult population, such as low compliance and difficulty obtaining feedback. In this work-in-progress paper, we present the design and development of a multi-modal sensing system that includes a phone, a watch, and a voice assistant. We are conducting an initial longitudinal study with one older adult participant over 30 days to explore how various types of data can be integrated through visualization techniques and large language models (LLMs). As a work in progress, we discuss our preliminary insights from the collected data and conclude with a discussion of our future plans and directions for this research.
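The full paper details the system itself; as a purely illustrative sketch (not the authors' implementation), the snippet below shows one way records from the three modalities (phone, watch, voice assistant) might be aligned by day and rendered into a natural-language prompt for an LLM to summarize. All field names and sample values are hypothetical.

```python
from collections import defaultdict
from datetime import date

# Hypothetical per-modality records as (day, metric, value) tuples.
# Metric names and values are illustrative, not from the paper.
watch_data = [(date(2024, 5, 1), "steps", 4200), (date(2024, 5, 1), "sleep_hours", 6.5)]
phone_data = [(date(2024, 5, 1), "screen_time_min", 95)]
voice_data = [(date(2024, 5, 1), "assistant_queries", 3)]

def merge_by_day(*sources):
    """Align records from all modalities on their shared date key."""
    days = defaultdict(dict)
    for source in sources:
        for day, metric, value in source:
            days[day][metric] = value
    return days

def to_prompt(day, metrics):
    """Render one day's merged metrics as a prompt an LLM could summarize."""
    lines = [f"- {name}: {value}" for name, value in sorted(metrics.items())]
    return (
        f"Daily tracking data for an older adult on {day.isoformat()}:\n"
        + "\n".join(lines)
        + "\nSummarize notable routines or deviations in plain language."
    )

for day, metrics in merge_by_day(watch_data, phone_data, voice_data).items():
    print(to_prompt(day, metrics))  # this prompt text would be sent to an LLM
```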
Published
2024-11-08
How to Cite
Li, J., Steinberg, J., Li, X., Yao, B., Wang, D., Mynatt, E., & Mishra, V. (2024). Understanding the Daily Lives of Older Adults: Integrating Multi-modal Personal Health Tracking Data through Visualization and Large Language Models. Proceedings of the AAAI Symposium Series, 4(1), 173–177. https://doi.org/10.1609/aaaiss.v4i1.31790
Issue
Vol. 4 No. 1 (2024)
Section
Artificial Intelligence for Aging in Place - Short Papers