Heterogeneous Graph Learning for Multi-Modal Medical Data Analysis
DOI: https://doi.org/10.1609/aaai.v37i4.25643
Keywords: APP: Healthcare, Medicine & Wellness; CV: Medical and Biological Imaging
Abstract
Routine clinical visits of a patient produce not only image data but also non-image data containing clinical information about the patient; that is, medical data is inherently multi-modal. Such heterogeneous modalities offer different and complementary views of the same patient, leading to more accurate clinical decisions when they are properly combined. However, despite its significance, how to effectively fuse multi-modal medical data into a unified framework has received relatively little attention. In this paper, we propose an effective graph-based framework called HetMed (Heterogeneous Graph Learning for Multi-modal Medical Data Analysis) for fusing multi-modal medical data. Specifically, we construct a multiplex network that incorporates multiple types of non-image features of patients to capture the complex relationships between patients in a systematic way, which leads to more accurate clinical decisions. Extensive experiments on various real-world datasets demonstrate the superiority and practicality of HetMed. The source code for HetMed is available at https://github.com/Sein-Kim/Multimodal-Medical.
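The abstract's core idea, a multiplex network over patients in which each layer encodes similarity under one type of non-image feature, can be sketched as follows. This is a minimal illustration, not HetMed's exact construction: the feature types (demographics, lab results), the toy values, and the k-nearest-neighbor similarity rule are all assumptions made for the example.

```python
import numpy as np

def knn_layer(features, k=2):
    """Build one adjacency layer: connect each patient to its k most
    similar patients (Euclidean distance) under this feature type."""
    n = len(features)
    adj = np.zeros((n, n), dtype=int)
    dist = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    for i in range(n):
        # argsort puts the patient itself (distance 0) first; skip it.
        neighbors = np.argsort(dist[i])[1:k + 1]
        adj[i, neighbors] = 1
        adj[neighbors, i] = 1  # keep the layer symmetric (undirected)
    return adj

# Toy non-image features for 5 patients (illustrative values only):
# demographics = [age, sex], lab_results = [glucose-like, pressure-like].
demographics = np.array([[63, 0], [65, 0], [41, 1], [39, 1], [70, 0]], dtype=float)
lab_results  = np.array([[5.2, 110], [7.8, 160], [5.0, 105], [7.5, 150], [5.1, 108]], dtype=float)

# One layer per feature type -> a multiplex patient network.
multiplex = {
    "demographics": knn_layer(demographics, k=2),
    "lab_results": knn_layer(lab_results, k=2),
}

for name, adj in multiplex.items():
    print(name, "edge count:", int(adj.sum() // 2))
```

Each layer relates the same set of patients through a different lens, so two patients may be neighbors demographically but not clinically; a graph neural network over such a multiplex structure can then combine these complementary views with the image modality.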
Published
2023-06-26
How to Cite
Kim, S., Lee, N., Lee, J., Hyun, D., & Park, C. (2023). Heterogeneous Graph Learning for Multi-Modal Medical Data Analysis. Proceedings of the AAAI Conference on Artificial Intelligence, 37(4), 5141-5150. https://doi.org/10.1609/aaai.v37i4.25643
Section
AAAI Technical Track on Domain(s) of Application