Fair Conformal Predictors for Applications in Medical Imaging
Keywords: AI For Social Impact (AISI Track Papers Only), Computer Vision (CV), Humans And AI (HAI)
Abstract
Deep learning has the potential to automate many clinically useful tasks in medical imaging. However, translation of deep learning into clinical practice has been hindered by issues such as the lack of transparency and interpretability of these ``black box'' algorithms compared to traditional statistical methods. In particular, many clinical deep learning models lack rigorous and robust techniques for conveying certainty (or the lack thereof) in their predictions, ultimately limiting their appeal for extensive use in medical decision-making. Furthermore, numerous demonstrations of algorithmic bias have increased hesitancy toward deploying deep learning in clinical applications. To this end, we explore how conformal prediction can complement existing deep learning approaches by providing an intuitive way of expressing uncertainty while facilitating greater transparency for clinical users. In this paper, we conduct field interviews with radiologists to assess possible use cases for conformal predictors. Using insights gathered from these interviews, we devise two clinical use cases and empirically evaluate several conformal prediction methods on a dermatology photography dataset for skin lesion classification. We show how conformal prediction can be made more adaptive to subgroup differences in patient skin tones through equalized coverage. Finally, we compare conformal prediction against measures of epistemic uncertainty.
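The abstract's core idea of group-conditional (equalized-coverage) conformal prediction can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes split conformal prediction with the common nonconformity score 1 minus the softmax probability of the true class, synthetic calibration data, and a hypothetical binary subgroup attribute standing in for skin-tone groups. Calibrating one threshold per subgroup, rather than one marginal threshold, is what yields the equalized-coverage guarantee.

```python
import numpy as np

rng = np.random.default_rng(0)

def conformal_quantile(scores, alpha):
    # Finite-sample-corrected (1 - alpha) quantile of calibration scores
    n = len(scores)
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(scores, level, method="higher")

# Synthetic calibration set: softmax outputs over k classes, true labels,
# and a hypothetical binary subgroup attribute (e.g., a skin-tone group).
n, k = 500, 3
probs = rng.dirichlet(np.ones(k), size=n)
labels = rng.integers(0, k, size=n)
groups = rng.integers(0, 2, size=n)

# Nonconformity score: 1 - model probability assigned to the true class
scores = 1.0 - probs[np.arange(n), labels]

alpha = 0.1  # target 90% coverage

# Marginal threshold: coverage holds on average over the whole population
q_marginal = conformal_quantile(scores, alpha)

# Equalized coverage: calibrate one threshold per subgroup, so the
# coverage guarantee holds within each group separately
q_group = {g: conformal_quantile(scores[groups == g], alpha) for g in (0, 1)}

def prediction_set(p, g):
    # Include every class whose nonconformity score is within the
    # subgroup's calibrated threshold
    return np.where(1.0 - p <= q_group[g])[0]

test_p = rng.dirichlet(np.ones(k))
print("group thresholds:", q_group)
print("prediction set:", prediction_set(test_p, g=0))
```

Larger thresholds produce larger (less certain) prediction sets; a clinician can read set size directly as a measure of the model's confidence for that patient.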
How to Cite
Lu, C., Lemay, A., Chang, K., Höbel, K., & Kalpathy-Cramer, J. (2022). Fair Conformal Predictors for Applications in Medical Imaging. Proceedings of the AAAI Conference on Artificial Intelligence, 36(11), 12008-12016. https://doi.org/10.1609/aaai.v36i11.21459
AAAI Special Track on AI for Social Impact