[1]
Wang, S. et al. 2026. Preference Is More than Comparisons: Rethinking Dueling Bandits with Augmented Human Feedback. Proceedings of the AAAI Conference on Artificial Intelligence. 40, 31 (Mar. 2026), 26453–26461. DOI: https://doi.org/10.1609/aaai.v40i31.39852.