[1] A. Vinogradov and B. Harrison, “Using Multi-Armed Bandits to Dynamically Update Player Models in an Experience Managed Environment,” AIIDE, vol. 18, no. 1, pp. 207–214, Oct. 2022.