Parameterized Approximation Algorithms for K-center Clustering and Variants


  • Sayan Bandyapadhyay University of Bergen
  • Zachary Friggstad University of Alberta
  • Ramin Mousavi University of Alberta



Topics: Data Mining & Knowledge Management (DMKM), Search And Optimization (SO), Machine Learning (ML)


k-center is one of the most popular clustering models. While it admits a simple 2-approximation in polynomial time in general metrics, the Euclidean version is NP-hard to approximate within a factor of 1.93, even in the plane, if one insists that the dependence of the running time on k be polynomial. Without this restriction, a classic algorithm yields a 2^{O((k log k)/epsilon)} dn-time (1+epsilon)-approximation for Euclidean k-center, where d is the dimension. In this work, we give a faster algorithm for small dimensions: roughly speaking, an O^*(2^{O((1/epsilon)^{O(d)} k^{1-1/d} log k)})-time (1+epsilon)-approximation. In particular, the running time is roughly O^*(2^{O((1/epsilon)^{O(1)} sqrt{k} log k)}) in the plane. We complement our algorithmic result with a matching hardness lower bound. We also consider a well-studied generalization of k-center, called Non-uniform k-center (NUkC), which allows clusters with different radii. NUkC is NP-hard to approximate within any factor, even in the Euclidean case. We design a 2^{O(k log k)} n^2-time 3-approximation for NUkC, and a 2^{O((k log k)/epsilon)} dn-time (1+epsilon)-approximation for Euclidean NUkC. The latter time bound matches the bound for k-center.
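The "simple 2-approximation in polynomial time" referenced in the abstract is commonly realized by Gonzalez's farthest-point greedy heuristic. As a minimal illustrative sketch (not code from the paper), assuming Euclidean input for concreteness, though the 2-approximation guarantee holds in any metric:

```python
import math

def gonzalez_k_center(points, k):
    """Farthest-point greedy: a 2-approximation for metric k-center.
    `points` is a list of coordinate tuples; returns the chosen centers
    and the resulting clustering radius (max distance to nearest center)."""
    centers = [points[0]]  # the first center can be chosen arbitrarily
    # dist[j] = distance from points[j] to its nearest chosen center
    dist = [math.dist(p, centers[0]) for p in points]
    for _ in range(1, k):
        # pick the point farthest from all current centers as the next center
        i = max(range(len(points)), key=lambda j: dist[j])
        centers.append(points[i])
        dist = [min(dist[j], math.dist(points[j], points[i]))
                for j in range(len(points))]
    return centers, max(dist)
```

Each of the k iterations scans all n points, so the running time is O(nk); this is the polynomial-time baseline against which the abstract's (1+epsilon)-approximations trade running time for accuracy.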




How to Cite

Bandyapadhyay, S., Friggstad, Z., & Mousavi, R. (2022). Parameterized Approximation Algorithms for K-center Clustering and Variants. Proceedings of the AAAI Conference on Artificial Intelligence, 36(4), 3895-3903.



AAAI Technical Track on Data Mining and Knowledge Management