Projection-free Online Learning over Strongly Convex Sets

Authors

  • Yuanyu Wan, Nanjing University
  • Lijun Zhang, Nanjing University; Pazhou Lab

Keywords

Online Learning & Bandits, Optimization, Time-Series/Data Streams

Abstract

To efficiently solve online problems with complicated constraints, projection-free algorithms, including online Frank-Wolfe (OFW) and its variants, have received significant interest recently. However, in the general case, existing efficient projection-free algorithms only achieve a regret bound of O(T^{3/4}), which is worse than the regret of projection-based algorithms, where T is the number of decision rounds. In this paper, we study the special case of online learning over strongly convex sets, for which we first prove that OFW can enjoy a better regret bound of O(T^{2/3}) for general convex losses. The key idea is to refine the decaying step-size in the original OFW with a simple line search rule. Furthermore, for strongly convex losses, we propose a strongly convex variant of OFW by redefining the surrogate loss function used in OFW. We show that this variant achieves a regret bound of O(T^{2/3}) over general convex sets and a better bound of O(T^{1/2}) over strongly convex sets.
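The key idea above — replacing the fixed decaying step-size of OFW with a line search on the surrogate — can be illustrated with a minimal sketch. This is not the paper's exact algorithm: the quadratic surrogate F_t(x) = η⟨Σ_{s≤t} ∇f_s(x_s), x⟩ + ‖x − x₁‖² (the standard OFW surrogate), the unit Euclidean ball as the strongly convex feasible set, and the parameter η are assumptions for illustration. Because F_t is quadratic, the line search over σ ∈ [0, 1] has a closed form.

```python
import numpy as np

def linear_oracle_ball(g, radius=1.0):
    """argmin_{||v|| <= radius} <g, v> over the Euclidean ball
    (a strongly convex set); returns 0 for a zero gradient."""
    n = np.linalg.norm(g)
    return -radius * g / n if n > 0 else np.zeros_like(g)

def ofw_line_search(grad_fn, T, dim, eta, radius=1.0):
    """Illustrative OFW sketch: one linear-optimization step per round,
    with exact line search on the quadratic surrogate
    F_t(x) = eta * <sum of past gradients, x> + ||x - x1||^2.
    `grad_fn(x, t)` returns the gradient of the round-t loss at x."""
    x1 = np.zeros(dim)
    x = x1.copy()
    grad_sum = np.zeros(dim)
    iterates = []
    for t in range(T):
        iterates.append(x.copy())
        grad_sum += grad_fn(x, t)                  # accumulate grad f_t(x_t)
        g_surr = eta * grad_sum + 2.0 * (x - x1)   # grad F_t(x_t)
        v = linear_oracle_ball(g_surr, radius)
        d_t = v - x
        # Exact line search: F_t(x + sigma * d_t) is quadratic in sigma,
        # so sigma* = clip(-<grad F_t(x_t), d_t> / (2 ||d_t||^2), 0, 1).
        denom = 2.0 * d_t.dot(d_t)
        sigma = 0.0 if denom == 0 else float(np.clip(-g_surr.dot(d_t) / denom, 0.0, 1.0))
        x = x + sigma * d_t                        # convex combination stays feasible
    return iterates
```

For example, running the sketch on the fixed loss f_t(x) = ‖x − z‖² for a target z in the unit ball drives the iterates toward z while every iterate remains feasible, since each update is a convex combination of points in the ball.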

Published

2021-05-18

How to Cite

Wan, Y., & Zhang, L. (2021). Projection-free Online Learning over Strongly Convex Sets. Proceedings of the AAAI Conference on Artificial Intelligence, 35(11), 10076-10084. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/17209

Section

AAAI Technical Track on Machine Learning IV