Published October 2021 | Submitted version
Journal Article | Open Access

Online Optimization With Predictions and Switching Costs: Fast Algorithms and the Fundamental Limit

Abstract

This article considers online optimization with a finite prediction window of cost functions and additional switching costs on the decisions. We study the fundamental limits of the dynamic regret of any online algorithm, both with and without predictions. In addition, we propose two gradient-based online algorithms: receding horizon gradient descent (RHGD) and receding horizon accelerated gradient (RHAG), and provide their regret upper bounds. RHAG's regret upper bound is close to the lower bound, indicating both that our lower bound is tight and that RHAG is near-optimal. Finally, we conduct numerical experiments to complement the theoretical results.
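The receding-horizon idea in the abstract — commit the current decision after optimizing over a short window of predicted future costs — can be illustrated with a toy sketch. This is not the paper's RHGD algorithm, just a generic lookahead gradient-descent loop under assumed scalar quadratic hitting costs f_t(x) = ½(x − θ_t)² and switching cost (β/2)(x_t − x_{t−1})²; all function names and parameter values here are hypothetical.

```python
import numpy as np

def online_lookahead_gd(thetas, W=3, beta=1.0, eta=0.3, inner_iters=20):
    """Commit x_t after running gradient descent on the lookahead objective
        sum_s f_s(z_s) + (beta/2)(z_s - z_{s-1})^2,  s = t, ..., t+W,
    with f_s(z) = 0.5 * (z - theta_s)^2 (illustrative quadratic costs).
    thetas[t:t+W+1] plays the role of the W-step prediction window."""
    T = len(thetas)
    xs, prev = [], 0.0
    for t in range(T):
        window = thetas[t:t + W + 1]           # current cost + W predictions
        z = np.full(len(window), prev, float)  # warm start at last decision
        for _ in range(inner_iters):
            grad = np.zeros_like(z)
            for i, th in enumerate(window):
                grad[i] += z[i] - th                      # hitting-cost term
                left = prev if i == 0 else z[i - 1]
                grad[i] += beta * (z[i] - left)           # switching to left
                if i + 1 < len(z):
                    grad[i] -= beta * (z[i + 1] - z[i])   # switching to right
            z -= eta * grad
        prev = z[0]      # commit only the first decision of the window
        xs.append(prev)
    return np.array(xs)

def total_cost(xs, thetas, beta=1.0):
    """Realized hitting plus switching cost of a committed trajectory."""
    cost, prev = 0.0, 0.0
    for x, th in zip(xs, thetas):
        cost += 0.5 * (x - th) ** 2 + 0.5 * beta * (x - prev) ** 2
        prev = x
    return cost
```

With a step change in θ_t, the lookahead version starts moving before the change and typically incurs lower total cost than the myopic W = 0 baseline, mirroring the qualitative benefit of predictions that the article quantifies.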

Additional Information

© 2020 IEEE. Manuscript received August 21, 2020; accepted October 28, 2020. Date of publication November 24, 2020; date of current version September 27, 2021. This work was supported in part by the NSF CAREER Award under Grant ECCS-1553407, in part by the AFOSR YIP under Grant FA9550-18-1-0150, and in part by the ONR YIP under Grant N00014-19-1-2217. Recommended by Associate Editor S. Grammatico.

Attached Files

Submitted - 1801.07780.pdf (586.9 kB, md5:829d9a410b0c1a8b50ac619753cb221e)

Additional details

Created: August 20, 2023
Modified: October 23, 2023