Published April 2, 2020 | Accepted Version
Report Open

Logarithmic Regret Bound in Partially Observable Linear Dynamical Systems

Abstract

We study the problem of adaptive control in partially observable linear dynamical systems. We propose a novel algorithm, the adaptive control online learning algorithm (AdaptOn), which efficiently explores the environment, estimates the system dynamics episodically, and exploits these estimates to design effective controllers that minimize the cumulative cost. Through interaction with the environment, AdaptOn deploys online convex optimization to optimize the controller while simultaneously learning the system dynamics to improve the accuracy of its controller updates. We show that when the cost functions are strongly convex, after T time steps of agent-environment interaction, AdaptOn achieves a regret upper bound of polylog(T). To the best of our knowledge, AdaptOn is the first algorithm to achieve polylog(T) regret in adaptive control of unknown partially observable linear dynamical systems, which includes linear quadratic Gaussian (LQG) control.
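The abstract's recipe — explore, re-estimate the dynamics at epoch boundaries, and take online gradient steps on the controller between re-estimates — can be illustrated with a toy sketch. This is not the authors' AdaptOn algorithm: it uses a hypothetical scalar system with a static feedback gain, least-squares model fitting, and a simple gradient surrogate for the controller cost, purely to show the episodic estimate-then-optimize loop structure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical (unknown to the learner) scalar system:
#   x_{t+1} = a*x_t + b*u_t + w_t,   y_t = x_t + v_t
a_true, b_true = 0.9, 0.5


def run_adaptive_control(T=2000, epochs=4):
    """Toy episodic adaptive control loop (illustrative only):
    re-estimate (a, b) by least squares at epoch boundaries, and take
    online gradient steps on a feedback gain k (u_t = -k*y_t) between
    boundaries, using the current model estimate."""
    a_hat, b_hat = 0.0, 0.0      # model estimates, refit each epoch
    k = 0.0                      # feedback gain, updated online
    x = 0.0
    hist = []                    # (y_t, u_t, x_{t+1}) samples for least squares
    costs = []
    boundaries = {T * (i + 1) // epochs for i in range(epochs)}
    for t in range(T):
        y = x + 0.01 * rng.standard_normal()        # noisy observation
        u = -k * y + 0.1 * rng.standard_normal()    # control + exploration noise
        x_next = a_true * x + b_true * u + 0.01 * rng.standard_normal()
        costs.append(y**2 + u**2)                   # quadratic stage cost
        hist.append((y, u, x_next))
        # Gradient of a closed-loop cost surrogate (a_hat - b_hat*k)^2 + k^2
        grad = -2.0 * b_hat * (a_hat - b_hat * k) + 2.0 * k
        k -= 0.01 * grad
        x = x_next
        if (t + 1) in boundaries:                   # epoch boundary: refit model
            Z = np.array([(h[0], h[1]) for h in hist])
            tgt = np.array([h[2] for h in hist])
            theta, *_ = np.linalg.lstsq(Z, tgt, rcond=None)
            a_hat, b_hat = theta
    return a_hat, b_hat, k, float(np.mean(costs[-T // epochs:]))
```

In this sketch the model estimates improve with each epoch, which in turn sharpens the controller's gradient steps — the same interplay the abstract credits for AdaptOn's polylog(T) regret, though the real algorithm operates on general partially observable systems with far more careful exploration and estimation.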

Additional Information

S. Lale is supported in part by DARPA PAI. K. Azizzadenesheli gratefully acknowledges the financial support of Raytheon and Amazon Web Services. B. Hassibi is supported in part by the National Science Foundation under grants CNS-0932428, CCF-1018927, CCF-1423663 and CCF-1409204, by a grant from Qualcomm Inc., by NASA's Jet Propulsion Laboratory through the President and Director's Fund, and by King Abdullah University of Science and Technology. A. Anandkumar is supported in part by a Bren endowed chair, DARPA PAIHR00111890035 and LwLL grants, Raytheon, and Microsoft, Google, and Adobe faculty fellowships.

Attached Files

Accepted Version - 2003.11227.pdf (1.1 MB)
md5:e8c73650636a8f3d0ff8fb04ea11ecb1

Additional details

Created: August 19, 2023
Modified: October 20, 2023