Suboptimal stabilizing controllers for linearly solvable systems
Abstract
This paper presents a novel method to synthesize stochastic control Lyapunov functions for a class of nonlinear, stochastic control systems. In this work, the classical nonlinear Hamilton-Jacobi-Bellman partial differential equation is transformed into a linear partial differential equation for a class of systems with a particular constraint on the stochastic disturbance. It is shown that this linear partial differential equation can be relaxed to a linear differential inclusion, allowing approximating polynomial solutions to be generated using sum of squares programming. The resulting solutions are shown to be stochastic control Lyapunov functions with a number of compelling properties; in particular, a priori bounds on trajectory suboptimality are established for these approximate value functions. The result is a technique whereby approximate solutions may be computed with non-increasing error via a hierarchy of semidefinite optimization problems.
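For context, the sketch below outlines the standard exponential (desirability) transformation and relaxation step that the abstract describes. The dynamics, cost structure, noise-scaling condition, and symbols ($f$, $G$, $B$, $R$, $q$, $\lambda$, $\Psi$) are the generic assumptions commonly used for linearly solvable problems and are shown here for illustration; they are not taken verbatim from the paper.

```latex
% Illustrative sketch under assumed generic notation (not the paper's own):
%   dynamics          dx = (f(x) + G(x)u)\,dt + B(x)\,dw
%   running cost      \ell(x,u) = q(x) + \tfrac{1}{2} u^\top R u
%   noise constraint  \lambda\, G(x) R^{-1} G(x)^\top = B(x) B(x)^\top
%
% Stochastic HJB after minimizing over u (with u^* = -R^{-1} G^\top V_x):
%   0 = q + f^\top V_x - \tfrac{1}{2} V_x^\top G R^{-1} G^\top V_x
%         + \tfrac{1}{2}\operatorname{tr}\!\big(B B^\top V_{xx}\big)
%
% The exponential transform V = -\lambda \log \Psi cancels the quadratic
% term under the noise constraint, leaving a PDE that is linear in \Psi:
\[
  \frac{q(x)}{\lambda}\,\Psi(x)
    \;=\; f(x)^\top \Psi_x(x)
    \;+\; \tfrac{1}{2}\operatorname{tr}\!\big(B(x) B(x)^\top \Psi_{xx}(x)\big).
\]
% Relaxing this equality to one-sided inequalities (a linear differential
% inclusion) allows a search over polynomial candidates \Psi of degree d,
% with the inequalities certified by sum-of-squares multipliers; raising d
% yields the hierarchy of semidefinite programs referenced in the abstract.
```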
Additional Information
© 2015 IEEE.

Attached Files
- Submitted - 1509.07922v1.pdf (485.8 kB, md5:82a7ed2b3e43baf994b10233b55b15a5)
Additional details
- Eprint ID
- 64530
- DOI
- 10.1109/CDC.2015.7403348
- Resolver ID
- CaltechAUTHORS:20160217-101732457
- Created
- 2016-02-17 (from EPrint's datestamp field)
- Updated
- 2021-11-10 (from EPrint's last_modified field)