Published August 2011
Book Section - Chapter

Uncertain semantics, representation nuisances, and necessary invariance properties of bootstrapping agents

Abstract

In the problem of bootstrapping, an agent must learn to use an unknown body, in an unknown world, starting from zero information about the world, its sensors, and its actuators. So far, this fascinating problem has not been given a proper formalization. In this paper, we provide a possible rigorous definition of one of the key aspects of bootstrapping, namely the fact that an agent must be able to use "uninterpreted" observations and commands. We show that this can be formalized by positing the existence of representation nuisances that act on the data, and which must be tolerated by an agent. The classes of nuisances tolerated indirectly encode the assumptions needed about the world, and therefore the agent's ability to solve smaller or larger classes of bootstrapping problem instances. Moreover, we argue that the behavior of an agent that claims optimality must actually be invariant to the representation nuisances, and we discuss several design principles to obtain such invariance.
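As a rough sketch of the invariance property the abstract describes (the notation below is illustrative and not taken from the paper): write $\boldsymbol{y}$ for the stream of uninterpreted observations and $\boldsymbol{u}$ for the stream of commands, and model a representation nuisance as a pair of invertible relabelings $(g, h)$ acting on the data the agent receives and emits,

$$ \boldsymbol{y} \;\mapsto\; g(\boldsymbol{y}), \qquad \boldsymbol{u} \;\mapsto\; h(\boldsymbol{u}). $$

One way to state that an agent $A$ (a map from observations to commands) tolerates such a nuisance is an equivariance condition: acting on relabeled observations should produce correspondingly relabeled commands, so that the closed-loop interaction with the world is unchanged,

$$ A\bigl(g(\boldsymbol{y})\bigr) \;=\; h\bigl(A(\boldsymbol{y})\bigr). $$

Under this reading, the larger the class of pairs $(g, h)$ for which the condition holds, the weaker the assumptions the agent makes about its sensors and actuators.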

Additional Information

© 2011 IEEE. Date of Current Version: 10 October 2011. Thanks to Scott Livingston for the many insightful comments on very short notice.

Additional details

Created: August 19, 2023
Modified: October 24, 2023