Published November 11, 2005 | Public
Report | Open Access

Networks of Relations for Representation, Learning, and Generalization

Abstract

We propose representing knowledge as a network of relations. Each relation relates only a few continuous or discrete variables, so that any overall relationship among the many variables treated by the network winds up being distributed throughout the network. Each relation encodes which combinations of values correspond to past experience for the variables related by the relation. Variables may or may not correspond to understandable aspects of the situation being modeled by the network. A distributed calculational process can be used to access the information stored in such a network, allowing the network to function as an associative memory. This process in its simplest form is purely inhibitory, narrowing down the space of possibilities as much as possible given the data to be matched. In contrast with methods that always retrieve a best fit for all variables, this method can return values for inferred variables while leaving non-inferable variables in an unknown or partially known state. In contrast with belief propagation methods, this method can be proven to converge quickly and uniformly for any network topology, allowing networks to be as interconnected as the relationships warrant, with no independence assumptions required. The generalization properties of such a memory are aligned with the network's relational representation of how the various aspects of the modeled situation are related.
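The retrieval process sketched in the abstract can be illustrated with a small constraint-propagation example. The Python code below is not taken from the report; it is a minimal sketch, assuming relations are stored as sets of previously observed value tuples, and the purely inhibitory process is rendered as generalized arc-consistency-style pruning. All names (Relation, RelationNetwork, prune, query) are illustrative.

```python
class Relation:
    """A relation over a few variables, stored as the set of value
    combinations that correspond to past experience."""

    def __init__(self, variables, allowed_tuples):
        self.variables = tuple(variables)            # e.g. ("a", "b")
        self.allowed = set(map(tuple, allowed_tuples))

    def prune(self, domains):
        """Shrink each related variable's domain to values supported by at
        least one allowed tuple consistent with all current domains.
        Purely inhibitory: values are only ever removed, never added.
        Returns True if any domain changed."""
        supported = {v: set() for v in self.variables}
        for tup in self.allowed:
            if all(val in domains[var] for var, val in zip(self.variables, tup)):
                for var, val in zip(self.variables, tup):
                    supported[var].add(val)
        changed = False
        for var in self.variables:
            new_domain = domains[var] & supported[var]
            if new_domain != domains[var]:
                domains[var] = new_domain
                changed = True
        return changed


class RelationNetwork:
    """A network of relations queried as an associative memory."""

    def __init__(self, domains, relations):
        self.initial_domains = {v: set(vals) for v, vals in domains.items()}
        self.relations = relations

    def query(self, evidence):
        """Clamp observed variables, then prune until a fixed point.
        Unobserved variables may end up fully determined, partially
        narrowed, or untouched -- they are never forced to a best guess."""
        domains = {v: set(vals) for v, vals in self.initial_domains.items()}
        for var, val in evidence.items():
            domains[var] = {val}
        changed = True
        while changed:                        # monotone shrinking => terminates
            changed = False
            for relation in self.relations:
                if relation.prune(domains):
                    changed = True
        return domains


# Hypothetical example: two relations sharing variable "b".
net = RelationNetwork(
    domains={"a": {0, 1}, "b": {0, 1}, "c": {0, 1}},
    relations=[
        Relation(("a", "b"), [(0, 0), (1, 1)]),
        Relation(("b", "c"), [(0, 1), (1, 0), (1, 1)]),
    ],
)
print(net.query({"a": 0}))   # {'a': {0}, 'b': {0}, 'c': {1}}
print(net.query({"a": 1}))   # {'a': {1}, 'b': {1}, 'c': {0, 1}}  (c not inferable)
```

Because the process only removes candidate values, each pass shrinks a finite set monotonically, which is why convergence is guaranteed for any network topology in this sketch; the second query shows a variable left in a partially known state rather than forced to a best fit.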

Files

etr071.pdf (117.7 kB)
md5:0d01dd699cb31b011c49cabe97579ee8

Additional details

Created: August 19, 2023
Modified: October 24, 2023