Published January 9, 2020 | Submitted
Report | Open access

Memory Augmented Recursive Neural Networks

Abstract

Recursive neural networks have shown impressive performance for modeling compositional data compared to their recurrent counterparts. Although recursive neural networks are better at capturing long-range dependencies, their generalization performance starts to decay as the test data becomes more compositional and potentially deeper than the training data. In this paper, we present memory-augmented recursive neural networks to address this loss of generalization performance on deeper data points. We augment Tree-LSTMs with an external memory, namely neural stacks. We define soft push and pop operations for filling and emptying the memory to ensure that the networks remain end-to-end differentiable. To assess the effectiveness of the external memory, we evaluate our model on a neural programming task introduced in the literature called equation verification. Our results indicate that augmenting recursive neural networks with external memory consistently improves the generalization performance on deeper data points compared to the state-of-the-art Tree-LSTM, by up to 10%.
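
The soft push and pop operations mentioned in the abstract can be illustrated with a continuous (neural) stack in which each stored value carries a scalar strength. The sketch below is a minimal NumPy illustration of one common formulation of such a differentiable stack; it is not the paper's exact model, and the class and method names are ours for illustration only.

import numpy as np

class SoftStack:
    # Illustrative continuous stack: each entry is a value vector paired with
    # a scalar strength in [0, 1] indicating how "present" it is on the stack.

    def __init__(self, dim):
        self.values = np.zeros((0, dim))   # stack contents, bottom to top
        self.strengths = np.zeros(0)       # strength of each entry

    def push(self, value, push_signal):
        # Soft push: append the new value with strength equal to the push signal.
        self.values = np.vstack([self.values, value])
        self.strengths = np.append(self.strengths, push_signal)

    def pop(self, pop_signal):
        # Soft pop: remove up to `pop_signal` units of strength, top down.
        remaining = pop_signal
        for i in range(len(self.strengths) - 1, -1, -1):
            removed = min(self.strengths[i], remaining)
            self.strengths[i] -= removed
            remaining -= removed
            if remaining <= 0:
                break

    def read(self):
        # Soft read: strength-weighted blend of the topmost entries,
        # using at most one unit of total strength.
        out = np.zeros(self.values.shape[1])
        budget = 1.0
        for i in range(len(self.strengths) - 1, -1, -1):
            w = min(self.strengths[i], budget)
            out += w * self.values[i]
            budget -= w
            if budget <= 0:
                break
        return out

# Example usage (hypothetical values): push two vectors, softly pop half a unit,
# then read a blend dominated by the remaining top-of-stack content.
stack = SoftStack(3)
stack.push(np.array([1.0, 0.0, 0.0]), 1.0)
stack.push(np.array([0.0, 1.0, 0.0]), 0.6)
stack.pop(0.5)
print(stack.read())

Because the push and pop signals are continuous rather than binary, every operation above is a smooth function of its inputs, which is what allows such a memory to be trained end-to-end with the Tree-LSTM by backpropagation.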

Additional Information

A. Anandkumar is supported by a Bren Chair professorship, DARPA PAI HR0011-18-9-0035, and faculty awards from Adobe, BMW, Microsoft, and Google.

Files

Submitted - 1911.01545.pdf (869.0 kB)
md5:f7a81e628d41f5598e8df54d1029aec7

Additional details

Created: August 19, 2023
Modified: October 18, 2023