Memory Augmented Recursive Neural Networks
Abstract
Recursive neural networks have shown impressive performance for modeling compositional data compared to their recurrent counterparts. Although recursive neural networks are better at capturing long-range dependencies, their generalization performance starts to decay as the test data becomes more compositional and potentially deeper than the training data. In this paper, we present memory-augmented recursive neural networks to address this generalization performance loss on deeper data points. We augment Tree-LSTMs with an external memory, namely neural stacks. We define soft push and pop operations for filling and emptying the memory to ensure that the networks remain end-to-end differentiable. In order to assess the effectiveness of the external memory, we evaluate our model on a neural programming task introduced in the literature called equation verification. Our results indicate that augmenting recursive neural networks with external memory consistently improves the generalization performance on deeper data points compared to the state-of-the-art Tree-LSTM by up to 10%.
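The soft push and pop operations mentioned in the abstract refer to a differentiable, neural-stack-style memory, where each stored vector carries a continuous strength and the gates take values in [0, 1] rather than making discrete decisions. The sketch below is only an illustration of such soft stack mechanics in isolation, not the paper's implementation: the paper attaches the memory to Tree-LSTM nodes, and the function and variable names here (`soft_stack_step`, `push_gate`, `pop_gate`) are assumptions chosen for readability.

```python
import numpy as np

def soft_stack_step(values, strengths, push_gate, pop_gate, new_value):
    """One soft update of a neural-stack-style memory.

    values:    (T, d) array of previously pushed vectors.
    strengths: (T,)   array of their current soft strengths in [0, 1].
    push_gate: scalar in [0, 1], how strongly to push new_value.
    pop_gate:  scalar in [0, 1], how strongly to pop from the top.
    new_value: (d,)   vector to push.
    """
    # Soft pop: remove up to `pop_gate` total strength, starting from the top.
    remaining_pop = pop_gate
    new_strengths = strengths.astype(float).copy()
    for i in reversed(range(len(new_strengths))):
        removed = min(new_strengths[i], remaining_pop)
        new_strengths[i] -= removed
        remaining_pop -= removed

    # Soft push: append the new vector with strength push_gate.
    values = np.vstack([values, new_value[None, :]])
    new_strengths = np.append(new_strengths, push_gate)

    # Soft read: blend the topmost vectors until one unit of strength is used.
    read = np.zeros_like(new_value, dtype=float)
    budget = 1.0
    for i in reversed(range(len(new_strengths))):
        take = min(new_strengths[i], budget)
        read += take * values[i]
        budget -= take
    return values, new_strengths, read

# Example: start with an empty stack and push one vector almost fully.
d = 4
values, strengths = np.zeros((0, d)), np.zeros(0)
values, strengths, read = soft_stack_step(
    values, strengths, push_gate=0.9, pop_gate=0.0, new_value=np.ones(d))
```

Because every step is a composition of smooth (or piecewise-linear) operations on the gate values, gradients can flow through the memory, which is what keeps the augmented network end-to-end differentiable.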
Additional Information
A. Anandkumar is supported by Bren Chair professorship, DARPA PAI HR0011-18-9-0035, Faculty awards from Adobe, BMW, Microsoft and Google.
Files
Name | Size | md5
---|---|---
Submitted - 1911.01545.pdf | 869.0 kB | f7a81e628d41f5598e8df54d1029aec7
Additional details
- Eprint ID: 100579
- Resolver ID: CaltechAUTHORS:20200109-090330653
- Funders: Bren Professor of Computing and Mathematical Sciences; Defense Advanced Research Projects Agency (DARPA) HR0011-18-9-0035; Adobe; BMW; Microsoft
- Created: 2020-01-09 (from EPrint's datestamp field)
- Updated: 2023-06-02 (from EPrint's last_modified field)