Published September 6, 2019 | Submitted
Report Open

Robust Deep Networks with Randomized Tensor Regression Layers

Abstract

In this paper, we propose a novel randomized tensor decomposition for tensor regression. It allows the weights of tensor regression layers to be stochastically approximated by random sampling in the low-rank subspace. We theoretically and empirically establish the link between our proposed stochastic rank-regularization and dropout on low-rank tensor regression. This acts as an additional stochastic regularization on the regression weights, which, combined with the deterministic regularization imposed by the low-rank constraint, improves both the performance and robustness of neural networks augmented with it. In particular, it makes the model more robust to adversarial attacks and random noise without requiring any adversarial training. We perform a thorough study of our method on synthetic data, object classification on the CIFAR-100 and ImageNet datasets, and large-scale brain-age prediction on the UK Biobank brain MRI dataset. We demonstrate superior performance in all cases, as well as significant improvements in robustness to adversarial attacks and random noise.
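The core idea described above — randomly sampling rank components of a low-rank regression weight, analogous to dropout along the rank dimension — can be illustrated with a minimal sketch. This is a hypothetical matrix (rank-1 sum) simplification for intuition only, not the paper's actual tensor regression layer; all names, dimensions, and the inverted-dropout scaling are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch of stochastic rank-regularization on a low-rank regression
# weight W = U @ V.T (rank R). At training time a Bernoulli mask drops
# rank components, which is analogous to dropout applied along the rank
# dimension of the decomposition. All names here are illustrative.
d_in, d_out, R = 8, 4, 6   # input dim, output dim, rank
keep_prob = 0.5            # probability of keeping each rank component

U = rng.standard_normal((d_in, R))
V = rng.standard_normal((d_out, R))
x = rng.standard_normal(d_in)

def forward(x, train=True):
    if train:
        mask = rng.random(R) < keep_prob   # sample a random low-rank subspace
        U_s = U[:, mask] / keep_prob       # inverted-dropout scaling
        V_s = V[:, mask]
    else:
        U_s, V_s = U, V                    # full low-rank weight at test time
    return (U_s @ V_s.T).T @ x             # y = W.T x with W = U_s @ V_s.T

y_train = forward(x, train=True)   # stochastic low-rank output
y_test = forward(x, train=False)   # deterministic output, shape (d_out,)
```

With inverted-dropout scaling, the training-time weight matrix equals the full low-rank weight in expectation, so no rescaling is needed at test time.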

Additional Information

This research has been conducted using the UK Biobank Resource under Application Number 18545.

Attached Files

Submitted - 1902.10758.pdf

Files

1902.10758.pdf (2.4 MB)
md5:3a15bf7b59e89d8e9234a8d12281ec6c

Additional details

Created:
August 19, 2023
Modified:
October 18, 2023