Published April 1, 2019 | Submitted | Report | Open Access

Convolutional Dictionary Learning through Tensor Factorization

Abstract

Tensor methods have emerged as a powerful paradigm for consistent learning of many latent variable models, such as topic models, independent component analysis, and dictionary learning. Model parameters are estimated via CP decomposition of the observed higher-order input moments. However, in many domains additional invariances, such as shift invariance, exist and are enforced via models such as convolutional dictionary learning. In this paper, we develop novel tensor decomposition algorithms for parameter estimation of convolutional models. Our algorithm is based on the popular alternating least squares method, but with efficient projections onto the space of stacked circulant matrices. Our method is embarrassingly parallel and consists of simple operations such as fast Fourier transforms and matrix multiplications. Our algorithm converges to the dictionary much faster and more accurately than alternating minimization over filters and activation maps.
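The paper itself gives the full tensor decomposition algorithm; as a minimal illustration of one ingredient the abstract mentions, here is a sketch of a least-squares projection onto the space of circulant matrices, together with FFT-based circulant multiplication. The function names and the single-block setting are illustrative assumptions, not the paper's implementation (which projects onto *stacked* circulant matrices inside an ALS loop).

```python
import numpy as np

def project_to_circulant(M):
    """Least-squares projection of a square matrix onto circulant
    matrices: average the entries along each circulant diagonal.
    (Illustrative sketch; not the paper's stacked-circulant projection.)"""
    n = M.shape[0]
    # c[k] = mean of the k-th circulant diagonal -> first column of the result
    c = np.array([np.mean([M[(j + k) % n, j] for j in range(n)])
                  for k in range(n)])
    # Assemble C with C[i, j] = c[(i - j) mod n]
    return np.array([[c[(i - j) % n] for j in range(n)] for i in range(n)])

def circulant_matvec(c, x):
    """Multiply the circulant matrix with first column c by x via the FFT
    (circulants are diagonalized by the discrete Fourier transform)."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
C = project_to_circulant(M)
# The projection is idempotent: projecting a circulant matrix is a no-op.
assert np.allclose(project_to_circulant(C), C)
# FFT-based multiplication matches the dense matrix-vector product.
x = rng.standard_normal(5)
assert np.allclose(circulant_matvec(C[:, 0], x), C @ x)
```

The FFT route is what makes such projections cheap: every circulant is diagonalized by the same DFT basis, so multiplication costs O(n log n) instead of O(n²), which is consistent with the abstract's claim that the method reduces to FFTs and matrix multiplications.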

Additional Information

We thank Cris Cecka for helpful discussions on the fast implementation of block matrix inverses, and Majid Janzamin and Hanie Sedghi for initial discussions on Toeplitz matrices.

Attached Files

Submitted - 1506.03509.pdf

Files

1506.03509.pdf (220.8 kB, md5:842db9a6218774800ecfe77286970299)

Additional details

Created: August 20, 2023
Modified: October 20, 2023