Published December 2015 | Submitted
Book Section - Chapter Open

Fast and Guaranteed Tensor Decomposition via Sketching

Abstract

Tensor CANDECOMP/PARAFAC (CP) decomposition has wide applications in statistical learning of latent variable models and in data mining. In this paper, we propose fast, randomized tensor CP decomposition algorithms based on sketching. We build on the idea of count sketches but introduce several novel ideas that are unique to tensors. We develop new methods for randomized computation of tensor contractions via FFTs, without explicitly forming the tensors; such contractions are encountered in decomposition methods such as tensor power iterations and alternating least squares. We also design novel colliding hashes for symmetric tensors to further reduce the time needed to compute the sketches. We then combine these sketching ideas with existing whitening and tensor power iteration techniques to obtain the fastest algorithm on both sparse and dense tensors. The quality of approximation under our method does not depend on properties such as sparsity or uniformity of elements. We apply the method to topic modeling and obtain competitive results.
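
The computational trick the abstract refers to can be seen in a small example: the count sketch of a rank-1 tensor u ⊗ v ⊗ w is the circular convolution of the count sketches of u, v and w, so it can be formed via FFT without materializing the tensor, and inner products of such sketches estimate tensor contractions. The NumPy fragment below is an illustrative sketch of that idea only, not the authors' implementation; the dimensions, hash functions and sketch length b are choices made for the example rather than taken from the paper.

import numpy as np

def count_sketch(x, h, s, b):
    """Hash the entries of x into b buckets with random signs."""
    sk = np.zeros(b)
    np.add.at(sk, h, s * x)
    return sk

def rank1_sketch(vecs, h, s, b):
    """Sketch of the outer product of `vecs` via the convolution theorem (FFT)."""
    spectra = [np.fft.rfft(count_sketch(x, h[i], s[i], b)) for i, x in enumerate(vecs)]
    return np.fft.irfft(np.prod(spectra, axis=0), n=b)

rng = np.random.default_rng(0)
n, b = 200, 8192                        # ambient dimension, sketch length (illustrative)
u, v, w = rng.standard_normal((3, n))   # factors of a rank-1 tensor T = u (x) v (x) w

h = rng.integers(0, b, size=(3, n))     # one hash function per tensor mode
s = rng.choice([-1.0, 1.0], size=(3, n))

sketch_T = rank1_sketch((u, v, w), h, s, b)

# Estimate the contraction T(x, y, z) = (u.x)(v.y)(w.z) from sketches alone;
# contractions of this kind are the workhorse of tensor power iterations and ALS.
x, y, z = (t + 0.3 * rng.standard_normal(n) for t in (u, v, w))
sketch_query = rank1_sketch((x, y, z), h, s, b)

estimate = sketch_T @ sketch_query
exact = (u @ x) * (v @ y) * (w @ z)
print(estimate, exact)   # noisy but unbiased estimate; the error shrinks with larger b
                         # and with medians over independent sketches, a standard
                         # count-sketch device

Here the inner product of the two sketches approximates the contraction directly, so no n x n x n tensor is ever built; the paper develops this further with colliding hashes for symmetric tensors and combines it with whitening and tensor power iterations.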

Additional Information

© 2015 Neural Information Processing Systems Foundation. Anima Anandkumar is supported in part by the Microsoft Faculty Fellowship and the Sloan Foundation. Alex Smola is supported in part by a Google Faculty Research Grant.

Attached Files

Submitted - 1506.04448.pdf

Files

1506.04448.pdf (687.6 kB, md5: 777ca4103f91a5f87fc77d7507db886d)

Additional details

Created: August 20, 2023
Modified: October 20, 2023