Published December 2016 | Supplemental Material + Published
Book Section - Chapter (Open Access)

Online and Differentially-Private Tensor Decomposition

Abstract

Tensor decomposition is positioned to be a pervasive tool in the era of big data. In this paper, we resolve many of the key algorithmic questions regarding robustness, memory efficiency, and differential privacy of tensor decomposition. We propose simple variants of the tensor power method that enjoy these strong properties. We give the first streaming method with a linear memory requirement. Moreover, we present a noise-calibrated tensor power method with efficient privacy guarantees. At the heart of all these guarantees lies a careful perturbation analysis derived in this paper, which improves upon the existing results significantly.
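To make the abstract's ingredients concrete, here is a minimal NumPy sketch of a tensor power iteration for a symmetric third-order tensor, with an optional Gaussian noise term of the kind a noise-calibrated variant would add, plus a streaming contraction that never materializes the full tensor. The function names, the exact noise placement, and the scale parameter sigma are illustrative assumptions, not the paper's precise algorithm or its calibrated noise levels.

```python
import numpy as np

def noisy_tensor_power(T, n_iters=30, sigma=0.0, rng=None):
    """Tensor power method for a symmetric 3rd-order tensor T.

    Each step contracts T along two modes with u (v_i = sum_jk T[i,j,k] u_j u_k)
    and renormalizes. sigma > 0 adds Gaussian noise to each step, sketching a
    noise-calibrated (privacy-oriented) variant; sigma=0 is the plain method.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = T.shape[0]
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)
    for _ in range(n_iters):
        v = np.einsum('ijk,j,k->i', T, u, u) + sigma * rng.standard_normal(d)
        u = v / np.linalg.norm(v)
    # Rayleigh-quotient-style estimate of the recovered eigenvalue.
    lam = np.einsum('ijk,i,j,k->', T, u, u, u)
    return lam, u

def streaming_contraction(samples, u):
    """Estimate the third-moment contraction E[(x.u)^2 x] one sample at a time.

    Uses O(d) memory per sample instead of forming the d x d x d empirical
    moment tensor, illustrating how a streaming power method can keep a
    linear memory footprint. (Illustrative helper, not the paper's method.)
    """
    v = np.zeros_like(u)
    n = 0
    for x in samples:
        v += (x @ u) ** 2 * x
        n += 1
    return v / n
```

For example, on a rank-one tensor T = 3 a⊗a⊗a the iteration converges to the component a and recovers the weight 3; with sigma > 0 the same loop trades accuracy for the perturbation that privacy calibration would require.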

Additional Information

A. Anandkumar is supported in part by a Microsoft Faculty Fellowship, NSF CAREER award CCF-1254106, ONR Award N00014-14-1-0665, ARO YIP Award W911NF-13-1-0084, and AFOSR YIP Award FA9550-15-1-0221.

Attached Files

Published - p3539-wang.pdf

Supplemental Material - 6498-online-and-differentially-private-tensor-decomposition-supplemental.zip

Files

6498-online-and-differentially-private-tensor-decomposition-supplemental.zip (1.3 MB total)

md5:0ccddaa82ada7db0b85ff6594d78d77c (896.1 kB)
md5:1a9802d8b79192e3d62026c172f2e2fa (364.4 kB)

Additional details

Created:
August 19, 2023
Modified:
October 20, 2023