Published January 2022 | Submitted version
Journal Article | Open Access

Spectral analysis of weighted Laplacians arising in data clustering

Abstract

Graph Laplacians computed from weighted adjacency matrices are widely used to identify geometric structure in data, and clusters in particular; their spectral properties play a central role in a number of unsupervised and semi-supervised learning algorithms. When suitably scaled, graph Laplacians approach limiting continuum operators in the large data limit. Studying these limiting operators, therefore, sheds light on learning algorithms. This paper is devoted to the study of a parameterized family of divergence form elliptic operators that arise as the large data limit of graph Laplacians. The link between a three-parameter family of graph Laplacians and a three-parameter family of differential operators is explained. The spectral properties of these differential operators are analyzed in the situation where the data comprises two nearly separated clusters, in a sense which is made precise. In particular, we investigate how the spectral gap depends on the three parameters entering the graph Laplacian, and on a parameter measuring the size of the perturbation from the perfectly clustered case. Numerical results are presented which exemplify the analysis and which extend it in the following ways: the computations study situations in which there are two nearly separated clusters, but which violate the assumptions used in our theory; situations in which more than two clusters are present, also going beyond our theory; and situations which demonstrate the relevance of our studies of differential operators for the understanding of finite data problems via the graph Laplacian. The findings provide insight into parameter choices made in learning algorithms which are based on weighted adjacency matrices; they also provide the basis for analysis of the consistency of various unsupervised and semi-supervised learning algorithms, in the large data limit.
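For readers who want a concrete picture of the objects in the abstract, the following minimal Python sketch (NumPy only) builds a kernel-weighted adjacency matrix from data, forms a normalized graph Laplacian, and measures the spectral gap on two nearly separated clusters. It is an illustration, not the paper's exact three-parameter construction: the diffusion-maps-style one-parameter renormalization, the bandwidth eps, the exponent alpha, and the synthetic two-Gaussian data are all assumptions made here for demonstration.

import numpy as np

def weighted_laplacian(X, eps=0.3, alpha=0.5):
    # Gaussian kernel weights from pairwise squared distances.
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / (2 * eps ** 2))
    # Degree renormalization W_alpha = D^{-alpha} W D^{-alpha}
    # (Coifman-Lafon diffusion maps; one member of the broader
    # parameterized families of weighted graph Laplacians).
    d = W.sum(axis=1)
    W_alpha = W / np.outer(d ** alpha, d ** alpha)
    # Random-walk normalization: L = I - D_alpha^{-1} W_alpha.
    d_alpha = W_alpha.sum(axis=1)
    return np.eye(len(X)) - W_alpha / d_alpha[:, None]

# Two nearly separated clusters in one dimension: the gap between the
# second and third eigenvalues reflects the two-cluster structure.
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(-1.0, 0.1, 100),
                    rng.normal(1.0, 0.1, 100)])[:, None]
lam = np.sort(np.linalg.eigvals(weighted_laplacian(X)).real)
print("first three eigenvalues:", lam[:3])
print("spectral gap lambda_3 - lambda_2:", lam[2] - lam[1])

In this toy setting, moving the cluster means closer together drives the second eigenvalue up and shrinks the gap, which is the qualitative behavior the paper quantifies for the limiting continuum operators.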

Additional Information

© 2021 Elsevier Inc. Received 13 September 2019, Revised 18 April 2021, Accepted 30 July 2021, Available online 1 September 2021. The authors are grateful to Nicolás García Trillos for helpful discussions regarding the results in Section 5 concerning various graph Laplacians and their continuum limits. We are also thankful to the anonymous reviewers whose comments and suggestions helped us improve an earlier version of this article. AMS is grateful to AFOSR (grant FA9550-17-1-0185) and NSF (grant DMS 18189770) for financial support. FH was partially supported by Caltech's von Kármán postdoctoral instructorship. BH was partially supported by an NSERC PDF fellowship.

Attached Files

Submitted - 1909.06389.pdf (1.7 MB; md5:5d78ba51490799fbe12660dfc705d7e4)

Additional details

Created: August 22, 2023
Modified: October 19, 2023