Published November 2019 | public
Book Section - Chapter

Deep Inductive Matrix Completion for Biomedical Interaction Prediction

Abstract

In many real-world tasks, side information is available in addition to the observed entries of the matrix completion problem. To exploit this information, an inductive approach to matrix completion was proposed in which the matrix entries are modeled as a bilinear function of real-valued feature vectors associated with the rows and columns. However, this approach is not effective at handling data with nonlinear structure. In this paper, we propose a novel model called Deep Inductive Matrix Completion (DIMC) for nonlinear inductive matrix completion. DIMC consists of two deep neural networks that extract latent features from the high-dimensional side vectors and then predict their relationships using those latent features. In DIMC, the parameters of the two networks are optimized alternately to minimize the reconstruction error; the missing entries can then be recovered directly from the side vectors of the rows and columns. We compare DIMC with state-of-the-art linear and nonlinear matrix completion methods on drug repositioning, gene-disease association prediction, and miRNA-disease association prediction. The experimental results verify that DIMC achieves higher accuracy than existing methods and can predict inductively on new row-column interactions given auxiliary side information. In addition, we discuss how the alternating-training frequency affects the performance of DIMC, and how this property can be exploited to implement a GPU-based parallel computing algorithm that significantly shortens training time.
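The abstract describes the core idea: two neural networks map row and column side vectors to latent features, the matrix entry is predicted as their inner product, and the two networks are trained alternately. The following is a minimal NumPy sketch of that scheme, not the authors' implementation — the network sizes, the single tanh hidden layer, the synthetic data, and the learning rate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem sizes (assumptions, not from the paper):
# 30 rows, 20 columns, side-vector dims 8 and 6, latent dim 4, hidden dim 16.
n_rows, n_cols, d_row, d_col, k, h = 30, 20, 8, 6, 4, 16

X = rng.normal(size=(n_rows, d_row))  # row side vectors
Y = rng.normal(size=(n_cols, d_col))  # column side vectors

# Synthetic nonlinear ground-truth interaction matrix to complete.
M = np.tanh(X @ rng.normal(size=(d_row, k))) @ np.tanh(Y @ rng.normal(size=(d_col, k))).T
mask = rng.random((n_rows, n_cols)) < 0.5  # observed-entry indicator

def init_net(d_in, d_out):
    # One-hidden-layer network: tanh hidden layer, linear output.
    return [rng.normal(scale=0.3, size=(d_in, h)),
            rng.normal(scale=0.3, size=(h, d_out))]

def forward(net, Z):
    H = np.tanh(Z @ net[0])  # hidden activations
    return H, H @ net[1]     # latent features

f, g = init_net(d_row, k), init_net(d_col, k)  # row and column networks

def loss():
    U, V = forward(f, X)[1], forward(g, Y)[1]
    R = (U @ V.T - M) * mask  # reconstruction error on observed entries only
    return (R ** 2).sum() / mask.sum()

lr = 0.01

def step(net, Z, other_latent, target, m):
    # One gradient step on this network with the other network's latents fixed.
    H, U = forward(net, Z)
    gU = 2 * ((U @ other_latent.T - target) * m) @ other_latent / m.sum()
    net[1] -= lr * H.T @ gU                       # output-layer gradient
    gH = (gU @ net[1].T) * (1 - H ** 2)           # backprop through tanh
    net[0] -= lr * Z.T @ gH                       # hidden-layer gradient

loss_before = loss()
for _ in range(200):  # alternating optimization
    V = forward(g, Y)[1]
    step(f, X, V, M, mask)            # update row net, column net fixed
    U = forward(f, X)[1]
    step(g, Y, U, M.T, mask.T)        # update column net, row net fixed
loss_after = loss()
```

After training, a missing entry (i, j) is recovered inductively as the inner product of the learned latent features of row side vector i and column side vector j, so new rows or columns only need their side vectors, not retraining from scratch.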

Additional Information

© 2019 IEEE. The authors would like to thank Fen Pei from University of Pittsburgh for early stage discussions. This work is supported by the National Institutes of Health grants R01-GM093156 and P30-DA035778.

Additional details

Created:
August 19, 2023
Modified:
October 20, 2023