Published June 2021 | Accepted Version
Book Section - Chapter | Open Access

Learning View-Disentangled Human Pose Representation by Contrastive Cross-View Mutual Information Maximization

Abstract

We introduce a novel representation learning method to disentangle pose-dependent and view-dependent factors from 2D human poses. The method trains a network using cross-view mutual information maximization (CV-MIM), which maximizes the mutual information of the same pose performed from different viewpoints in a contrastive learning manner. We further propose two regularization terms to ensure disentanglement and smoothness of the learned representations. The resulting pose representations can be used for cross-view action recognition. To evaluate the power of the learned representations, in addition to the conventional fully-supervised action recognition settings, we introduce a novel task called single-shot cross-view action recognition. This task trains models with actions from only one single viewpoint, while the models are evaluated on poses captured from all possible viewpoints. We evaluate the learned representations on standard benchmarks for action recognition and show that (i) CV-MIM performs competitively compared with the state-of-the-art models in the fully-supervised scenarios; (ii) CV-MIM outperforms other competing methods by a large margin in the single-shot cross-view setting; and (iii) the learned representations can significantly boost the performance when reducing the amount of supervised training data. Our code is made publicly available at https://github.com/google-research/google-research/tree/master/poem.
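The contrastive cross-view objective described above can be illustrated with a minimal InfoNCE-style sketch: embeddings of the same pose seen from two viewpoints form positive pairs, and other poses in the batch serve as negatives. This is an assumption-laden illustration only; the function name, shapes, and use of PyTorch are hypothetical and do not reflect the authors' actual implementation (see the linked repository for that).

```python
# Hedged sketch of a contrastive cross-view mutual information objective
# (InfoNCE-style). All names and shapes are illustrative assumptions,
# not the CV-MIM implementation from the linked repository.
import torch
import torch.nn.functional as F

def cross_view_infonce(z_view_a, z_view_b, temperature=0.1):
    """Contrastive loss between embeddings of the same poses from two views.

    z_view_a, z_view_b: (batch, dim) embeddings of identical poses captured
    from two different camera viewpoints; matching rows are positive pairs,
    all other rows in the batch act as negatives.
    """
    z_a = F.normalize(z_view_a, dim=1)
    z_b = F.normalize(z_view_b, dim=1)
    logits = z_a @ z_b.t() / temperature                    # (batch, batch) similarities
    labels = torch.arange(z_a.size(0), device=z_a.device)   # positives lie on the diagonal
    # Symmetrized cross-entropy: each view must retrieve its counterpart.
    return 0.5 * (F.cross_entropy(logits, labels) +
                  F.cross_entropy(logits.t(), labels))
```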

Additional Information

© 2021 IEEE. This work was done while the author was a research intern at Google.

Files

Accepted Version - 2012.01405.pdf (4.8 MB)
md5:7f72a7cd47cf4c290bb27fc4df38c3c4

Additional details

Created: August 20, 2023
Modified: October 23, 2023