Correspondence Learning via Linearly-invariant Embedding

In this paper, we propose a fully differentiable pipeline for estimating accurate dense correspondences between 3D point clouds. The proposed pipeline is an extension and a generalization of the functional maps framework. However, instead of using the Laplace-Beltrami eigenfunctions as done in virtually all previous works in this domain, we demonstrate that learning the basis from data can both improve robustness and lead to better accuracy in challenging settings. We interpret the basis as a learned embedding into a higher-dimensional space. Following the functional map paradigm, the optimal transformation in this embedding space must be linear, and we propose a separate architecture aimed at estimating the transformation by learning optimal descriptor functions. This leads to the first end-to-end trainable functional map-based correspondence approach in which both the basis and the descriptors are learned from data. Interestingly, we also observe that learning a \emph{canonical} embedding leads to worse results, suggesting that leaving an extra linear degree of freedom to the embedding network gives it more robustness, thereby also shedding light on the success of previous methods. Finally, we demonstrate that our approach achieves state-of-the-art results in challenging non-rigid 3D point cloud correspondence applications.
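To make the functional-map step concrete, the following is a minimal sketch of how an optimal linear transformation between two learned embeddings can be estimated in closed form from corresponding descriptors, and how a dense map can then be read off by nearest-neighbour search. The function names, matrix shapes, and NumPy implementation are illustrative assumptions, not the paper's actual code.

```python
import numpy as np

def fit_linear_map(phi_x, phi_y, desc_x, desc_y):
    """Estimate the k x k linear map C aligning two learned embeddings,
    in the spirit of the functional map framework (illustrative sketch).

    phi_x  : (n_x, k) learned basis/embedding of the source point cloud
    phi_y  : (n_y, k) learned basis/embedding of the target point cloud
    desc_x : (n_x, d) learned descriptors on the source
    desc_y : (n_y, d) corresponding learned descriptors on the target
    """
    # Express the descriptors in each embedding via least-squares projection.
    a = np.linalg.lstsq(phi_x, desc_x, rcond=None)[0]   # (k, d)
    b = np.linalg.lstsq(phi_y, desc_y, rcond=None)[0]   # (k, d)
    # Solve C a ~= b for C in the least-squares sense (a.T C.T ~= b.T).
    c = np.linalg.lstsq(a.T, b.T, rcond=None)[0].T      # (k, k)
    return c

def point_to_point(phi_x, phi_y, c):
    """Recover a dense correspondence by matching each target point to the
    nearest source point in the aligned embedding space."""
    aligned_x = phi_x @ c.T                              # (n_x, k)
    d2 = ((phi_y[:, None, :] - aligned_x[None, :, :]) ** 2).sum(-1)
    return d2.argmin(axis=1)                             # source index per target point
```

Because the map is constrained to be linear in the embedding space, it can be computed with a differentiable least-squares solve, which is what allows both the embedding network and the descriptor network to be trained end to end against a correspondence loss.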