Nonrigid point set registration with robust transformation learning under manifold regularization

Jiayi Ma, Jia Wu, Ji Zhao, Junjun Jiang, Huabing Zhou, Quan Z. Sheng

Research output: Contribution to journal › Article › peer-review

113 Citations (Scopus)


This paper solves the problem of nonrigid point set registration by designing a robust transformation learning scheme. The principle is to iteratively establish point correspondences and learn the nonrigid transformation between two given sets of points. In particular, local feature descriptors are used to search for correspondences, which inevitably introduces some unknown outliers. To precisely learn the underlying transformation from noisy correspondences, we cast point set registration as a semisupervised learning problem, where a set of indicator variables is adopted to help distinguish outliers in a mixture model. To exploit the intrinsic structure of a point set, we constrain the transformation with manifold regularization, which plays the role of prior knowledge. Moreover, the transformation is modeled in a reproducing kernel Hilbert space, and a sparsity-induced approximation is utilized to boost efficiency. We apply the proposed method to learning motion flows between image pairs of similar scenes for visual homing, which is a specific type of mobile robot navigation. Extensive experiments on several publicly available data sets reveal the superiority of the proposed method over state-of-the-art competitors, particularly in the presence of degenerate data.
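The alternation the abstract describes — weighting correspondences with a mixture model (inliers vs. a uniform outlier component) and refitting an RKHS transformation under a manifold-regularization penalty — can be illustrated with a minimal numpy sketch. Everything here is an assumption for illustration: a Gaussian kernel, a kNN graph Laplacian for the manifold term, a diagonal posterior weight matrix, and hand-picked parameter values. It is not the authors' implementation, and it omits the sparsity-induced approximation the paper uses for efficiency.

```python
import numpy as np

def gaussian_kernel(A, B, beta=2.0):
    """Gaussian RKHS kernel matrix between point sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * beta ** 2))

def knn_laplacian(X, k=5):
    """Unnormalized graph Laplacian of a symmetrized kNN graph on X,
    used here as a stand-in for the manifold-regularization term."""
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        W[i, np.argsort(d2[i])[1:k + 1]] = 1.0
    W = np.maximum(W, W.T)
    return np.diag(W.sum(1)) - W

def register(X, Y, lam=0.1, gamma=0.1, iters=20, outlier_w=0.1):
    """EM-style alternation: re-estimate inlier posteriors p, then solve a
    regularized least-squares problem for the RKHS coefficients C of the
    displacement field f(x) = x + K(x, X) C."""
    n, d = X.shape
    K = gaussian_kernel(X, X)
    L = knn_laplacian(X)
    p = np.full(n, 0.9)            # initial inlier posteriors
    C = np.zeros((n, d))
    for _ in range(iters):
        # M-step: (P K + lam I + gamma L K) C = P (Y - X)
        P = np.diag(p)
        C = np.linalg.solve(P @ K + lam * np.eye(n) + gamma * (L @ K),
                            P @ (Y - X))
        TX = X + K @ C
        # E-step: Gaussian inlier component vs. constant outlier weight
        r2 = ((Y - TX) ** 2).sum(1)
        sigma2 = max((p * r2).sum() / (p.sum() * d), 1e-8)
        inlier = np.exp(-r2 / (2 * sigma2)) / (2 * np.pi * sigma2) ** (d / 2)
        p = inlier / (inlier + outlier_w)
    return lambda Z: Z + gaussian_kernel(Z, X) @ C, p
```

The closed-form M-step follows from setting the gradient of the weighted data term plus the RKHS norm (`lam`) and Laplacian smoothness (`gamma`) penalties to zero; the posteriors `p` are the indicator-style variables that downweight outlying correspondences in the next fit.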

Original language: English
Pages (from-to): 3584-3597
Number of pages: 14
Journal: IEEE Transactions on Neural Networks and Learning Systems
Issue number: 12
Early online date: 26 Oct 2018
Publication status: Published - Dec 2019


  • Manifold regularization
  • nonrigid
  • point set registration
  • robust estimation
  • visual homing

