Abstract:
We propose an algorithm that performs registration of large sets of unstructured point clouds of moving and deforming objects without computing correspondences. Given as input a set of frames with dense spatial and temporal sampling, such as the raw output of a fast scanner, our algorithm exploits the underlying temporal coherence in the data to directly compute the motion of the scanned object and bring all frames into a common coordinate system. In contrast to existing methods, which usually perform pairwise alignments between consecutive frames, our algorithm computes a globally consistent motion spanning multiple frames. We add a time coordinate to all the input points based on the ordering of the respective frames and pose the problem of computing the motion of each frame as an estimation of certain kinematic properties of the resulting spacetime surface. By performing this estimation for each frame as a whole, we are able to compute rigid interframe motions, and by adapting our method to perform a local analysis of the spacetime surface, we extend the basic algorithm to handle registration of deformable objects as well. We demonstrate the performance of our algorithm on a number of synthetic and scanned examples, each consisting of hundreds of scans.
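To make the spacetime idea concrete, here is a minimal sketch of one way the kinematic estimation could look. The abstract does not spell out the per-point equations, so the specific constraint used below — that a surface point moving with instantaneous velocity v must satisfy v·n + n_t = 0, where n is the spatial part of the spacetime surface normal and n_t its temporal component — and all function and variable names are assumptions for illustration. Restricting v to a rigid velocity field v(x) = c̄ × x + c turns each point into one linear constraint on the six unknowns (c̄, c):

```python
import numpy as np

def estimate_rigid_velocity(points, normals, n_t):
    """Least-squares estimate of an instantaneous rigid velocity field.

    Each spacetime surface sample contributes one linear constraint
        (cbar x x_i + c) . n_i + n_t_i = 0,
    i.e.  cbar . (x_i x n_i) + c . n_i = -n_t_i,
    on the rotational part cbar and translational part c of the
    velocity field v(x) = cbar x x + c.

    points : (N, 3) spatial positions x_i
    normals: (N, 3) spatial components n_i of the spacetime normals
    n_t    : (N,)   temporal components of the spacetime normals
    """
    # Assemble the N x 6 system [x_i x n_i | n_i] [cbar; c] = -n_t_i.
    A = np.hstack([np.cross(points, normals), normals])
    b = -n_t
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:3], sol[3:]  # cbar (angular velocity), c (translation)
```

With dense sampling, integrating the estimated velocity over the small interframe time step yields the rigid motion that aligns one frame to the next; accumulating these over all frames gives a motion spanning the whole sequence, which is the global consistency the abstract refers to.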
Bibtex:
@incollection{mitra2007dgr,
  author    = "N. J. Mitra and S. Fl{\"o}ry and M. Ovsjanikov and N. Gelfand and L. Guibas and H. Pottmann",
  title     = "Dynamic Geometry Registration",
  booktitle = "Symposium on Geometry Processing",
  year      = 2007,
  pages     = "173--182",
  editor    = "A. Belyaev and M. Garland",
  publisher = "Eurographics",
}

