35-Issue 6
Browsing 35-Issue 6 by Subject "[Computer Graphics]: Shape modelling—Point‐based models"
Item: Detection of Geometric Temporal Changes in Point Clouds (Copyright © 2016 The Eurographics Association and John Wiley & Sons Ltd., 2016)
Authors: Palma, Gianpaolo; Cignoni, Paolo; Boubekeur, Tamy; Scopigno, Roberto
Editors: Chen, Min; Zhang, Hao (Richard)
Abstract: Detecting geometric changes between two 3D captures of the same location performed at different moments is a critical operation for all systems requiring a precise segmentation between change and no‐change regions. Such application scenarios include 3D surface reconstruction, environment monitoring, natural events management and forensic science. Unfortunately, typical 3D scanning setups cannot provide any one‐to‐one mapping between measured samples in static regions: in particular, both extrinsic and intrinsic sensor parameters may vary over time while sensor noise and outliers additionally corrupt the data. In this paper, we adopt a multi‐scale approach to robustly tackle these issues. Starting from two point clouds, we first remove outliers using a probabilistic operator. Then, we detect the actual change using the implicit surface defined by the point clouds under a Growing Least Square reconstruction that, compared to the classical proximity measure, offers a more robust change/no‐change characterization near the temporal intersection of the scans and in the areas exhibiting different sampling density and direction. The resulting classification is enhanced with a spatial reasoning step to solve critical geometric configurations that are common in man‐made environments. We validate our approach on a synthetic test case and on a collection of real data sets acquired using commodity hardware. Finally, we show how 3D reconstruction benefits from the resulting precise change/no‐change segmentation.
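The core idea described in the abstract — comparing each point of one scan against the implicit surface of the other at several scales, and flagging it as "changed" only when it stays far from that surface at every scale — can be sketched as follows. This is a minimal illustration, not the paper's method: the Growing Least Squares implicit surface is replaced here by a simple local plane fit (an assumed simplification), and the function names, thresholds, and scales are hypothetical.

```python
import numpy as np

def local_surface_distance(p, cloud, radius):
    """Distance from point p to a plane fitted to the neighbours of p in
    `cloud` within `radius`. A simplified stand-in for an implicit-surface
    distance (the paper uses a Growing Least Squares reconstruction)."""
    d = np.linalg.norm(cloud - p, axis=1)
    nbrs = cloud[d < radius]
    if len(nbrs) < 3:
        return np.inf  # no local support at this scale: treat as far away
    centroid = nbrs.mean(axis=0)
    # Plane normal = singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(nbrs - centroid)
    normal = vt[-1]
    return abs(np.dot(p - centroid, normal))

def detect_changes(cloud_a, cloud_b, scales=(0.5, 1.0, 2.0), tol=0.2):
    """Flag a point of cloud_a as changed only if it lies farther than
    `tol` from the surface implied by cloud_b at *every* scale."""
    changed = []
    for p in cloud_a:
        dists = [local_surface_distance(p, cloud_b, r) for r in scales]
        changed.append(all(dd > tol for dd in dists))
    return np.array(changed)

# Synthetic example: a flat grid, with one point displaced in the second scan.
xs = np.arange(0.0, 2.01, 0.25)
X, Y = np.meshgrid(xs, xs)
cloud_b = np.stack([X.ravel(), Y.ravel(), np.zeros(X.size)], axis=1)
cloud_a = cloud_b.copy()
cloud_a[0, 2] = 1.0  # raise one corner point: a geometric change
mask = detect_changes(cloud_a, cloud_b)
# Only the displaced point is classified as changed.
```

Testing at multiple scales is what gives robustness to varying sampling density: a point with too few neighbours at a fine scale is not immediately declared changed unless coarser scales agree.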