IET Image Processing (Jul 2023)

Robust registration of aerial and close‐range photogrammetric point clouds using visual context features and scale consistency

  • Guanghan Chu,
  • Dazhao Fan,
  • Yang Dong,
  • Song Ji,
  • Linyu Gu,
  • Dongzi Li,
  • Wu Zhang

DOI: https://doi.org/10.1049/ipr2.12821
Journal volume & issue: Vol. 17, No. 9, pp. 2698–2709

Abstract


Point cloud registration is of great significance for reconstructing high‐precision 3D city models. Aligning aerial and close‐range photogrammetric point clouds poses several challenges: large viewpoint differences caused by the different lines of sight of the sensors, massive numbers of noisy points due to dense‐matching errors, and scale uncertainty since no control points are available for absolute orientation. To exploit the complementary advantages of aerial and close‐range point clouds, this paper proposes a robust cross‐source point cloud registration method based on image visual context features and scale consistency. First, a cross‐view image matching method based on image visual context features is proposed to obtain corresponding points. Second, to overcome noisy points and scale differences, an outlier filtering method is designed based on scale consistency. Finally, a dual quaternion model that incorporates a scale factor is introduced to solve the spatial transformation. The feasibility of the method is analysed qualitatively and quantitatively in experiments on the public scene dataset of Dortmund, Germany, and a scene dataset of Zhengzhou City, China. Three cross‐source point cloud registration experiments are conducted: aerial and close‐range point cloud registration in Dortmund, and aerial and ground point cloud registration in both Dortmund and Zhengzhou. The chamfer distances of the three experiments are 4.48 m, 5.97 m and 4.78 m, respectively. An ablation study shows that the outlier filtering method and the dual quaternion model improve accuracy by at least 14% and 25%, respectively. The experiments demonstrate that the method accomplishes cross‐source point cloud registration in large‐scale scenes accurately and efficiently, providing a solid foundation for subsequent fine 3D reconstruction.
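The scale-consistency idea behind the outlier filter can be illustrated as follows: for correct correspondences, the ratio between distances of point pairs in the target cloud and the same pairs in the source cloud should equal the single (unknown) global scale. A minimal sketch of such a filter (the function name, median-consensus voting scheme, and tolerance are assumptions for illustration; the paper's exact criterion is not given in the abstract):

```python
import numpy as np

def filter_by_scale_consistency(src, dst, rel_tol=0.1):
    """Keep correspondences whose implied scale matches the consensus.

    src, dst: (N, 3) arrays of matched points (src[i] <-> dst[i]).
    For sampled index pairs (i, j), |dst_i - dst_j| / |src_i - src_j|
    estimates the global scale; correspondences that repeatedly imply
    a ratio far from the median are rejected as outliers.
    """
    n = len(src)
    rng = np.random.default_rng(0)
    votes, counts = np.zeros(n), np.zeros(n)
    # Sample random index pairs (dropping degenerate i == j pairs)
    pairs = rng.integers(0, n, size=(4 * n, 2))
    pairs = pairs[pairs[:, 0] != pairs[:, 1]]
    ds = np.linalg.norm(src[pairs[:, 0]] - src[pairs[:, 1]], axis=1)
    dd = np.linalg.norm(dst[pairs[:, 0]] - dst[pairs[:, 1]], axis=1)
    valid = ds > 1e-9
    ratios = dd[valid] / ds[valid]
    s = np.median(ratios)                       # consensus scale estimate
    ok = np.abs(ratios - s) <= rel_tol * s      # per-pair consistency check
    for (i, j), good in zip(pairs[valid], ok):
        votes[i] += good; votes[j] += good
        counts[i] += 1; counts[j] += 1
    keep = votes >= 0.5 * np.maximum(counts, 1)  # majority of a point's pairs agree
    return keep, s
```

A correspondence contaminated by a gross matching error participates in distance ratios far from the consensus scale, so it loses the majority vote and is discarded before the transformation is solved.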
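The spatial transformation to be solved is a 3D similarity (scale, rotation, translation). The paper solves it with a scale-aware dual quaternion model, which this sketch does not reproduce; shown instead is the standard closed-form Umeyama solution, as a reference for what the estimator must recover:

```python
import numpy as np

def similarity_transform(src, dst):
    """Least-squares similarity transform: dst ≈ s * R @ src + t.

    src, dst: (N, 3) arrays of corresponding (inlier) points.
    Closed-form Umeyama solution via SVD of the cross-covariance.
    Returns scale s, rotation R (3x3, det = +1), translation t (3,).
    """
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    xs, xd = src - mu_s, dst - mu_d
    cov = xd.T @ xs / len(src)                      # cross-covariance matrix
    U, D, Vt = np.linalg.svd(cov)
    # Reflection guard: flip the smallest singular direction if needed
    sgn = np.sign(np.linalg.det(U) * np.linalg.det(Vt))
    S = np.array([1.0, 1.0, sgn])
    R = U @ np.diag(S) @ Vt
    s = (D * S).sum() / xs.var(axis=0).sum()        # optimal uniform scale
    t = mu_d - s * R @ mu_s
    return s, R, t
```

Because the scale factor is estimated jointly with the rotation and translation, no prior absolute orientation of either cloud is required, which matches the scale-uncertainty setting described above.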
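The chamfer distance used to report accuracy is a standard point-cloud metric; a minimal sketch (the symmetric-averaging convention and brute-force nearest-neighbour search here are assumptions, as the abstract does not state the exact variant):

```python
import numpy as np

def chamfer_distance(a, b):
    """Symmetric chamfer distance between two point clouds.

    a: (N, 3) array, b: (M, 3) array. For each point in one cloud,
    take the distance to its nearest neighbour in the other cloud;
    average both directions. Brute-force O(N*M) pairwise distances;
    a real pipeline would use a k-d tree for large clouds.
    """
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

# Toy example: two small clouds offset by 1 m along x
a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
b = a + np.array([1.0, 0.0, 0.0])
print(chamfer_distance(a, b))  # → 0.5
```

A lower chamfer distance after registration indicates that the aligned clouds overlap more closely, which is how the 4.48 m, 5.97 m and 4.78 m figures above should be read.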

Keywords