Remote Sensing (Aug 2024)
Aerial Hybrid Adjustment of LiDAR Point Clouds, Frame Images, and Linear Pushbroom Images
Abstract
In airborne surveying, light detection and ranging (LiDAR) strip adjustment and image bundle adjustment are customarily performed as separate processes. Bundle adjustment is usually conducted with frame images, whereas using linear pushbroom (LP) images in bundle adjustment has historically been challenging due to the limited number of observations available to estimate the exterior image orientations. However, data from these three sensors conceptually provide information to estimate the same trajectory corrections, which is favorable for solving problems such as image depth estimation or the planimetric correction of LiDAR point clouds. The purpose of the present study is therefore to jointly estimate corrections to the trajectory and interior sensor states in a scalable hybrid adjustment between 3D LiDAR point clouds, 2D frame images, and 1D LP images. Trajectory preprocessing is performed before low-frequency corrections are estimated at discrete time steps in the subsequent adjustment using cubic spline interpolation. Furthermore, voxelization of the LiDAR data is used to robustly and efficiently form both LiDAR observations and hybrid observations between the image tie-points and the LiDAR point cloud for use in the adjustment. The method is demonstrated with an experiment showing the joint adjustment of data from the three different sensors using the same trajectory correction model with spline interpolation of the trajectory corrections. The results show that the choice of the trajectory segmentation time step is not critical. Furthermore, photogrammetric sub-pixel planimetric accuracy is achieved, and height accuracy on the order of millimeters is achieved for the LiDAR point cloud. This is the first time these three types of sensors, with fundamentally different acquisition techniques, have been integrated.
The suggested methodology presents a joint adjustment of all sensor observations and lays the foundation for including additional sensors for kinematic mapping in the future.
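The abstract describes estimating low-frequency trajectory corrections at discrete time steps and interpolating them with cubic splines so that every sensor observation, regardless of its timestamp, can be tied to the same correction model. The following is a minimal sketch of that interpolation idea only, not the paper's implementation: the anchor epochs, the 5 s segmentation step, and the correction values are all invented for illustration.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical anchor epochs (s) spaced by a chosen segmentation time step,
# and low-frequency trajectory corrections (m) estimated at those epochs.
# All numeric values here are illustrative, not from the paper.
dt = 5.0                                   # segmentation time step (assumed)
anchor_t = np.arange(0.0, 30.0 + dt, dt)   # 7 anchor epochs
corrections = np.column_stack([            # per-axis corrections (X, Y, Z)
    0.01 * np.sin(0.1 * anchor_t),
    0.02 * np.cos(0.1 * anchor_t),
    0.005 * anchor_t / anchor_t[-1],
])

# One cubic spline per axis, shared by all sensors: any observation
# timestamp can be mapped to a smooth trajectory correction.
spline = CubicSpline(anchor_t, corrections, axis=0)

# Evaluate the correction at arbitrary observation timestamps, e.g. a
# LiDAR return, a frame image exposure, and an LP scan line.
obs_t = np.array([2.5, 12.3, 27.8])
corr_at_obs = spline(obs_t)  # shape (3, 3): one XYZ correction per epoch
```

Because the spline passes exactly through the estimated corrections at the anchor epochs, observations from all three sensors see a single continuous correction model, which is the property that makes the joint adjustment possible.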
Keywords