Remote Sensing (Nov 2021)

Detection of Outliers in LiDAR Data Acquired by Multiple Platforms over Sorghum and Maize

  • Behrokh Nazeri,
  • Melba Crawford

DOI
https://doi.org/10.3390/rs13214445
Journal volume & issue
Vol. 13, no. 21
p. 4445

Abstract


High-resolution point cloud data acquired with a laser scanner from any platform contain random noise and outliers. Therefore, outlier detection in LiDAR data is often necessary prior to analysis. Applications in agriculture are particularly challenging, as there is typically no prior knowledge of the statistical distribution of the points, and plant complexity and local point densities are crop-dependent. The goals of this study were first to investigate approaches to minimize the impact of outliers on LiDAR data acquired over agricultural row crops, specifically sorghum and maize breeding experiments, by an unmanned aerial vehicle (UAV) and a wheel-based ground platform; and second, to evaluate the impact of existing outliers in the datasets on leaf area index (LAI) prediction using LiDAR data. Two methods were investigated to detect and remove the outliers from the plant datasets. The first was based on surface fitting to noisy point cloud data via normal and curvature estimation in a local neighborhood. The second utilized the PointCleanNet deep learning framework. Both methods were applied to individual plants and field-based datasets. To evaluate the methods, an F-score was calculated for synthetic data under controlled conditions, and LAI, the variable being predicted, was computed both before and after outlier removal for both scenarios. Results indicate that the deep learning method for outlier detection is more robust than the geometric approach to changes in point densities, noise levels, and shapes. The prediction of LAI was also improved for the wheel-based vehicle data, based on the coefficient of determination (R²) and the root mean squared error (RMSE) of the residuals before and after the removal of outliers.
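The abstract's first approach, surface fitting via normal and curvature estimation in a local neighborhood, can be illustrated with a PCA-based surface-variation score computed over k-nearest-neighbor neighborhoods. The sketch below is a minimal, generic version of that idea, not the authors' implementation: the function name, neighborhood size k, and threshold are illustrative assumptions, and it only requires NumPy and SciPy.

```python
# Minimal sketch of geometric outlier detection via local surface-variation
# (curvature) estimation. Assumes a point cloud as an (N, 3) NumPy array.
# Neighborhood size k and the threshold are illustrative, not the paper's values.
import numpy as np
from scipy.spatial import cKDTree

def curvature_outliers(points: np.ndarray, k: int = 20, threshold: float = 0.1) -> np.ndarray:
    """Return a boolean mask marking points with high local surface variation."""
    tree = cKDTree(points)
    # k + 1 neighbors because each point is returned as its own nearest neighbor
    _, idx = tree.query(points, k=k + 1)
    surface_variation = np.empty(len(points))
    for i, neighbors in enumerate(idx):
        nbrs = points[neighbors]
        nbrs = nbrs - nbrs.mean(axis=0)        # center the local neighborhood
        cov = nbrs.T @ nbrs / len(nbrs)        # 3x3 local covariance matrix
        eigvals = np.linalg.eigvalsh(cov)      # eigenvalues in ascending order
        # Surface variation: smallest eigenvalue relative to total variance.
        # Near zero for smooth, locally planar neighborhoods; larger for
        # isolated or noisy points that do not lie on the local surface.
        surface_variation[i] = eigvals[0] / eigvals.sum()
    return surface_variation > threshold

# Example on synthetic data: a nearly flat plane with a few injected outliers.
rng = np.random.default_rng(0)
plane = np.column_stack([rng.uniform(0, 1, (500, 2)), rng.normal(0, 0.002, 500)])
outliers = rng.uniform(0, 1, (10, 3))
cloud = np.vstack([plane, outliers])
mask = curvature_outliers(cloud)
clean_cloud = cloud[~mask]
```

In this kind of scheme, points whose neighborhoods do not resemble a locally smooth surface are flagged and removed before downstream analysis such as LAI prediction; the choice of neighborhood size and threshold would need tuning to the crop and the platform's point density.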

Keywords