IEEE Access (Jan 2021)

Self-Corrective Sensor Fusion for Drone Positioning in Indoor Facilities

  • Francisco Javier González-Castaño,
  • Felipe Gil-Castiñeira,
  • David Rodríguez-Pereira,
  • José Ángel Regueiro-Janeiro,
  • Silvia García-Méndez,
  • David Candal-Ventureira

DOI
https://doi.org/10.1109/ACCESS.2020.3048194
Journal volume & issue
Vol. 9
pp. 2415–2427

Abstract

Drones may be more advantageous than fixed cameras for quality control applications in industrial facilities, since they can be redeployed dynamically and adjusted to production planning. The practical scenario that motivated this paper, image acquisition with drones in a car manufacturing plant, requires drone positioning accuracy on the order of 5 cm. During repetitive manufacturing processes, quality control imaging drones are assumed to follow highly deterministic periodic paths, stopping at predefined points to take images and send them to image recognition servers. Therefore, by relying on prior knowledge of production chain schedules, it is possible to optimize the positioning technologies so that the drones stay at all times within the boundaries of their flight plans, which are composed of stopping points and the paths between them. This involves mitigating issues such as temporary blocking of the line of sight between the drone and any existing radio beacons, sensor data noise, and the loss of visual references. We present a self-corrective solution for this purpose. It corrects visual odometer readings based on filtered and clustered Ultra-Wide Band (UWB) data, as an alternative to direct Kalman fusion. The approach combines the advantages of both technologies whenever at least one of them works properly at a given measurement spot. It has three components: independent Kalman filtering, data association by means of stream clustering, and mutual correction of sensor readings through the generation of cumulative correction vectors. The approach is inspired by the observation that UWB positioning works reasonably well at static spots, whereas visual odometer measurements reflect straight displacements correctly but can underestimate their length. Our experimental results demonstrate the advantages of the approach over Kalman fusion in the application scenario, in terms of stopping point detection and trajectory estimation error.
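The correction loop outlined in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the 2D constant-position Kalman model, the distance-threshold stream clusterer, all thresholds, and all names (KalmanFilter2D, StreamClusterer, fuse) are introduced here for exposition only.

    # Minimal sketch of the self-corrective fusion idea: filter UWB fixes,
    # detect static stopping points by stream clustering, and re-anchor
    # visual odometry with a cumulative correction vector at each stop.
    import numpy as np

    class KalmanFilter2D:
        """Constant-position Kalman filter applied independently to UWB fixes."""
        def __init__(self, q=0.01, r=0.25):
            self.x = None          # state: estimated (x, y) position
            self.p = 1.0           # scalar covariance shared by both axes
            self.q, self.r = q, r  # process / measurement noise (placeholders)

        def update(self, z):
            z = np.asarray(z, dtype=float)
            if self.x is None:
                self.x = z
                return self.x
            self.p += self.q                  # predict: position assumed constant
            k = self.p / (self.p + self.r)    # Kalman gain
            self.x = self.x + k * (z - self.x)
            self.p *= (1.0 - k)
            return self.x

    class StreamClusterer:
        """Groups consecutive filtered UWB fixes; a dense cluster indicates
        the drone is hovering at a stopping point."""
        def __init__(self, radius=0.05, min_points=10):
            self.radius, self.min_points = radius, min_points
            self.points = []

        def add(self, p):
            """Return the cluster centroid once enough nearby fixes have
            accumulated, else None."""
            p = np.asarray(p, dtype=float)
            if self.points and np.linalg.norm(
                    p - np.mean(self.points, axis=0)) > self.radius:
                self.points = []              # fix left the cluster: drone moved
            self.points.append(p)
            if len(self.points) >= self.min_points:
                return np.mean(self.points, axis=0)
            return None

    def fuse(uwb_stream, odo_stream):
        """Correct visual-odometry positions with clustered UWB fixes.
        At each detected stopping point, the offset between the UWB cluster
        centroid and the odometry estimate is accumulated into a correction
        vector applied to all subsequent odometry readings."""
        kf, clusterer = KalmanFilter2D(), StreamClusterer()
        correction = np.zeros(2)              # cumulative correction vector
        corrected = []
        for uwb, odo in zip(uwb_stream, odo_stream):
            filtered = kf.update(uwb)
            centroid = clusterer.add(filtered)
            est = np.asarray(odo, dtype=float) + correction
            if centroid is not None:
                # Static spot detected: trust UWB, re-anchor the odometer.
                correction += centroid - est
                est = centroid
            corrected.append(est)
        return corrected

A real deployment would gate the correction (for instance, apply it once per detected stopping point) and tune radius and min_points to the UWB noise and hover behavior; the values above are placeholders.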

Keywords