Drones (Feb 2023)

Fast and High-Quality Monocular Depth Estimation with Optical Flow for Autonomous Drones

  • Tomoyasu Shimada,
  • Hiroki Nishikawa,
  • Xiangbo Kong,
  • Hiroyuki Tomiyama

DOI
https://doi.org/10.3390/drones7020134
Journal volume & issue
Vol. 7, no. 2
p. 134

Abstract

In recent years, autonomous drones have attracted attention in many fields due to their convenience. Autonomous drones require precise depth information to avoid collisions while flying fast, and both RGB images and LiDAR point clouds are often fed to Convolutional Neural Networks (CNNs) to estimate the distance to obstacles. Such applications are implemented on onboard embedded systems. To estimate depth precisely, these CNN models generally extract many features and thus become so complex that the computational cost rises, leading to long inference times. To address this issue, we employ optical flow to aid depth estimation. In addition, we propose a new attention structure that makes maximum use of optical flow without complicating the network. Furthermore, we improve performance without modifying the depth estimator by adding a perceptual discriminator during training. The proposed model is evaluated in terms of accuracy, error, and inference time on the KITTI dataset. In the experiments, we demonstrate that the proposed method achieves up to 34% higher accuracy, 55% lower error, and 66% faster inference on a Jetson Nano compared to previous methods. The proposed method is also evaluated on a collision-avoidance task in simulated drone flight and achieves the lowest collision rate of all estimation methods. These experimental results show the potential of the proposed method for real-world autonomous drone flight applications.
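The abstract's idea of using optical flow to guide depth estimation can be illustrated with a minimal NumPy sketch. This is only a hypothetical illustration, not the paper's actual attention structure: it assumes a simple sigmoid-normalized flow-magnitude map that modulates CNN feature maps (under forward camera motion, larger apparent motion loosely corresponds to nearer obstacles).

```python
import numpy as np

def flow_attention(features, flow):
    """Hypothetical sketch: weight feature maps by an attention map
    derived from optical-flow magnitude (not the paper's exact design)."""
    # features: (C, H, W) activations; flow: (H, W, 2) per-pixel (dx, dy)
    mag = np.linalg.norm(flow, axis=-1)                # (H, W) flow magnitude
    att = 1.0 / (1.0 + np.exp(-(mag - mag.mean())))    # sigmoid map in (0, 1)
    return features * att[None, :, :]                  # broadcast over channels

rng = np.random.default_rng(0)
feats = rng.standard_normal((8, 4, 4))   # toy feature maps
flow = rng.standard_normal((4, 4, 2))    # toy optical flow
out = flow_attention(feats, flow)
```

Because the attention weights lie strictly in (0, 1), the modulation rescales activations without changing tensor shapes, so such a block could in principle be inserted into an existing depth network without altering its architecture.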

Keywords