IEEE Access (Jan 2020)

2D-Key-Points-Localization-Driven 3D Aircraft Pose Estimation

  • Yibo Li,
  • Ruixing Yu,
  • Bing Zhu

DOI
https://doi.org/10.1109/ACCESS.2020.3026209
Journal volume & issue
Vol. 8
pp. 181293–181301

Abstract

In this paper, we are interested in inferring the 3D pose of aircraft by leveraging 2D key-point localization. Monocular-vision-based pose estimation for aircraft can be widely utilized in airspace tasks such as flight control systems, air traffic management, autonomous navigation, and air defense systems. Nonetheless, prior methods that directly regress or classify the pose cannot meet the high-precision requirements of aircraft pose estimation, while other approaches rely on PnP algorithms that need additional information, such as a template 3D model or depth maps, as prior knowledge. These methods do not fully exploit the correlation between 2D key-points and 3D pose. In this paper, we present a multi-branch convolutional neural network, named the AirPose network, to address 3D pose estimation based on 2D key-point information. In addition, a novel feature fusion method is explored to enable the orientation estimation branch to adequately exploit key-point information. Our feature fusion method significantly decreases the 3D pose estimation error and avoids the involvement of RANSAC-based PnP algorithms. To address the lack of a dedicated aircraft 3D pose dataset for training and testing, we build a visual simulation platform on Unreal Engine 4, named the AKO platform, which applies domain randomization (DR) to generate aircraft images automatically labeled with 3D orientation and key-point locations; the resulting dataset is called the AKO dataset. We conduct a series of ablation experiments to evaluate our framework for aircraft object detection, key-point localization, and orientation estimation on the AKO dataset. Experiments show that our proposed AirPose network, trained on the AKO dataset, achieves convincing results on each of these tasks.
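To make the abstract's multi-branch idea concrete, below is a minimal PyTorch sketch of a network with a shared backbone, a key-point heatmap branch, and an orientation branch that fuses key-point evidence with backbone features before regressing orientation, so no RANSAC-based PnP stage is needed. This is an illustrative assumption of the design described in the abstract: the layer sizes, the number of key-points, the concatenation-based fusion, and the Euler-angle output are all placeholders, not the paper's actual AirPose architecture.

```python
import torch
import torch.nn as nn

class MultiBranchPoseNet(nn.Module):
    """Hypothetical sketch of a multi-branch 2D-key-points-to-3D-pose net."""

    def __init__(self, num_keypoints: int = 8):
        super().__init__()
        # Shared convolutional backbone (stand-in for the paper's CNN).
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Key-point localization branch: one heatmap per key-point.
        self.keypoint_head = nn.Conv2d(64, num_keypoints, 1)
        # Orientation branch: consumes backbone features concatenated with
        # the key-point heatmaps (a simple channel-wise feature fusion).
        self.orientation_head = nn.Sequential(
            nn.Conv2d(64 + num_keypoints, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, 3),  # e.g. yaw, pitch, roll (assumed output)
        )

    def forward(self, x):
        feats = self.backbone(x)
        heatmaps = self.keypoint_head(feats)
        # Fuse key-point evidence into the orientation branch so the
        # orientation estimate can exploit 2D key-point information
        # directly instead of going through a PnP solver.
        fused = torch.cat([feats, heatmaps], dim=1)
        orientation = self.orientation_head(fused)
        return heatmaps, orientation

# Usage on a dummy batch of two 128x128 RGB images:
net = MultiBranchPoseNet()
heatmaps, orientation = net(torch.randn(2, 3, 128, 128))
print(heatmaps.shape, orientation.shape)  # (2, 8, 32, 32) and (2, 3)
```

The design point the sketch illustrates is that the orientation branch sees the key-point heatmaps as extra input channels, which is one simple way to realize the "feature fusion" the abstract describes.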

Keywords