Measurement + Control (Jul 2022)

YOLO based deep learning on needle-type dashboard recognition for autopilot maneuvering system

  • Chia-Ming Chang,
  • Yuan-Dean Liou,
  • Yi-Cheng Huang,
  • Shang-En Shen,
  • Pei-Jou Yu,
  • Ting-Hsueh Chuang,
  • Shean-Juinn Chiou

DOI
https://doi.org/10.1177/00202940221115199
Journal volume & issue
Vol. 55

Abstract

Developing a fully automatic auxiliary flying system with robot maneuvering is feasible. This study develops a control vision system that can read all kinds of needle-type meters. The vision device in this study implements a modified YOLO-based object detection model to recognize airspeed readings from a needle-type dashboard. With this approach, meter information in the cockpit is replaced by a single camera and a powerful edge computer for future autopilot maneuvering purposes. A modified YOLOv4-tiny model, created by adding Spatial Pyramid Pooling (SPP) and a Bidirectional Feature Pyramid Network (BiFPN) to the neck region of the convolutional neural network (CNN) structure, is implemented. The Taguchi method is applied to acquire a set of optimum hyperparameters for the CNN. An improved deep learning network with a higher mean average precision (mAP) than the conventional YOLOv4-tiny and a higher frames-per-second (FPS) value than YOLOv4 is deployed successfully. A self-control system is established in which a camera receives airspeed indications from the designed virtual needle-type dashboard. Moreover, the dashboard’s pointer is controlled by the proposed control method, which combines PID control with recognition of the pointer’s rotation angle. A modified YOLOv4-tiny model with a fabricated system for visual dynamic recognition control is implemented successfully. The feasibility of improving mean average precision and frames per second toward achieving autopilot maneuvering is verified.
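
The abstract describes adding an SPP block to the YOLOv4-tiny neck. The following PyTorch snippet is a minimal sketch of that kind of module, not the authors' code: parallel stride-1 max pools at several kernel sizes are concatenated and fused, so the detector sees multi-scale context from a single feature map. The channel count and kernel sizes are illustrative assumptions.

```python
# Sketch of a YOLO-style Spatial Pyramid Pooling (SPP) block (assumed layout,
# not the paper's implementation).
import torch
import torch.nn as nn


class SPP(nn.Module):
    def __init__(self, channels: int, kernel_sizes=(5, 9, 13)):
        super().__init__()
        # Stride-1 max pools with "same" padding keep the spatial size unchanged.
        self.pools = nn.ModuleList(
            [nn.MaxPool2d(kernel_size=k, stride=1, padding=k // 2) for k in kernel_sizes]
        )
        # A 1x1 convolution fuses the concatenated branches back to `channels`.
        self.fuse = nn.Conv2d(channels * (len(kernel_sizes) + 1), channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        branches = [x] + [pool(x) for pool in self.pools]
        return self.fuse(torch.cat(branches, dim=1))


if __name__ == "__main__":
    feat = torch.randn(1, 256, 13, 13)   # a typical neck-level feature map size
    print(SPP(256)(feat).shape)          # torch.Size([1, 256, 13, 13])
```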
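
The abstract also describes a closed loop in which the recognized pointer rotation angle feeds a PID controller that drives the virtual dashboard pointer. The sketch below is an assumption of how such a loop could be wired, not the paper's code: the angle-to-airspeed mapping, dial range, and gains are hypothetical placeholders.

```python
# Sketch of the recognition-to-control loop: recognized needle angle -> airspeed
# reading -> PID drive command for the virtual pointer (all values illustrative).
class PID:
    """Textbook discrete PID controller."""

    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error: float, dt: float) -> float:
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def angle_to_airspeed(angle_deg: float, full_scale_deg: float = 320.0,
                      max_speed_knots: float = 160.0) -> float:
    """Map a recognized pointer angle to an airspeed reading (hypothetical linear dial)."""
    return max_speed_knots * angle_deg / full_scale_deg


if __name__ == "__main__":
    pid = PID(kp=0.8, ki=0.1, kd=0.05)   # illustrative gains, not taken from the paper
    target_knots = 120.0
    recognized_angle = 150.0             # e.g. the needle angle returned by the detector
    measured_knots = angle_to_airspeed(recognized_angle)
    command = pid.update(target_knots - measured_knots, dt=0.05)
    print(f"measured {measured_knots:.1f} kn, pointer drive command {command:.2f}")
```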