IEEE Access (Jan 2024)

An Improved Method for Extracting Inter-Row Navigation Lines in Nighttime Maize Crops Using YOLOv7-Tiny

  • Hailiang Gong,
  • Weidong Zhuang

DOI
https://doi.org/10.1109/ACCESS.2024.3365555
Journal volume & issue
Vol. 12
pp. 27444–27455

Abstract

In response to insufficient nighttime illumination during mechanical weeding of maize crops, this study proposes an improved YOLOv7-tiny network model for infrared image object detection. The model incorporates the ShuffleNet v1 network to reduce computational complexity, enhance image feature extraction, and obtain more comprehensive semantic information. In addition, a Coordinate Attention (CA) module is integrated into the neck network to improve sample detection performance, and the EIOU loss function replaces the original loss function, yielding faster convergence and improved localization accuracy. The improved YOLOv7-tiny model is used to detect maize seedlings, with the center point of each detection box serving as a navigation reference point. The least squares method is then used to fit the maize rows on both sides, and the inter-row navigation line is obtained midway between the two fitted rows. Experimental results show that the improved YOLOv7-tiny model achieves a detection accuracy of 94.21% and a detection speed of 32.4 frames per second, enabling accurate identification of maize seedlings at night. The average error between the extracted positioning reference points and the manually labeled midpoints of the maize seedlings is 4.85 cm, meeting the navigation requirements of maize crop rows and supporting deployment on mobile terminal devices.
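The row-fitting step described in the abstract (detection-box centers, least-squares fits to the left and right rows, midline between them) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the detection-box centers are available as (x, y) pixel coordinates already grouped into left and right rows, and it fits x as a function of y because crop rows run roughly vertically in a forward-facing camera image; the function name and point grouping are hypothetical.

```python
import numpy as np

def fit_navigation_line(left_centers, right_centers):
    """Fit each maize row with least squares and return the inter-row
    navigation line as (slope, intercept) of x = slope * y + intercept.

    left_centers / right_centers: iterables of (x, y) detection-box
    center coordinates for the left and right crop rows (hypothetical
    grouping; the paper's exact procedure may differ).
    """
    lx, ly = np.asarray(left_centers, dtype=float).T
    rx, ry = np.asarray(right_centers, dtype=float).T

    # np.polyfit with deg=1 performs a least-squares line fit and
    # returns (slope, intercept). Rows are near-vertical in the image,
    # so x is modeled as a function of y.
    m_left, b_left = np.polyfit(ly, lx, 1)
    m_right, b_right = np.polyfit(ry, rx, 1)

    # The navigation line lies midway between the two fitted row lines.
    return (m_left + m_right) / 2.0, (b_left + b_right) / 2.0

# Example: two parallel vertical rows at x = 100 and x = 300 pixels;
# the midline should sit at x = 200 for every y.
left = [(100, 10), (100, 50), (100, 90)]
right = [(300, 10), (300, 50), (300, 90)]
slope, intercept = fit_navigation_line(left, right)
```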

Keywords