IEEE Access (Jan 2019)

Window Zooming–Based Localization Algorithm of Fruit and Vegetable for Harvesting Robot

  • Chenglin Wang,
  • Tianhong Luo,
  • Lijun Zhao,
  • Yunchao Tang,
  • Xiangjun Zou

DOI
https://doi.org/10.1109/ACCESS.2019.2925812
Journal volume & issue
Vol. 7
pp. 103639 – 103649

Abstract

Localization of fruits and vegetables is of great significance to harvesting robots and to the harvesting industry as a whole. However, uncontrollable factors, such as varying illumination, random occlusion, and diverse surface colors and textures, constrain vision-based localization of fruits and vegetables in unstructured environments. Our previous studies developed various methods (illumination normalization, feature-based classification, etc.) to localize a single kind of fruit or vegetable using binocular stereo vision. However, localizing multiple kinds of fruits and vegetables still faces challenges due to these uncontrollable factors. To address this issue, this study proposes an intelligent method for localizing targets in fruit and vegetable images acquired by two charge-coupled device (CCD) color cameras in an unstructured environment. The method uses a Faster region-based convolutional neural network (R-CNN) model to recognize fruits and vegetables. Based on the recognition results, a window zooming method is proposed to match each recognized target between the two views. Finally, localization is completed by calculating the three-dimensional coordinates of the matched target using the triangulation principle. The experimental results show that the proposed method is robust against varying illumination and occlusion: the average recognition accuracy was 96.33% under six different conditions, and 93.44% of 1036 pairs of tested targets under unoccluded and partially occluded conditions were successfully matched. Localization errors showed no significant differences across conditions and remained below 7.5 mm for measuring distances between 300 and 1600 mm under varying illumination and partial occlusion.
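The final triangulation step described above follows the standard rectified-stereo geometry: once a target is matched in the left and right images, its depth is recovered from the pixel disparity, the camera focal length, and the baseline between the two cameras. The following is a minimal sketch of that computation; the function name and the specific camera parameters (focal length, baseline, principal point) are illustrative assumptions, not values from the paper.

```python
def triangulate_stereo(x_left, x_right, y, focal_px, baseline_mm, cx, cy):
    """Recover the 3D position (in mm, camera frame) of a point matched
    in a rectified stereo pair.

    Illustrative sketch only: assumes rectified images, so the match lies
    on the same image row (shared y) in both views.

    x_left, x_right : horizontal pixel coordinates of the match in each view
    y               : shared vertical pixel coordinate
    focal_px        : focal length in pixels (assumed equal for both cameras)
    baseline_mm     : distance between the two camera centers in mm
    cx, cy          : principal point of the (left) camera in pixels
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    # Depth from similar triangles: Z = f * B / d
    z = focal_px * baseline_mm / disparity
    # Back-project the left-image pixel to metric X, Y at that depth
    x = (x_left - cx) * z / focal_px
    y3 = (y - cy) * z / focal_px
    return x, y3, z


# Example with assumed parameters: 800 px focal length, 120 mm baseline,
# principal point at (640, 360). A 96 px disparity yields 1000 mm depth.
X, Y, Z = triangulate_stereo(700, 604, 440, focal_px=800,
                             baseline_mm=120, cx=640, cy=360)
```

Note how the depth error of this scheme grows with distance (a one-pixel disparity error matters more at small disparities), which is consistent with the paper reporting errors over a bounded 300 to 1600 mm working range.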

Keywords