Drones (Feb 2020)

Deep Neural Networks and Transfer Learning for Food Crop Identification in UAV Images

  • Robert Chew,
  • Jay Rineer,
  • Robert Beach,
  • Maggie O'Neil,
  • Noel Ujeneza,
  • Daniel Lapidus,
  • Thomas Miano,
  • Meghan Hegarty-Craver,
  • Jason Polly,
  • Dorota S. Temple

DOI
https://doi.org/10.3390/drones4010007
Journal volume & issue
Vol. 4, No. 1, p. 7

Abstract


Accurate projections of seasonal agricultural output are essential for improving food security. However, the collection of agricultural information through seasonal agricultural surveys is often not timely enough to inform public and private stakeholders about crop status during the growing season. Acquiring timely and accurate crop estimates can be particularly challenging in countries with predominantly smallholder farms because of the large number of small plots, intense intercropping, and high diversity of crop types. In this study, we used RGB images collected from unmanned aerial vehicles (UAVs) flown in Rwanda to develop a deep learning algorithm for identifying crop types, specifically bananas, maize, and legumes, which are key strategic food crops in Rwandan agriculture. The model leverages advances in deep convolutional neural networks and transfer learning, employing the VGG16 architecture and the publicly accessible ImageNet dataset for pretraining. The developed model achieves an overall test-set F1 score of 0.86, with individual class scores ranging from 0.49 (legumes) to 0.96 (bananas). Our findings suggest that although certain staple crops such as bananas and maize can be classified at this scale with high accuracy, crops involved in intercropping (legumes) can be difficult to identify consistently. We discuss the potential use cases for the developed model and recommend directions for future research in this area.
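The per-class F1 scores reported above (0.49 to 0.96) are one-vs-rest harmonic means of precision and recall for each crop type. As a minimal sketch of how such scores are computed, the snippet below uses pure Python with hypothetical tile labels (the actual labels, data, and evaluation code from the study are not shown here):

```python
def per_class_f1(y_true, y_pred, labels):
    """Compute one-vs-rest F1 = 2*P*R / (P + R) for each class label."""
    scores = {}
    for label in labels:
        # Count true positives, false positives, and false negatives
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == label and p == label)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != label and p == label)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == label and p != label)
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        scores[label] = (2 * precision * recall / (precision + recall)
                         if (precision + recall) else 0.0)
    return scores

# Toy example: six image tiles with illustrative (not actual) labels
y_true = ["banana", "banana", "maize", "legume", "maize", "legume"]
y_pred = ["banana", "banana", "maize", "maize", "maize", "legume"]
print(per_class_f1(y_true, y_pred, ["banana", "maize", "legume"]))
# banana scores perfectly; legume is penalized by the missed tile,
# mirroring the paper's finding that intercropped legumes are hardest
```

In practice a library routine such as scikit-learn's `f1_score` with `average=None` would be used, but the arithmetic is the same.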

Keywords