PLoS ONE (Oct 2019)

Aerial-trained deep learning networks for surveying cetaceans from satellite imagery.

  • Alex Borowicz,
  • Hieu Le,
  • Grant Humphries,
  • Georg Nehls,
  • Caroline Höschle,
  • Vladislav Kosarev,
  • Heather J Lynch

DOI: https://doi.org/10.1371/journal.pone.0212532
Journal volume & issue: Vol. 14, no. 10, p. e0212532

Abstract

Most cetacean species are wide-ranging and highly mobile, creating significant challenges for researchers by limiting the scope of data that can be collected and leaving large areas unsurveyed. Aerial surveys have proven an effective way to locate and study cetacean movements but are costly and limited in spatial extent. Here we present a semi-automated pipeline for whale detection from very high-resolution (sub-meter) satellite imagery that makes use of a convolutional neural network (CNN). We trained ResNet and DenseNet CNNs using down-scaled aerial imagery and tested each model on 31 cm-resolution imagery obtained from the WorldView-3 sensor. Satellite imagery was tiled, and the trained algorithms were used to classify whether a tile was likely to contain a whale. Our best model correctly classified 100% of tiles containing whales and 94% of tiles containing only water. All model architectures performed well, with learning rate controlling performance more than architecture. While the resolution of commercially available satellite imagery continues to make whale identification a challenging problem, our approach provides the means to efficiently eliminate areas without whales and, in doing so, greatly accelerate ocean surveys for large cetaceans.
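
The sketch below illustrates, under stated assumptions, the tile-and-classify workflow the abstract describes: an aerial-trained ResNet is applied to fixed-size tiles cut from a satellite scene, and tiles predicted to contain a whale are flagged for human review. It is not the authors' released code; the tile size, weights file name, and two-class output head are illustrative assumptions.

```python
# Minimal sketch of tiling a satellite scene and screening tiles with a
# fine-tuned CNN (ResNet-50 here); "whale" vs. "water" labels, the tile size,
# and the weights file are assumptions, not the paper's published pipeline.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

TILE = 256  # assumed tile edge length in pixels

# Replace the ImageNet classification head with a two-class whale/water head.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, 2)
model.load_state_dict(torch.load("aerial_trained_resnet.pt"))  # hypothetical weights
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def classify_tiles(scene_path: str):
    """Cut a scene into non-overlapping tiles and return the origins of likely whale tiles."""
    scene = Image.open(scene_path).convert("RGB")
    width, height = scene.size
    flagged = []
    with torch.no_grad():
        for top in range(0, height - TILE + 1, TILE):
            for left in range(0, width - TILE + 1, TILE):
                tile = scene.crop((left, top, left + TILE, top + TILE))
                logits = model(preprocess(tile).unsqueeze(0))
                if logits.argmax(dim=1).item() == 1:  # class 1 = "whale" (assumed)
                    flagged.append((left, top))
    return flagged

# Tiles flagged here would go to a human reviewer; water-only tiles,
# the large majority of any scene, are eliminated from manual inspection.
# print(classify_tiles("worldview3_scene.tif"))
```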