Ecology and Evolution (Jan 2018)

A spatiotemporal analysis of acoustic interactions between great reed warblers (Acrocephalus arundinaceus) using microphone arrays and robot audition software HARK

  • Reiji Suzuki,
  • Shiho Matsubayashi,
  • Fumiyuki Saito,
  • Tatsuyoshi Murate,
  • Tomohisa Masuda,
  • Koichi Yamamoto,
  • Ryosuke Kojima,
  • Kazuhiro Nakadai,
  • Hiroshi G. Okuno

DOI
https://doi.org/10.1002/ece3.3645
Journal volume & issue
Vol. 8, no. 1
pp. 812–825

Abstract


Acoustic interactions are important for understanding intra‐ and interspecific communication in songbird communities from the viewpoint of soundscape ecology. It has been suggested that birds may partition sound space to increase communication efficiency, tending to avoid overlapping with other birds when they sing. We are interested in clarifying the dynamics underlying this process as an example of complex systems based on short‐term behavioral plasticity. However, it is difficult to manually collect the spatiotemporal patterns of acoustic events in natural habitats from standard single‐channel recordings of several species singing simultaneously. Our purpose here was to investigate fine‐scale spatiotemporal acoustic interactions of the great reed warbler. We surveyed the spatial and temporal patterns of several vocalizing color‐banded great reed warblers (Acrocephalus arundinaceus) using HARK (Honda Research Institute Japan Audition for Robots with Kyoto University), an open‐source robot audition software, and three new 16‐channel, stand‐alone, water‐resistant microphone arrays, named DACHO, deployed in the birds' habitat. We first show that our system estimated the locations of two color‐banded individuals' song posts with a mean error distance of 5.5 ± 4.5 m from the observed song posts. We then evaluated the temporal localization accuracy of the songs by comparing the durations of localized songs around the song posts with those annotated by human observers, obtaining an average accuracy score of 0.89 for one bird that stayed at one song post. We further found significant temporal overlap avoidance and an asymmetric relationship between the songs of the two singing individuals, using transfer entropy. We believe that our system and analytical approach contribute to a better understanding of fine‐scale acoustic interactions in time and space in bird communities.
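The asymmetric relationship mentioned above was quantified with transfer entropy. As a rough illustration only (not the authors' actual analysis pipeline), the sketch below shows one common way to estimate transfer entropy between two birds' song activity, assuming each bird's output has been binarized into per-bin on/off values (1 = singing, 0 = silent) and using history length 1 with plug-in probability estimates; all function names and the toy sequences are hypothetical.

```python
from collections import Counter
from math import log2

def transfer_entropy(source, target):
    """Plug-in estimate of TE(source -> target) in bits for two equal-length
    binary sequences, using a history of one time bin."""
    # Triples (x_{t+1}, x_t, y_t): next target state, current target state, current source state.
    triples = list(zip(target[1:], target[:-1], source[:-1]))
    n = len(triples)
    p_xyz = Counter(triples)                            # counts of (x_{t+1}, x_t, y_t)
    p_yz = Counter((x, y) for _, x, y in triples)       # counts of (x_t, y_t)
    p_xz = Counter((x1, x) for x1, x, _ in triples)     # counts of (x_{t+1}, x_t)
    p_z = Counter(x for _, x, _ in triples)             # counts of x_t

    te = 0.0
    for (x1, x, y), c in p_xyz.items():
        p_joint = c / n                                 # p(x_{t+1}, x_t, y_t)
        p_cond_full = c / p_yz[(x, y)]                  # p(x_{t+1} | x_t, y_t)
        p_cond_self = p_xz[(x1, x)] / p_z[x]            # p(x_{t+1} | x_t)
        te += p_joint * log2(p_cond_full / p_cond_self)
    return te

# Toy usage: bird B's made-up activity is constructed to switch off one bin
# after bird A sings (overlap avoidance), so TE(A -> B) should exceed TE(B -> A).
bird_a = [1, 0, 0, 1, 1, 1, 0, 1, 0, 0, 1, 0, 1, 1, 0, 0]
bird_b = [0, 0, 1, 1, 0, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1]
print(transfer_entropy(bird_a, bird_b))  # influence of A on B
print(transfer_entropy(bird_b, bird_a))  # influence of B on A
```

With real recordings, the same kind of estimate would typically be computed on song on/off sequences extracted per localized individual, and its significance assessed against surrogate (shuffled) data rather than read off a single value.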

Keywords