Methods in Ecology and Evolution (Apr 2023)

RoboFinch: A versatile audio‐visual synchronised robotic bird model for laboratory and field research on songbirds

  • Ralph Simon,
  • Judith Varkevisser,
  • Ezequiel Mendoza,
  • Klaus Hochradel,
  • Rogier Elsinga,
  • Peter G. Wiersma,
  • Esmee Middelburg,
  • Eva Zoeter,
  • Constance Scharff,
  • Katharina Riebel,
  • Wouter Halfwerk

DOI
https://doi.org/10.1111/2041-210X.14063
Journal volume & issue
Vol. 14, no. 4
pp. 1092–1103

Abstract


Singing in birds is accompanied by beak, head and throat movements. These visual cues have long been hypothesised to facilitate vocal communication, including social interactions and song acquisition, but they have seen little experimental study. To address whether audio-visual cues are relevant for birdsong, we used high-speed video recording, 3D scanning, 3D printing technology and colour-realistic painting to create RoboFinch, an open-source adult-mimicking robot that matches the temporal and chromatic properties of songbird vision. We exposed several groups of juvenile zebra finches during their song developmental phase to one of six singing robots that moved their beaks in synchrony with their song, and compared them with birds in a non-synchronised treatment and two control treatments. Juveniles in the synchronised treatment approached the robot setup from the start of the experiment and progressively increased the time they spent singing, in contrast to the other treatment groups. Interestingly, birds in the synchronised group seemed to actively listen during tutor song playback, as they sang less during the actual song playback than birds in the asynchronous and audio-only control treatments. Our open-source RoboFinch setup thus provides an unprecedented tool for the systematic study of the functionality and integration of audio-visual cues associated with song behaviour. Realistic head and beak movements aligned to specific song elements may allow future studies to assess the importance of multisensory cues during song development, sexual signalling and social behaviour. All software and assembly instructions are open source, and the robot can be easily adapted to other species. Experimental manipulations of stimulus combinations and synchronisation can further elucidate how audio-visual cues are integrated by receivers and how they may enhance signal detection, recognition, learning and memory.
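[Editor's illustration] The abstract describes beak movements synchronised to the tutor song; the authors' actual implementation lives in their open-source repository linked from the article. As a minimal sketch only, not the published RoboFinch code, the Python below shows one common way to derive such a control signal: mapping a song recording's amplitude envelope to beak-gape angles at a fixed control rate. The file name tutor_song.wav, the 10 ms control frame, the 25-degree maximum gape and the servo hook are all illustrative assumptions.

# Illustrative sketch (not the authors' code): derive beak-gape angles
# from a song recording's amplitude envelope so a servo can open and
# close the beak in time with playback. Assumes 16-bit PCM WAV input.
import wave
import numpy as np

FRAME_MS = 10          # assumed control-frame length (100 Hz update rate)
MAX_GAPE_DEG = 25.0    # assumed maximum beak opening, in degrees

def gape_angles(wav_path):
    """Return one beak angle (degrees) per FRAME_MS window of the song."""
    with wave.open(wav_path, "rb") as wf:
        if wf.getsampwidth() != 2:
            raise ValueError("sketch assumes 16-bit PCM audio")
        rate = wf.getframerate()
        n_ch = wf.getnchannels()
        raw = wf.readframes(wf.getnframes())
    samples = np.frombuffer(raw, dtype=np.int16).astype(np.float64)
    if n_ch > 1:
        samples = samples.reshape(-1, n_ch).mean(axis=1)  # downmix to mono
    win = max(1, int(rate * FRAME_MS / 1000))
    n_frames = len(samples) // win
    frames = samples[: n_frames * win].reshape(n_frames, win)
    rms = np.sqrt((frames ** 2).mean(axis=1))             # amplitude envelope
    env = rms / rms.max() if rms.max() > 0 else rms       # normalise to 0..1
    return env * MAX_GAPE_DEG                              # scale to gape angle

if __name__ == "__main__":
    for i, angle in enumerate(gape_angles("tutor_song.wav")):
        # Hypothetical servo hook: replace the print with your driver call,
        # e.g. set_servo_angle(angle), issued every FRAME_MS milliseconds.
        print(f"{i * FRAME_MS} ms -> {angle:.1f} deg")

An envelope-driven mapping like this keeps beak motion locked to song amplitude at playback time; the published setup additionally reproduces head movements and species-appropriate colouration, which this sketch does not attempt.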

Keywords