Ecological Indicators (Nov 2022)
Assessing the effect of human activities on biophony in urban forests using an automated acoustic scene classification model
Abstract
Monitoring biodiversity and assessing the impact of human activities through acoustics is a promising direction in urban ecology. Previous acoustic studies of urban biodiversity have often been limited in data continuity and survey scope, making it difficult to answer questions about the relationships between bird population dynamics and environmental factors. Big data methods such as continuous acoustic monitoring have, to some extent, bridged this gap and opened new research paths for addressing the problem. In this study, we propose a machine learning (ML) method that uses convolutional neural networks (CNN) and target sound area ratios (TSAR) to quantify the dominance of seven types of acoustic scenes. Acoustic data were recorded at nine sites in three urban forests in Guangzhou, China. Using these site-level recordings, we trained the CNN to identify the seven target soundscape components, achieving an overall F1 score of 0.97, a precision of 0.96, and a recall of 0.97. Spatial patterns of acoustic scenes in the urban forests were examined to determine the effective working radius of the monitoring equipment and the effects of different land use types on soundscape composition. The results reveal significant, acoustically detectable interactions between human activities and biodiversity, show that vocal organisms respond to environmental change primarily by shifting their vocal frequencies, and yield a novel framework for using acoustics to monitor urban biodiversity. Going forward, these analyses can help promote biodiversity conservation and the sustainability of urban development.
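For readers unfamiliar with this kind of pipeline, the sketch below illustrates, under stated assumptions, how a spectrogram-based CNN could classify short audio segments into acoustic scene classes and how a simple per-site dominance ratio could be derived from the predicted segments. The class names, network architecture, sample rate, segment length, and the count-based dominance metric are all illustrative assumptions; the abstract does not specify the authors' actual model or how TSAR is computed.

```python
# Minimal sketch of a spectrogram-based acoustic-scene classifier (PyTorch).
# Class labels, architecture, and the segment-counting dominance metric are
# hypothetical stand-ins, not the implementation described in the paper.
import torch
import torch.nn as nn
import torchaudio

CLASSES = ["biophony", "traffic", "speech", "construction",
           "wind", "rain", "quiet"]            # hypothetical scene labels

# Log-mel spectrogram front end (sample rate and n_mels are assumptions).
mel = torchaudio.transforms.MelSpectrogram(sample_rate=32000, n_mels=64)

class SceneCNN(nn.Module):
    """Small CNN mapping a log-mel spectrogram to one of seven scene classes."""
    def __init__(self, n_classes=len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, waveform):
        spec = torch.log1p(mel(waveform)).unsqueeze(1)   # (batch, 1, mels, frames)
        return self.classifier(self.features(spec).flatten(1))

def scene_dominance(model, segments):
    """Fraction of fixed-length segments assigned to each scene class
    (a crude stand-in for a target-sound-area-ratio style dominance metric)."""
    with torch.no_grad():
        preds = model(segments).argmax(dim=1)
    counts = torch.bincount(preds, minlength=len(CLASSES)).float()
    return {c: (counts[i] / len(segments)).item() for i, c in enumerate(CLASSES)}

# Example: score one minute of audio split into twelve 5-second segments.
model = SceneCNN()
segments = torch.randn(12, 5 * 32000)   # placeholder waveforms, one row per segment
print(scene_dominance(model, segments))
```

In practice the classifier would be trained on labeled clips from the monitoring sites, and per-class precision, recall, and F1 would be computed on a held-out set (e.g., with scikit-learn's precision_recall_fscore_support) before dominance ratios are aggregated by site.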