AUDIO EXPERIENCE DESIGN

Imperial College London

Combining machine learning and a universal acoustic feature-set yields efficient automated monitoring of ecosystems

Sarab S Sethi, Nick S Jones, Ben D Fulcher, Lorenzo Picinali, Dena J Clink, Holger Klinck, C David L Orme, Peter H Wrege, Robert M Ewers
January 1, 2019
bioRxiv, 865980
Cold Spring Harbor Laboratory

Natural habitats are being impacted by human pressures at an alarming rate. Monitoring these ecosystem-level changes often requires labour-intensive surveys that are unable to detect rapid or unanticipated environmental changes. Here we developed a generalisable, data-driven solution to this challenge using eco-acoustic data. We exploited a convolutional neural network to embed ecosystem soundscapes from a wide variety of biomes into a common acoustic space. In both supervised and unsupervised modes, this allowed us to accurately quantify variation in habitat quality across space and in biodiversity through time. On the scale of seconds, we learned a typical soundscape model that allowed automatic identification of anomalous sounds in playback experiments, paving the way for real-time detection of irregular environmental behaviour including illegal activity. Our highly generalisable approach, and the common set of features, will enable scientists to unlock previously hidden insights from eco-acoustic data and offers promise as a backbone technology for global collaborative autonomous ecosystem monitoring efforts.
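The pipeline outlined in the abstract, embedding soundscape audio into a common acoustic feature space with a convolutional neural network and then learning a model of the typical soundscape so that low-likelihood clips can be flagged as anomalous, can be sketched in a few lines of Python. The code below is illustrative only and is not the authors' implementation: it substitutes a small untrained CNN for the pretrained embedding network described in the paper, the mel-spectrogram parameters and file names are hypothetical, and a Gaussian mixture stands in for whatever density model the authors used for the typical soundscape.

# Illustrative sketch (not the authors' released code): embed short audio
# clips into a fixed acoustic feature space with a CNN, learn a model of
# the "typical" soundscape, and score new clips for anomaly.
import numpy as np
import librosa
import torch
import torch.nn as nn
from sklearn.mixture import GaussianMixture


def log_mel_patch(path, sr=16000, n_mels=64, duration=0.96):
    """Load one short clip and return a log-mel spectrogram patch."""
    y, _ = librosa.load(path, sr=sr, duration=duration, mono=True)
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=n_mels)
    return np.log(mel + 1e-6).astype(np.float32)


class AudioEmbedder(nn.Module):
    """Tiny CNN mapping a (1, n_mels, time) patch to a 128-d embedding.
    A stand-in for the pretrained audio CNN described in the paper."""
    def __init__(self, embed_dim=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(32, embed_dim)

    def forward(self, x):
        return self.fc(self.conv(x).flatten(1))


def embed_clips(paths, model):
    """Embed a list of audio files into the common acoustic space."""
    model.eval()
    feats = []
    with torch.no_grad():
        for p in paths:
            patch = torch.from_numpy(log_mel_patch(p))[None, None]  # (1, 1, mels, time)
            feats.append(model(patch).squeeze(0).numpy())
    return np.stack(feats)


if __name__ == "__main__":
    model = AudioEmbedder()
    # Hypothetical file lists: recordings of the typical soundscape, and
    # new clips to screen (e.g. from a playback experiment).
    typical = embed_clips(["dawn_chorus_01.wav", "dawn_chorus_02.wav"], model)
    new = embed_clips(["unknown_clip.wav"], model)

    # Learn the typical-soundscape model; in practice this would be fitted
    # to many hours of recordings, not a handful of clips.
    gmm = GaussianMixture(n_components=1, covariance_type="diag").fit(typical)
    anomaly_score = -gmm.score_samples(new)  # higher = more anomalous
    print(anomaly_score)

In practice the embedding network would be a large CNN pretrained on general-purpose audio, so that the same feature space transfers across biomes without retraining; the sketch only shows how such embeddings feed a downstream anomaly model.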
