Publication Date:
2024-03-12
Description:
Insect population numbers and biodiversity have been declining rapidly, and monitoring these trends has become increasingly important for conservation measures to be implemented effectively. However, monitoring methods are often invasive, time- and resource-intensive, and prone to various biases. Many insect species produce characteristic sounds that can be detected and recorded easily, without large cost or effort. Using deep learning methods, insect sounds from field recordings could be automatically detected and classified to monitor biodiversity and species distribution ranges. We implement this using recently published datasets of insect sounds (up to 66 species of Orthoptera and Cicadidae) and machine learning methods, and evaluate their potential for acoustic insect monitoring. We compare the performance of the conventional spectrogram-based audio representation against LEAF, a new adaptive, waveform-based frontend. LEAF achieved better classification performance than the mel-spectrogram frontend by adapting its feature-extraction parameters during training. This result is encouraging for future implementations of deep learning technology for automatic insect sound recognition, especially as larger datasets become available.
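As background to the comparison described above, the sketch below shows what the "conventional spectrogram-based audio representation" (a mel-spectrogram) computes: frame the waveform, take the power spectrum, and pool frequency bins with triangular mel-scale filters. All parameters (16 kHz sample rate, 64 mel bands, 25 ms frames) and the synthetic test tone are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal NumPy sketch of a mel-spectrogram frontend, the fixed (non-adaptive)
# baseline that LEAF-style learnable frontends replace. Parameter choices here
# are assumptions for illustration only.

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filterbank(n_mels, n_fft, sr):
    """Triangular filters spaced evenly on the mel scale."""
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for i in range(1, n_mels + 1):
        l, c, r = bins[i - 1], bins[i], bins[i + 1]
        fb[i - 1, l:c] = (np.arange(l, c) - l) / max(c - l, 1)  # rising slope
        fb[i - 1, c:r] = (r - np.arange(c, r)) / max(r - c, 1)  # falling slope
    return fb

def mel_spectrogram(audio, sr=16000, n_fft=400, hop=160, n_mels=64):
    """Frame the waveform, take the power spectrum, apply mel filters."""
    window = np.hanning(n_fft)
    n_frames = 1 + (len(audio) - n_fft) // hop
    frames = np.stack([audio[i * hop:i * hop + n_fft] * window
                       for i in range(n_frames)])
    power = np.abs(np.fft.rfft(frames, n_fft)) ** 2
    return mel_filterbank(n_mels, n_fft, sr) @ power.T  # (n_mels, n_frames)

# One second of a synthetic 4 kHz tone standing in for an insect field recording.
sr = 16000
t = np.arange(sr) / sr
features = mel_spectrogram(np.sin(2 * np.pi * 4000.0 * t), sr=sr)
print(features.shape)  # (64, 98)
```

In the fixed frontend every parameter above (filter spacing, bandwidths, window) is hard-coded; the point of LEAF is that the analogous feature-extraction parameters are learned jointly with the classifier during training.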
Repository Name:
National Museum of Natural History, Netherlands
Type:
info:eu-repo/semantics/article
Format:
application/pdf
Permalink: