Automated detection of Hainan gibbon calls for passive acoustic monitoring

1. Extracting species calls from passive acoustic recordings is a common preliminary step in ecological analysis. For many species, particularly those occupying noisy, acoustically variable habitats, call extraction remains largely manual: a time-consuming and increasingly unsustainable process. Deep neural networks have been shown to offer excellent performance across a range of acoustic classification applications, but are relatively underused in ecology.

2. We describe the steps involved in developing an automated classifier for a passive acoustic monitoring project, using the identification of calls of the Hainan gibbon (Nomascus hainanus), one of the world's rarest mammal species, as a case study. These steps include preprocessing (selecting a temporal resolution, windowing and annotation), data augmentation, processing (choosing and fitting appropriate neural network models) and postprocessing (linking model predictions to replace, or more likely facilitate, manual labelling).

3. Our best model converted acoustic recordings into spectrogram images on the mel frequency scale and used these to train a convolutional neural network. Model predictions were highly accurate, with per-second false positive and false negative rates of 1.5% and 22.3%. Nearly all false negatives occurred at the fringes of calls, adjacent to segments where the call was correctly identified, so that very few calls were missed altogether. A postprocessing step identifying intervals of repeated calling reduced an eight-hour recording to, on average, 22 minutes for manual processing, and did not miss any calling bouts over 72 hours of test recordings. Gibbon calling bouts were detected regularly in multi-month recordings from all selected survey points within Bawangling National Nature Reserve, Hainan.

4. We demonstrate that passive acoustic monitoring incorporating an automated classifier represents an effective tool for remote detection of one of the world's rarest and most threatened species. Our study highlights the viability of using neural networks to automate, or greatly assist, the manual labelling of data collected by passive acoustic monitoring projects. We emphasise that model development and implementation should be informed and guided by ecological objectives, and we increase the accessibility of these tools with a series of notebooks that allow users to build and deploy their own acoustic classifiers.
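
As a concrete illustration of the pipeline outlined in the abstract, the sketch below converts a recording into fixed-length mel-spectrogram windows and defines a small convolutional classifier for per-window call detection. It is a minimal sketch assuming Python with librosa and tensorflow.keras; the sample rate, window length, number of mel bands and network architecture are placeholder assumptions for illustration, not the configuration reported in the paper.

```python
# Minimal sketch of a mel-spectrogram + CNN classifier for call detection.
# All parameter values (sample rate, window length, mel bands, layer sizes)
# are illustrative assumptions, not the authors' published configuration.

import numpy as np
import librosa
import tensorflow as tf
from tensorflow.keras import layers, models


def audio_to_mel_windows(path, sr=16000, window_s=10.0, hop_s=5.0, n_mels=128):
    """Load a recording and cut it into fixed-length mel-spectrogram windows."""
    y, sr = librosa.load(path, sr=sr)              # resample to a common rate
    win = int(window_s * sr)
    hop = int(hop_s * sr)
    windows = []
    for start in range(0, len(y) - win + 1, hop):
        seg = y[start:start + win]
        mel = librosa.feature.melspectrogram(y=seg, sr=sr, n_mels=n_mels)
        mel_db = librosa.power_to_db(mel, ref=np.max)  # log scale for contrast
        windows.append(mel_db[..., np.newaxis])        # add channel axis for the CNN
    if not windows:
        raise ValueError("recording shorter than one analysis window")
    return np.stack(windows)


def build_cnn(input_shape):
    """A small binary CNN: gibbon call present / absent for each window."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, (3, 3), activation="relu", padding="same"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(32, (3, 3), activation="relu", padding="same"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model


# Example usage (file name and labels are hypothetical):
# X = audio_to_mel_windows("survey_point_01.wav")
# model = build_cnn(X.shape[1:])
# model.fit(X, labels, epochs=20, validation_split=0.2)  # labels: 0/1 per window
```

In practice, the per-window predictions would feed a postprocessing step of the kind described in the abstract, for example grouping consecutive positive windows into candidate calling bouts for manual review.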
