Neuroscience · 14 November 2025

AI Listens for Endangered Primate Calls with Pinpoint Accuracy

Source Publication: Integrative Zoology

Primary Authors: Hong, Xie, He et al.


Protecting wildlife requires knowing where animals are, but sifting through countless hours of audio recorded in the wild is a monumental task. Now, researchers have streamlined this process with a new artificial intelligence model designed for passive acoustic monitoring (PAM), a technique that uses remote microphones to listen for animals.

Focusing on the critically endangered white-headed langur, a primate endemic to China, the team created a Deep Audio Detection Network (DeepADN). Unlike traditional methods that convert sound into images called spectrograms for analysis, DeepADN works directly with the audio itself. It learns the unique acoustic features of the langur's call, enabling it to pick them out even in noisy natural environments.
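The paper's DeepADN architecture is not reproduced here, but the core idea of operating on the raw waveform rather than a spectrogram image can be illustrated with a minimal sketch: slide a learned call "template" across the audio signal and flag windows whose normalised correlation exceeds a threshold. All names, the synthetic chirp, and the threshold below are hypothetical illustrations, not the authors' method.

```python
import numpy as np

def conv1d(signal, kernel, stride):
    """Strided valid 1D convolution - the basic building block of raw-audio models."""
    n = (len(signal) - len(kernel)) // stride + 1
    return np.array([np.dot(signal[i * stride : i * stride + len(kernel)], kernel)
                     for i in range(n)])

def detect_call(waveform, template, threshold=0.5):
    """Score a waveform against a call template; report whether any window exceeds threshold."""
    # Normalise both signals so the scores behave like correlations
    w = waveform / (np.linalg.norm(waveform) + 1e-8)
    t = template / (np.linalg.norm(template) + 1e-8)
    scores = conv1d(w, t, stride=len(t) // 2)
    return bool(scores.max() > threshold), scores

# Synthetic example: a short chirp (stand-in for a langur call) embedded in noise
rng = np.random.default_rng(0)
sr = 8000                                            # 8 kHz sample rate
tt = np.linspace(0, 0.1, int(sr * 0.1))
chirp = np.sin(2 * np.pi * (600 + 2000 * tt) * tt)   # hypothetical call template
noise = 0.1 * rng.standard_normal(sr)                # 1 s of background noise
clip = noise.copy()
clip[3200 : 3200 + len(chirp)] += chirp              # call embedded mid-clip

hit, _ = detect_call(clip, chirp)                    # True: call present
miss, _ = detect_call(noise, chirp)                  # False: noise only
```

A trained network replaces the fixed template with stacks of learned convolutional filters, but the detection principle, matching acoustic structure directly in the time domain, is the same.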

The results are impressive. The AI successfully identified 98.22% of the calls, reducing the need for manual review by over 87%. In a promising development for wider conservation efforts, the model also proved effective at detecting the calls of another species, the François' langur, highlighting its versatility.

Cite this Article (Harvard Style)

Hong et al. (2025). 'AI Listens for Endangered Primate Calls with Pinpoint Accuracy'. Integrative Zoology. Available at: https://doi.org/10.1111/1749-4877.70019

Source Transparency

This intelligence brief was synthesised by The Synaptic Report's autonomous pipeline. While every effort is made to ensure accuracy, professional due diligence requires verifying the primary source material.

Tags: conservation, artificial intelligence, bioacoustics, endangered species