Have you ever thought about how much life is buzzing, chirping, and calling all around us, often unnoticed? Scientists have long used audio recordings from microphones and underwater hydrophones to capture these rich soundscapes — from the songs of birds in a forest to the distant calls of whales beneath the waves. These sounds don’t just fill the air; they tell stories about which species are present, how many there are, and the overall health of the ecosystem. But sorting through mountains of audio data isn’t exactly a walk in the park.
I recently came across an exciting update from the Bioacoustics world — an AI model called Perch. It’s designed to make sense of these complex audio environments faster and more accurately than ever before. What struck me most is how this model extends beyond bird calls: it now recognizes sounds from mammals, amphibians, and even the often intrusive anthropogenic noises like machines and vehicles. Plus, it adapts better to tricky environments like coral reefs underwater.
Trained on almost twice as much data as before, drawn from public sources such as Xeno-Canto and iNaturalist, Perch can analyze thousands (sometimes millions) of hours of recordings. It doesn’t just say “hey, there’s a bird here” — it can tackle nuanced questions like “how many individual animals are present?” or “how many young are being born?” This versatility is a huge leap toward practical conservation, turning raw audio into actionable insights.
Perch helped researchers detect honeycreeper sounds nearly 50 times faster than traditional methods, enabling the monitoring of endangered species over larger areas.
Real-world impact: Perch in the wild
It’s one thing to build a smart algorithm, but seeing it in action is another level. Since its launch in 2023, Perch has been downloaded more than 250,000 times and woven into tools biologists actively use. For example, Cornell’s BirdNET Analyzer leverages Perch’s vector search to pinpoint species quickly. This has even helped BirdLife Australia uncover new populations of elusive birds like the Plains-wanderer, a real win for conservation efforts.

One particularly inspiring story is from the University of Hawaiʻi’s LOHE Bioacoustics Lab. Honeycreepers, native birds important to Hawaiian culture, face extinction partly due to avian malaria spread by invasive mosquitoes. Researchers using Perch managed to find their calls almost 50 times faster than before, dramatically speeding up monitoring efforts and helping protect these treasured species.
Not just recognition — agile, adaptive modeling
What I found particularly fascinating is how Perch supports an approach called agile modeling. Imagine you have only one example of a rare animal’s call — traditionally, training a model to recognize it would be painstaking and slow. With Perch’s vector search, scientists can surface acoustically similar clips from large datasets, then quickly train a classifier with only a few rounds of expert feedback. The process can produce a high-quality detector in under an hour, and it works across habitats, from forests to coral reefs.
This is an incredible discovery – acoustic monitoring like this will help shape the future of many endangered bird species.
Paul Roe, Dean of Research, James Cook University, Australia
This method unlocks new possibilities for studying species that have limited data — a big plus for conservationists racing against time to monitor endangered populations.
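To make the agile-modeling loop concrete, here is a minimal sketch of the idea in plain NumPy. Everything in it is illustrative: the embeddings are random stand-ins for what an audio model would produce, and the function names, dimensions, and the nearest-centroid scoring are my assumptions, not Perch’s actual API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend each audio window has already been run through an embedding
# model, giving one fixed-length vector per window (dimensions invented).
dataset = rng.normal(size=(1000, 128))                   # unlabeled archive
query = dataset[42] + rng.normal(scale=0.05, size=128)   # one exemplar call

def cosine_top_k(query, dataset, k=20):
    """Vector search: rank windows by cosine similarity to the exemplar."""
    q = query / np.linalg.norm(query)
    d = dataset / np.linalg.norm(dataset, axis=1, keepdims=True)
    sims = d @ q
    top = np.argsort(-sims)[:k]
    return top, sims[top]

candidates, scores = cosine_top_k(query, dataset)

# An expert then quickly labels the handful of candidates (simulated
# here: only window 42 is a true match).
labels = (candidates == 42).astype(float)

# Lightweight "classifier": average the positive embeddings into a
# centroid, and score future audio by similarity to that centroid.
positives = dataset[candidates[labels == 1.0]]
centroid = positives.mean(axis=0)

print(int(candidates[0]))  # the exemplar's source window ranks first: 42
```

The point of the sketch is the shape of the workflow, not the specific math: one example, a similarity search to gather candidates, a fast expert pass, and a cheap classifier — rather than collecting a large labeled dataset up front.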
Looking ahead: the soundtrack of a thriving planet
Putting it all together, the advancements in AI-powered bioacoustics like Perch aren’t just about crunching data faster — they’re about amplifying the voices of the wild to help safeguard our planet’s biodiversity. The combination of open-source tools and cutting-edge models maximizes the impact of conservationists’ efforts and gives them more time for crucial in-the-field work.
From Hawaii’s forests to coral reefs teeming with life, this technology showcases what happens when we blend tech expertise with environmental urgency. Each classifier built and each hour of audio analyzed brings us closer to a future where the natural sounds around us tell stories of rich, thriving ecosystems — not silent losses.
If you’re curious about how AI is amplifying these wildlife voices, the Perch project offers open access to its models and methods — inviting anyone inspired to join this crucial journey.
