By Anica Hattingh and Sofia Bassios
Last year, two postgraduate students at Stellenbosch University (SU), Christiaan Geldenhuys (a PhD candidate in Electronic Engineering) and Günther Tonitz (a Master’s student in Electronic Engineering), received global recognition after winning the Jury’s Award at BioDCASE 2025. They developed a machine-learning system capable of detecting Antarctic blue and fin whale calls in complex underwater recordings, which also earned them second place overall in the international challenge.

What is BioDCASE?
BioDCASE is a specialised research challenge aimed at using artificial intelligence (AI) to detect and classify biological sounds. More specifically, BioDCASE is a bioacoustics task within the annual Detection and Classification of Acoustic Scenes and Events Challenge (DCASE).
The challenge, which is one of the leading audio machine-learning competitions, is organised by the Institute of Electrical and Electronics Engineers (IEEE) Audio and Acoustic Signal Processing (AASP) community.
The challenge tasks researchers with developing machine-learning systems that can recognise animal sounds in complex audio recordings. These recordings are made in natural environments such as oceans, forests and wetlands.
In layman’s terms, participants are tasked with building algorithms that can detect biological sounds (such as whale calls), distinguish between different species or call types, and work accurately despite background noise and large datasets.
Why does this research matter?
“The problem we are trying to solve is that we have a recorder that is in the ocean, that is recording ocean noise 24/7 and we want to be able to determine […] where blue and fin whale calls are occurring and the types of those calls,” Tonitz explained. He added that it would be unrealistic to expect someone to manually listen through nearly 6 000 hours of recordings to identify whale calls, as such a method would be both time-consuming and financially impractical.
Tonitz further explained that their work in audio machine learning contributes to answering questions about whether blue whales are endangered. “You can’t just say, ‘Oh, I haven’t seen blue whales in a while so they’re likely endangered, right?’” he said. Relying only on sightings does not provide reliable scientific evidence and cannot serve as solid proof when policy decisions – such as enforcing the whaling bans introduced in the 1960s – are being considered.
How does the algorithm work?
Tonitz explained that their model analyses continuous underwater recordings and processes the data into a format that allows a classifier to identify whale calls. The system then pinpoints the start and end times of each call, producing a timeline of when whale calls occur within the recording.
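To illustrate the final step described above – turning a classifier’s output into a timeline of calls – here is a minimal sketch. It is not the authors’ actual system: the one-second frame length, the 0.5 threshold, and the example scores are all illustrative assumptions.

```python
# Sketch: merge per-frame classifier scores into (start, end) call times.
# Frame duration and detection threshold are illustrative assumptions.

FRAME_SEC = 1.0   # seconds of audio each score covers (assumed)
THRESHOLD = 0.5   # score above which a frame counts as "whale call" (assumed)

def scores_to_intervals(scores, frame_sec=FRAME_SEC, threshold=THRESHOLD):
    """Merge consecutive above-threshold frames into (start, end) times."""
    intervals = []
    start = None
    for i, s in enumerate(scores):
        if s >= threshold and start is None:
            start = i * frame_sec                     # a call begins here
        elif s < threshold and start is not None:
            intervals.append((start, i * frame_sec))  # the call has ended
            start = None
    if start is not None:                             # call runs to the end
        intervals.append((start, len(scores) * frame_sec))
    return intervals

# Example: classifier scores for ten one-second frames of audio
scores = [0.1, 0.2, 0.8, 0.9, 0.7, 0.1, 0.1, 0.6, 0.6, 0.2]
print(scores_to_intervals(scores))  # [(2.0, 5.0), (7.0, 9.0)]
```

The output is exactly the kind of timeline Tonitz describes: a list of start and end times marking where calls occur in the recording.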
To evaluate the model during the challenge, it was tested on new audio data that it had not previously encountered. The results were then compared with a verified list of actual whale calls in the recordings. The model’s performance was scored based on how accurately it detected real calls and avoided false detections.
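The scoring described above can be sketched in a few lines. This is a hedged illustration, not the challenge’s official metric: it assumes a call counts as detected when a predicted interval overlaps a verified one, whereas the real evaluation may use stricter matching rules.

```python
# Sketch: score predicted call intervals against a verified reference list.
# Matching rule (any overlap counts) is an assumption for illustration.

def overlaps(a, b):
    """True if intervals a = (start, end) and b = (start, end) overlap."""
    return a[0] < b[1] and b[0] < a[1]

def score(predicted, verified):
    """Count correctly detected calls, missed calls, and false alarms."""
    tp = sum(any(overlaps(p, v) for p in predicted) for v in verified)
    fn = len(verified) - tp   # real calls the model missed
    fp = sum(not any(overlaps(p, v) for v in verified) for p in predicted)
    return {"true_positives": tp, "missed": fn, "false_alarms": fp}

predicted = [(2.0, 5.0), (7.0, 9.0), (12.0, 13.0)]   # model output
verified  = [(2.5, 4.5), (7.5, 8.5)]                 # annotated ground truth
print(score(predicted, verified))
# {'true_positives': 2, 'missed': 0, 'false_alarms': 1}
```

Here the model finds both verified calls but also raises one false alarm – the trade-off between detecting real calls and avoiding false detections that the challenge scoring captures.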
Geldenhuys added, “These calls are like really big haystacks and you are trying to find the particular little needle within each of those.”
Bioacoustics aims to understand animal behaviour through the sounds animals produce, which is the very aim of Tonitz and Geldenhuys’s research. Through their work, they demonstrate how advanced technology can help scientists better monitor whale populations, and they highlight the growing role of machine learning in environmental research.