Coral acoustics: AI trained to detect fish sounds faster than humans
It’s hoped this new advance in AI technology will pave the way to scaling up and speeding up acoustic data analysis across the world’s coral reef ecosystems, a move that could be transformative for research and conservation efforts.
Researchers looking to speed up the identification of sounds recorded on coral reefs have trained a neural network to sift through mountains of acoustic data and decipher trends in the marine environment 25 times faster than humans.
With many of the world’s coral reefs under threat from climate change and human activity, being able to rapidly identify and track changes in reef populations will be crucial for those conservation – and even restoration – efforts.
Coral reefs are recognised as some of the world’s most diverse ecosystems. Yet, despite reefs making up less than 1% of the world’s oceans, one-quarter of all marine species spend some portion of their life on a reef. With so much life in one spot, researchers can struggle to gain a clear understanding of which species are present and in what numbers.
For years, marine scientists have used passive acoustic monitoring to try and track coral reef activity. Typically, an acoustic recorder would be deployed underwater, where it would spend months recording audio from the reef.
Until now, tools have been able to analyse this data in batches but not to find specific sounds within it. To do that, scientists would usually need to comb through the recordings by hand.
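For a sense of what that batch analysis involves: long recordings are typically cut into short windows and converted into spectrograms before any detection can happen. Below is a minimal sketch of that preprocessing step, assuming a mono WAV file and two-second windows; the file name, window length, and FFT settings are illustrative, not taken from the study.

```python
# Sketch: slicing a long reef recording into fixed-length spectrogram windows.
# Assumptions (not from the study): a mono WAV file and 2-second windows.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

def windowed_spectrograms(path, window_s=2.0):
    """Yield (start_time, log_spectrogram) pairs for consecutive windows."""
    rate, audio = wavfile.read(path)            # rate in Hz, samples as ints
    audio = audio.astype(np.float32)
    step = int(window_s * rate)                 # samples per analysis window
    for start in range(0, len(audio) - step + 1, step):
        chunk = audio[start:start + step]
        freqs, times, sxx = spectrogram(chunk, fs=rate, nperseg=1024)
        # Log-scale power is the usual input to an acoustic classifier.
        yield start / rate, np.log10(sxx + 1e-10)

# Example: count how many windows a long deployment produces.
# n_windows = sum(1 for _ in windowed_spectrograms("reef_recording.wav"))
```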
“But for the people that are doing that, it’s awful work, to be quite honest,” said Seth McCammon, an assistant scientist of Applied Ocean Physics and Engineering at Woods Hole Oceanographic Institution (WHOI) – the team behind the study.
“It’s incredibly tedious work. It’s miserable.”

Not only that, but it’s a slow process, too – too slow, WHOI has said, to be ‘of much practical use’, often taking years to analyse data to the required level.
As an alternative, researchers from WHOI trained a neural network to sort through all this data automatically, analysing audio recordings in real time. According to the study, published in the Journal of the Acoustical Society of America, the resulting algorithm can match the accuracy of human experts in deciphering acoustic trends on a reef, but can do so more than 25 times faster – presenting the potential to change the way ocean research is conducted.
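The study identifies the model as a convolutional neural network trained on labelled reef recordings. Its exact architecture isn’t reproduced here, but the sketch below shows the general shape of a spectrogram-based call detector of this kind, assuming PyTorch and a simple call/no-call output; every layer size and input dimension is an illustrative assumption.

```python
# Sketch of a binary fish-call detector over log-spectrogram windows.
# A generic CNN of the kind described, not the study's exact model;
# input shape, layer sizes, and labels are illustrative assumptions.
import torch
import torch.nn as nn

class FishCallCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(32 * 4 * 4, 2)   # call / no-call

    def forward(self, x):        # x: (batch, 1, freq_bins, time_frames)
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = FishCallCNN()
# One window rendered as a 128x128 log-spectrogram (illustrative shape).
window = torch.randn(1, 1, 128, 128)
probability_of_call = torch.softmax(model(window), dim=1)[0, 1]
```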
“Now that we no longer need to have a human in the loop, what other sorts of devices – moving beyond just recorders – could we use?” said McCammon. “Some work that my co-author Aran Mooney is doing involves integrating this type of neural network onto a floating mooring that’s broadcasting real-time updates of fish call counts.
“We are also working on putting our neural network onto our autonomous underwater vehicle, CUREE, so that it can listen for fish and map out hot spots of biological activity.”
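Both of the deployments McCammon describes amount to running a detector continuously over streaming audio rather than an archive, then reporting a running count. The sketch below is a guess at that general pattern, not WHOI’s code: `detect` stands in for any trained detector returning a call probability, and the window length, sample rate, and threshold are assumptions.

```python
# Sketch: the counting loop a mooring or AUV might run over live audio.
# `detect`, the window length, sample rate, and threshold are assumptions.
import numpy as np

def stream_call_counts(detect, audio_stream, rate=48000, window_s=2.0,
                       threshold=0.5):
    """Count detected fish calls over consecutive fixed-length windows."""
    step = int(window_s * rate)            # samples per analysis window
    buffer = np.empty(0, dtype=np.float32)
    calls = 0
    for chunk in audio_stream:             # e.g. blocks from a hydrophone
        buffer = np.concatenate([buffer, chunk])
        while len(buffer) >= step:
            window, buffer = buffer[:step], buffer[step:]
            if detect(window) >= threshold:
                calls += 1
        yield calls                        # running total, e.g. to broadcast

# Usage with a placeholder detector and synthetic hydrophone blocks:
blocks = (np.random.randn(4800).astype(np.float32) for _ in range(20))
for running_total in stream_call_counts(lambda w: 0.0, blocks):
    pass
```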
This technology also has the potential to solve a long-standing problem in marine acoustic studies: matching each unique sound to a fish.
This is an area of sound identification that has so far eluded scientists: to this day, researchers are unable to pin a call, with certainty, to a specific species of fish. Reaching that stage of technological advancement is what McCammon has billed the “holy grail”.
“By being able to do fish call detection in real-time, we can start to build devices that are able to automatically hear a call and then see what fish are nearby,” said McCammon.
It is hoped that the neural network built by the WHOI researchers will eventually make it possible to monitor fish populations in real time, identify species in trouble, and respond to disasters. That level of technology would be transformative for conservationists, helping them gain a clearer picture of the health of coral reefs in an era when reefs need all the help they can get.
The article ‘Rapid detection of fish calls within diverse coral reef soundscapes using a convolutional neural network’ is authored by Seth McCammon, Nathan Formel, Sierra Jarriel, and T. Aran Mooney. It was published in The Journal of the Acoustical Society of America earlier this month.

"*" indicates required fields
Printed editions
Current issue
Back issues

Current Issue
Issue 41 Holdfast to the canopy

Back Issues
Issue 39 Special Edition: OPY2024
Enjoy so much more from Oceanographic Magazine by becoming a subscriber.
A range of subscription options are available.