Marine Life

Humpback to the future - AI makes splash in whale conservation

Google's Digital Futures Initiative has partnered with Australia's Griffith University to deploy hydrophones and Google AI technology in new research into the migratory behaviours of humpback whales.

29/11/2024
Written by Rob Hutchins
Photographs by Mike Doherty

With the change of season comes an end to the annual visit that humpback whales pay the east coast of Australia to breed as, once again, the beloved whales begin their journey back to Antarctica in time for their summer feeding. 

Humpback whales are notoriously difficult to study – much of their time is spent underwater and out of range of direct observation – so little is known about what they will be getting up to between now and their next return to Australian waters, except that when they do return, it will be the Australian autumn.

During their migration, these humpback whales will make calls and sing songs in what has been called a “grand chorus in the symphony of their ecosystems”. And it is here – in a marriage between this underwater soundscape and state-of-the-art technology, courtesy of Google – that scientists will be given their most “valuable and vital” window yet into the species and its habitats.

Google’s Digital Futures Initiative is supporting whale researchers from Griffith University’s Whales and Climate Programme to comprehensively monitor humpback whale migrations and their ecosystems along Australia’s east coast with the deployment of hydrophones and state-of-the-art automatic audio detection, powered by Google AI.

It’s a partnership that promises to halve the legwork involved in tracking these most elusive humpbacks. Traditional whale research methods typically involve labour-intensive processes such as logging sightings of whales and manually reviewing the audio data. Such data can usually only be gathered during daylight hours, meaning scientists are unable to collect detailed, comprehensive records over continuous stretches of time.

Until now, that is.

Humpback whales: These ocean giants are notoriously difficult to track using traditional methods

It’s hoped that the use of hydrophones – microphones designed to record or listen to sounds underwater – and Google AI technology within this new collaborative project will remove such research barriers and limitations, enabling scientists to collect and analyse audio data automatically and continuously.

“The Whales and Climate Programme currently holds the largest whale sighting database in Australia but this is sighting data captured during the daytime, which means there is no data spanning 24-hour periods,” said Griffith University’s Dr Olaf Meynecke.

“The hydrophone array will help us capture continuous data over 24 hours and do this for the entire whale season every year. We will be collecting many terabytes of acoustic data that will then be analysed with Google’s AI technology to detect whale location and activity.”

This data will then be overlaid with data from existing sightings which, states Dr Meynecke, “will provide a much more holistic picture of whale movements and behaviours.”

Deployment of the hydrophones began at sites off the Gold Coast, Sydney, and Merimbula, with another three deployed in early October. Spaced at approximately 500km intervals, the hydrophones have been positioned to ensure wide coverage of the annual migrations that typically span the Australian east coast.

This agenda-setting project has been brought to life via the Digital Futures Initiative, Google Australia’s $1bn investment in Australian research, partnerships, and infrastructure. The commitment supports a range of AI-focused projects across healthcare, sustainability, energy, and more, including a search engine for bird and wildlife sounds.

“Google’s AI technology detects whale sounds, marks the location in time and classifies the species,” said Dr Lauren Harrell, from the Google Research team. “The model does this automatically, relieving researchers from time-consuming manual work so they can spend more time uncovering insights and exploring new, uncharted territories of research.”
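
As a purely illustrative aside – and not a description of the project's actual system – automated detectors of this kind typically scan a recording's spectrogram for bursts of energy in the frequency band where humpback calls tend to sit, logging each hit with a timestamp. The short Python sketch below shows the basic idea; the band limits, threshold and filename are placeholder assumptions chosen only for demonstration.

# Illustrative sketch only: a simple energy-based call detector.
# This is not the project's actual model; the band limits, threshold
# and the filename "recording.wav" are placeholder assumptions.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

def detect_calls(wav_path, band=(100.0, 2000.0), threshold_db=10.0):
    """Return timestamps (in seconds) where in-band energy rises
    more than `threshold_db` decibels above the recording's median."""
    rate, audio = wavfile.read(wav_path)
    if audio.ndim > 1:                      # mix multichannel audio down to mono
        audio = audio.mean(axis=1)
    freqs, times, power = spectrogram(audio, fs=rate, nperseg=4096)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    band_energy_db = 10 * np.log10(power[in_band].sum(axis=0) + 1e-12)
    baseline = np.median(band_energy_db)
    return times[band_energy_db > baseline + threshold_db]

# Print candidate call times for one hydrophone recording.
for t in detect_calls("recording.wav"):
    print(f"possible call at {t:.1f} s")

A production system of the kind Dr Harrell describes would replace the fixed threshold with a trained classifier, which is what allows it to distinguish species rather than simply flag loud sounds.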

While the current focus will be on monitoring humpback whale sounds, the potential of the AI model extends beyond this species alone. Google AI has already hinted at building on the model to detect the sounds of diverse marine species, from fish to dolphins and seals.

“This data can help to inform conservation decisions and will be made publicly available to the global research community,” said Dr Harrell.
