
Scientists Launch Groundbreaking Research Project to Decode Whale Language

“It’s not about us talking to them. This is about listening to the whales in their own setting, on their own terms.”
By Melissa Smith | Updated On May 15, 2021

Sperm whales gather in a social pod in the Indian Ocean. (Shutterstock.com/wildestanimal)

An international team of scientists has kicked off a five-year effort to decode the language of sperm whales.

Whales use a series of clicks, called codas, to communicate with one another, and different groups have their own dialects. Shane Gero, founder of the Dominica Sperm Whale Project, is working to understand the codas of the sperm whale clans that gather around Dominica.

Gero first observed a pair of whales having a conversation in 2008 — they clicked back and forth for 40 minutes, swimming alongside each other.

“They were talking and playing and being siblings,” he tells National Geographic. “There was clearly so much going on.”

He’d tracked the species in the Caribbean island’s waters for three years, recording their codas, but had never overheard a full dialogue before. It captivated him. What if humans could decode the clicks? We might unlock a revolutionary understanding of the natural world.

For the next 13 years, Gero recorded thousands of underwater audio tracks, annotated with field notes such as which whales were clicking, their behavior at the time, and who was around them.

These key observations give context to the whales’ communication and were first used to test whether a computer could use machine learning to identify individual whales by their clicks; the computer correctly identified the whales more than 94 percent of the time.
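The article doesn’t describe the model behind that result. As a rough, hypothetical sketch of the general approach, the snippet below trains an off-the-shelf classifier to tell whales apart, assuming each coda has already been reduced to a fixed-length feature vector (for example, its inter-click intervals) and labeled with the clicking whale’s identity. The data and feature choices here are stand-ins, not Project CETI’s pipeline.

```python
# Hypothetical sketch: identifying individual whales from coda features.
# Assumes codas are already represented as fixed-length feature vectors
# (e.g., inter-click intervals) labeled with the clicking whale's ID.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_whales, codas_per_whale, n_features = 5, 200, 8

# Stand-in data: each whale gets a slightly different click "rhythm".
X = np.vstack([
    rng.normal(loc=0.1 * (w + 1), scale=0.02, size=(codas_per_whale, n_features))
    for w in range(n_whales)
])
y = np.repeat(np.arange(n_whales), codas_per_whale)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

print(f"Held-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2%}")
```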

With this promising result in the books, the Cetacean Translation Initiative (Project CETI) will expand on Gero’s original work. The CETI team has partnered with the local government and received funding to capture and analyze millions of whale codas in the area.

The hope is that the machines will be able to identify patterns in the language, like grammar, syntax, words or phrases.

“With whales, the big question is whether any of this stuff is even present,” Jacob Andreas, a natural language processing expert at MIT and member of Project CETI, tells National Geographic. “Are there minimal units inside this communication system that behave like language, and are there rules for putting them together?”

In addition to recording the millions of codas needed for computers to recognize patterns, scientists say they’ll need to observe the animals’ behavior to give the findings context. There may, for example, be a specific set of clicks made when the animals hunt or mate.

“It’s the cocktail party problem,” David Gruber, a professor at The City University of New York and a Project CETI team member, tells National Geographic. Scatter a few microphones around a party, and they’ll pick up snatches of conversation. But watch people, tracking who touches someone’s arm and who scans the room for better company, “and the whole scene starts to make more sense,” Gruber says.

To capture the context of whale communication, the team will use specialized audio and video equipment, including high-resolution hydrophones that can record 24 hours a day and reach depths of thousands of feet, video recorders that attach to whales with suction cups and withstand the pressure at extreme depths, and drones designed to move like fish so they can record unobtrusively.

“We want to know as much as we can,” says Gruber. “What’s the weather doing? Who’s talking to who? What’s happening 10 kilometers away? Is the whale hungry, sick, pregnant, mating? But we want to be as invisible as possible as we do it.”

Once recording begins, the team will rely on advances in artificial intelligence to make sense of the data, identifying key structural elements of the whales’ language.
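The article doesn’t specify how those structural elements would be found. As a loose illustration of one first step, the hypothetical sketch below counts recurring coda sequences (n-grams) in a stream of coda types, the kind of simple statistic that can hint at candidate “phrases.” The coda labels and data are invented for the example.

```python
# Hypothetical sketch: looking for recurring coda sequences ("phrases").
# Assumes codas have already been grouped into discrete types (labels such
# as "1+3" or "5R1"); this is a generic n-gram count, not CETI's method.

from collections import Counter

# Stand-in stream of coda types from one recording session.
coda_stream = ["1+3", "1+3", "5R1", "1+3", "5R1", "4R2", "1+3", "5R1", "4R2"]

def ngram_counts(stream, n):
    """Count how often each consecutive run of n coda types occurs."""
    return Counter(tuple(stream[i:i + n]) for i in range(len(stream) - n + 1))

for n in (2, 3):
    print(f"Most common {n}-coda sequences:", ngram_counts(coda_stream, n).most_common(3))
```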

After that comes the fun part: trying to talk with whales. The team plans to test its analyses by broadcasting vocalizations back to the whales and monitoring the results to see whether, for example, the whales respond predictably.

“The question comes up: What are you going to say to them? That kind of misses the point,” Gero says. “It assumes they have a language to talk about us and boats or the weather or whatever we might want to ask them about.”

Rather, the goal is simply to understand. “It’s not about us talking to them,” Gruber says. “This is about listening to the whales in their own setting, on their own terms. It’s the idea that we want to know what they’re saying — that we care.”