Analyzing their sonic repertoire produces remarkable results.

In 2011, Zooniverse, a platform for conducting scientific research with the help of volunteers, partnered with Scientific American to launch the Whale FM project. The initiative collected more than 15,000 samples of pilot whale and orca calls recorded off the coasts of the Bahamas, Iceland, and Norway to see whether computer analysis could decipher anything about these populations and the ways they use their sonic repertoire.
“We wanted to analyze and profile whale communication,” explained Whale FM’s lead artificial intelligence scientist, Lior Shamir, in an interview with Interesting Engineering.
“We started with supervised machine learning, just to see if the computer can identify differences between the audio, and then we started to use unsupervised learning. I asked the computer, ‘What can you tell us about the relationships between the sets of audio samples that we have?’”
The AI had no human guidance while analyzing this audio; it only knew that different whale types existed in the dataset. After running the program across 300 computer processors for a total of seven weeks, the AI produced a map that grouped the pilot whales and orcas separately.
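Whale FM has not published its pipeline as code, but the general approach Shamir describes, turning each recording into a set of acoustic features and letting an unsupervised algorithm group the samples on its own, can be sketched in a few lines. The snippet below is an illustrative assumption rather than the project’s actual method: the file names are hypothetical, and it uses MFCC features with hierarchical clustering from scikit-learn to group calls without ever telling the algorithm which animal produced them.

```python
# Illustrative sketch only -- NOT the Whale FM pipeline.
# Assumes a handful of WAV recordings of whale calls (names are hypothetical).
import numpy as np
import librosa
from sklearn.cluster import AgglomerativeClustering
from sklearn.preprocessing import StandardScaler

call_files = ["orca_iceland_01.wav", "orca_norway_01.wav",
              "pilot_bahamas_01.wav"]  # hypothetical file names

def call_features(path, n_mfcc=20):
    """Summarize one recording as the mean and spread of its MFCCs."""
    audio, sr = librosa.load(path, sr=None)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=n_mfcc)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

X = StandardScaler().fit_transform([call_features(f) for f in call_files])

# Unsupervised step: the algorithm is never told "orca" or "pilot whale";
# it only sees acoustic similarity between recordings.
labels = AgglomerativeClustering(n_clusters=2).fit_predict(X)
for path, label in zip(call_files, labels):
    print(label, path)
```

With real data the feature set and sample count would be far larger, but the principle mirrors Shamir’s description: the groupings fall out of the audio itself, with no labels supplied.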
“That wasn’t very surprising,” Shamir said. “They’re different species, we expect they’d speak differently. But what was interesting, inside those groups, it identified and clustered those pods of whales by where they live on the globe.”
Shamir and the Whale FM team were surprised. Their work had uncovered evidence of different communication styles within each species. Norwegian orcas, for example, spoke a dialect distinct from that of their Icelandic relatives. The same was true for pilot whales in the Bahamas and in Norway.
“[The results] showed that the same species might have a different dialect based on where they live, just like people have different accents. They are actually communicating, they are actually speaking with each other, and eventually we’ll figure out what they’re saying.”
“Eventually” could come sooner rather than later. In April of this year, a team of experts in robotics, linguistics, and machine learning established Project CETI (Cetacean Translation Initiative), a scientific endeavor with the goal of applying new advances in machine learning to better understand sperm whale language.
Nestor, one of the project's founding members, is convinced sperm whales are a unique species, an animal whose behavior, he writes, “more closely resembles our culture and intellect than any other creature’s on the planet.”
Sperm whales “talk” to one another through intensely loud series of clicks called codas. These clicks can reach upwards of 230 decibels, louder than a rocket launch, making sperm whales the loudest animals on the planet and allowing them to communicate with one another over distances of hundreds of miles.
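Researchers typically characterize a coda by the rhythm of its clicks, that is, the gaps between them. The sketch below is not Project CETI’s tooling; it is a minimal illustration, assuming a single-coda recording in a hypothetical file named `coda.wav`, that uses simple peak detection to recover the click times and the inter-click intervals that define a coda’s pattern.

```python
# Illustrative sketch, not Project CETI code: recover the click pattern
# (inter-click intervals) that characterizes a sperm whale coda.
import numpy as np
import librosa
from scipy.signal import find_peaks

audio, sr = librosa.load("coda.wav", sr=None)  # hypothetical recording

envelope = np.abs(audio)
# Treat sharp, well-separated amplitude peaks as clicks.
peaks, _ = find_peaks(envelope,
                      height=0.5 * envelope.max(),  # loud events only
                      distance=int(0.02 * sr))      # at least 20 ms apart

click_times = peaks / sr                      # seconds
inter_click_intervals = np.diff(click_times)  # the coda's "rhythm"
print("clicks at:", np.round(click_times, 3))
print("intervals:", np.round(inter_click_intervals, 3))
```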
We'll keep you posted!