The team analyzed recordings of 8,719 codas from about 60 whales, collected by the Dominica Sperm Whale Project between 2005 and 2018, using a mix of algorithms to recognize and classify patterns. They found that the whales’ communication was neither random nor simplistic but structured according to the context of their exchanges. This allowed them to recognize distinctions in the calls that had not previously been picked up.
Instead of relying on more complex machine learning techniques, the researchers chose to use classical analysis to approach an existing database with fresh eyes.
“We wanted to go with a simpler model that would already give us the basis for our case,” says Sharma.
“The nice thing about a statistics approach is that you don’t have to train a model and it’s not a black box, and [the analyses are] easier to execute,” says Felix Effenberger, senior advisor for artificial intelligence research at the Earth Species Project, a nonprofit researching how to decode non-human communication using AI. However, he points out that machine learning is a great way to speed up the process of discovering patterns in a dataset, so adopting such a method could be useful in the future.
Algorithms converted the clicks in the coda data into a new kind of data visualization the researchers call an exchange plot, revealing that some codas had extra clicks. These extra clicks, combined with variations in the duration of the calls, appeared in interactions between many whales, which the researchers say suggests that codas can convey more information and have a more complex internal structure than previously thought.
“One way to think about what we found is that people in the past analyzed the sperm whale’s communication system as if it were like Egyptian hieroglyphs, but it’s actually like letters,” says Jacob Andreas, an associate professor at CSAIL who participated in the project.
While the team isn’t sure whether what they uncovered can be interpreted as the equivalent of the letters, vowels, or sentences that make up human language, they are confident that the codas they analyzed showed a great deal of internal similarity, he says.
“This in turn allowed us to recognize that there were more kinds of codas, or more kinds of distinctions between codas, that whales are clearly capable of perceiving [and] that people just didn’t understand at all in that data.”
The team’s next step is to build linguistic models of whale calls and examine how those calls relate to different behaviors. They also plan to work on a more general system that could be used across species, Sharma says. Taking a communication system we know nothing about, working out how it encodes and transmits information, and slowly beginning to understand what is being transmitted could serve many purposes beyond whales. “I think we’re just starting to understand some of these things,” she says. “We are very much at the beginning, but we are slowly making progress.”
Gaining an understanding of what animals say to each other is the main motivation behind projects like these. But if we ever hope to understand what whales are communicating, there’s a big hurdle in the way: the need for experiments showing that such an effort can actually work, says Caroline Casey, a researcher at UC Santa Cruz who has studied the vocal communication of elephant seals for over a decade.
“There has been a renewed interest since the advent of artificial intelligence in decoding animal signals,” says Casey. “It is very difficult to prove that a label actually means to animals what people think it means. This paper has very well described the subtleties of their acoustic structure, but it is very difficult to take that extra step and get to the meaning of a signal.”