May 20, 2024

Talking whales? AI reveals a complex language hidden in sperm whale clicks

A mother sperm whale and her calf off the coast of Mauritius. The calf has remoras (suckerfish) attached to its body. Credit: Wikimedia Commons.

Sperm whales (Physeter macrocephalus) are renowned for their complex social structures and behaviors, facilitated by their unique vocalizations. These whales produce codas — sequences of rapid, Morse code-like clicks used during social interactions. These aren’t random jumbles of clicks but rather purposeful vocalizations that carry meaning, almost like the words you or I utter. For instance, sperm whales use codas to identify themselves and to distinguish their own clan from outsiders. And this knowledge is passed down culturally: young calves are not born able to produce codas. They learn by mimicking their parents.

Previously, researchers thought that sperm whale codas were an intriguing but rather simple set of messages. However, researchers recently used AI to detect and decode a vast dataset of whale clicks, somewhat like using Google Translate to make sense of a phrase in a foreign language. This research indicates that codas are far more than just simple, repetitive signals.

They are, in fact, a sophisticated language composed of nearly an order of magnitude more distinguishable patterns than previously recognized.

The clickbait you actually want to hear

The AI algorithms employed by the researchers at MIT processed and categorized over 8,700 codas, identifying subtle differences in rhythm and structure that define the diverse “dialects” among whale groups. In this instance, the use of AI was crucial as it was applied to its best use case: pattern recognition. The machine was able to dissect the nuanced variations in coda sequences that would be virtually indiscernible to human analysts.
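To give a feel for what "pattern recognition" means here, below is a purely illustrative sketch — not the MIT team's actual pipeline — of grouping codas by their inter-click-interval patterns with a minimal k-means clustering. The interval vectors are made up for the example.

```python
# Toy illustration of unsupervised pattern recognition on coda rhythms.
# This is NOT the study's actual method; the interval data is invented.
import random

def kmeans(points, k, iters=50, seed=0):
    """Cluster equal-length inter-click-interval vectors into k rhythm groups."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # pick k initial centroids from the data
    for _ in range(iters):
        # Assign each coda to its nearest centroid (squared Euclidean distance).
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
            groups[nearest].append(p)
        # Recompute each centroid as the mean of its group (keep old if empty).
        centroids = [
            [sum(col) / len(g) for col in zip(*g)] if g else centroids[i]
            for i, g in enumerate(groups)
        ]
    return centroids, groups

# Two made-up rhythm types: evenly spaced clicks vs. a long final gap.
codas = [[0.15, 0.15, 0.15], [0.16, 0.14, 0.15],
         [0.10, 0.10, 0.30], [0.11, 0.09, 0.31]]
centroids, groups = kmeans(codas, k=2)
```

At this scale the split is obvious to the eye; the point of the real analysis is that the same idea, applied across thousands of codas and subtler features, surfaces "dialects" no human analyst could pick out by hand.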

The study’s findings are stunning: sperm whale codas don’t only vary by group but also change contextually within conversations. Two key features — rubato (temporal variations) and ornamentation (additional clicks) — combine with rhythm and tempo to form a rich vocal repertoire. This combinatorial system allows whales to express a wide range of information and emotions, from social cues to environmental interactions, the authors reported in the journal Nature Communications.
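To make that feature vocabulary concrete, here is a toy sketch — again, not the study's actual analysis — of how a coda's rhythm, tempo, and ornamentation might be read off its click timestamps. The click times and the four-click base pattern are hypothetical.

```python
# Toy sketch (not the authors' pipeline): describing a coda by the features
# the study highlights -- rhythm (normalized inter-click intervals), tempo
# (overall duration), and ornamentation (an extra click beyond a base pattern).

def coda_features(click_times, base_length=4):
    """Extract simple rhythm/tempo/ornamentation features from click timestamps."""
    intervals = [b - a for a, b in zip(click_times, click_times[1:])]
    duration = click_times[-1] - click_times[0]   # tempo proxy: total span
    rhythm = [i / duration for i in intervals]    # intervals normalized by span
    ornamented = len(click_times) > base_length   # extra click(s) appended?
    return {"rhythm": rhythm, "tempo": duration, "ornamented": ornamented}

# A regular 4-click coda vs. the same pattern with one ornamental click.
plain = coda_features([0.00, 0.15, 0.30, 0.45])
fancy = coda_features([0.00, 0.15, 0.30, 0.45, 0.62])
```

Rubato, in this framing, would show up as two codas sharing the same normalized rhythm while their tempo stretches or contracts — which is why normalizing the intervals by duration matters.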

Deep sea chatter

However, while the researchers were able to categorize these clicks, they have yet to uncover the semantics — the meanings associated with specific vocal patterns. The authors focused on identifying the complexity and variability in the vocal patterns, rather than interpreting what each pattern specifically communicates.

The bottom line is that sperm whale vocalizations aren’t just pretty noises in the deep blue ocean. They serve a clear purpose in complex communication systems, perhaps not all that dissimilar to human language. All of this may explain the findings of biologists at Dalhousie University in Nova Scotia, who analyzed newly digitized logbooks kept by whalers during their hunting voyages in the North Pacific.


Sperm whales were the main target of the commercial whaling industry from 1800 to 1987, as immortalized in Herman Melville’s legendary novel Moby-Dick. The Dalhousie researchers found that the strike rate of the whalers’ harpoons fell by 58% in just a few years — and they think this was because the whales were sharing information with one another.

Until not too long ago, humans were believed to be the only species in the world capable of employing symbolic language to communicate. While there is still no definitive proof of symbolic language in any other species, the conversation has recently pivoted from a human-centric approach to one deeply rooted in understanding animals on their own terms.

How AI is revolutionizing our understanding of animal communication

Scientists are beginning to learn that many species employ complex communication. For instance, a 2016 study applied sophisticated deep learning AI to more than 15,000 recordings of Egyptian fruit bats, correlating the bats’ vocalizations with specific behaviors. The bats not only argue over resources but also differentiate between genders in their communications, use unique “signature calls” akin to individual names, and engage in vocal learning.

Interestingly, mother bats modulate their vocal pitch downward when addressing their offspring — a direct contrast to the higher pitch “motherese” typical of human mothers. This pitch adjustment in bats prompts a babble response from the young, helping them learn specific vocal signals. What’s even more amazing is that most of these vocalizations are in ultrasound, far beyond our hearing range. No scientist can hear and decode bat ‘speech’, but our computers now can.

Elsewhere, at the Free University of Berlin, researchers used AI that combines computer vision with natural language processing to decode the intricate body movements and sound patterns of honeybees. The bees use all sorts of specific signals, including instructions to stop or keep quiet. The researchers then used this information to develop RoboBee, a tiny robot placed inside a beehive. RoboBee is essentially a pretender bee, programmed to employ bee ‘language’.

A robotic “bee” performs a waggle dance. Credit: Freie Universität Berlin.

This robot has successfully directed bees to specific tasks by mimicking their communication signals, including the well-known waggle dance that indicates the direction of food sources. Although initial results have been mixed, the potential to guide bee behavior for conservation purposes, like directing them toward safe nectar sources, is a promising avenue of research.

A new way to find communication in nature

The insights gained from such studies are as revolutionary as the microscope’s unveiling of the microbial world centuries ago, says Karen Bakker, a professor at the University of British Columbia and a fellow at the Harvard Radcliffe Institute for Advanced Study.

“When Dutch scientist Antonie van Leeuwenhoek started looking through his microscopes, he discovered the microbial world, and that laid the foundation for countless future breakthroughs. So the microscope enabled humans to see anew with both our eyes and our imaginations. The analogy here is that digital bioacoustics, combined with artificial intelligence, is like a planetary-scale hearing aid that enables us to listen anew with both our prosthetically enhanced ears and our imagination.”

“This is slowly opening our minds not only to the wonderful sounds that nonhumans make but to a fundamental set of questions about the so-called divide between humans and nonhumans, our relationship to other species. It’s also opening up new ways to think about conservation and our relationship to the planet. It’s pretty profound.”
