
Is Interspecies Communication Possible? AI, AI, Captain!

  • Scientists have been using AI to codify animal vocalisations and facial expressions into a language humans can understand.

Human communication is sequential: words occur in a particular order. Animal communication works differently. “The reason, as an animal, you communicate information to another animal, is if there’s some benefit to you doing so,” said James Savage, a behavioural ecologist.

For instance, in one past experiment, scientists approached prairie dogs while wearing different coloured clothes at different times of the day, and the study profiled the rodents’ distinct alarm calls for each of the scientists. Slobodchikoff, an expert in prairie dog communication, emphasised the importance of understanding that animals perceive reality differently from humans. For example, unlike humans, bees and some birds see in the ultraviolet range of the visual spectrum, while bats, dolphins, dogs and cats hear sounds in the ultrasonic range.


AI applications

Machine learning is well suited to studying animal language because the loose correspondence between human and animal worlds and concepts does not matter to an AI; animal ideas may be expressed as gestures, sequences of movements, or changes in skin texture, said Britt Selvitelle, a founding member of the Earth Species Project, an organisation developing AI approaches to understanding animal communication. The neural network considers whichever aspects of an animal’s behavioural repertoire can be represented in a form analogous to human language; as long as such a representation is available for the algorithm to spot, the nature of the input data does not matter.
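
To make the idea concrete, here is a minimal Python sketch of a modality-agnostic encoder: the same pooling-and-projection step turns both audio frames and body-pose sequences into vectors in one shared space. All of the arrays and dimensions are invented for illustration; this is not Earth Species Project code.

import numpy as np

rng = np.random.default_rng(0)

def encode(sequence, projection):
    # Map any (time, features) sequence to one fixed-length embedding by
    # projecting each frame and mean-pooling over time.
    return (sequence @ projection).mean(axis=0)

# Two very different modalities, both reduced to (time, features) arrays.
audio_frames  = rng.normal(size=(200, 64))   # e.g. 200 spectrogram frames
gesture_poses = rng.normal(size=(50, 12))    # e.g. 50 body-keypoint snapshots

# The encoder does not care which modality it sees; only the feature
# dimension of its projection matrix has to match the input.
audio_vec   = encode(audio_frames,  rng.normal(size=(64, 32)))
gesture_vec = encode(gesture_poses, rng.normal(size=(12, 32)))

print(audio_vec.shape, gesture_vec.shape)    # both (32,): a shared space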

In 2016, researchers used AI to decode differences between the calls of Egyptian fruit bats squabbling over food and those fighting over resting spots. In 2019, this research was advanced by transforming the sounds into sonograms and running the images through artificial neural networks; the scientists were able to infer links between the sounds and behaviours such as fleeing danger or trying to attract a mate. Researchers dubbed the algorithm “DeepSqueak.”
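
A rough sketch of that kind of pipeline, assuming a generic sonogram-plus-classifier setup rather than DeepSqueak’s actual implementation (the calls and behaviour labels below are synthetic stand-ins):

import numpy as np
from scipy.signal import spectrogram
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
SR = 22_050  # sample rate in Hz

def to_features(waveform):
    # Turn a call into a flattened log-sonogram (time-frequency image).
    _, _, sxx = spectrogram(waveform, fs=SR, nperseg=256)
    return np.log1p(sxx).ravel()

# Stand-in data: 40 one-second "calls" with made-up behaviour labels
# (0 = squabbling over food, 1 = fighting over a resting spot).
calls  = [rng.normal(size=SR) for _ in range(40)]
labels = rng.integers(0, 2, size=40)

X = np.array([to_features(c) for c in calls])
clf = LogisticRegression(max_iter=1000).fit(X, labels)
print(clf.predict(X[:5]))  # behaviour predicted from the sonogram alone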

Study on dolphins

Dolphins show high cognitive abilities and are thus a frequent subject of animal language studies. According to Savage, dolphins do more than send alerts; they appear to call each other by name. They also employ body language and a variety of sounds, including whistles and the clicks they use for sonar echolocation.


Jussi Karlgren, a computational linguist, has planned a detailed experiment that would feed dolphin calls into an AI system to decipher them, looking beyond just “words” to the tone, timing, context and facial expressions that make up the animals’ communication. Next, he hopes to collect dolphin whistles and click trains and segment them with AI.
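
As an illustration of what such segmentation could start from, the sketch below splits a recording into loud stretches using a simple energy threshold; a real system would go much further, and the toy recording and threshold here are made up:

import numpy as np

def segment(waveform, sr, frame_ms=20, threshold=0.1):
    # Energy-based first pass: return (start, end) times in seconds for
    # stretches of the recording that rise above a loudness threshold.
    frame = int(sr * frame_ms / 1000)
    n = len(waveform) // frame
    energy = np.array([np.abs(waveform[i * frame:(i + 1) * frame]).mean()
                       for i in range(n)])
    active, segments, start = energy > threshold, [], None
    for i, a in enumerate(active):
        if a and start is None:
            start = i
        elif not a and start is not None:
            segments.append((start * frame / sr, i * frame / sr))
            start = None
    if start is not None:
        segments.append((start * frame / sr, n * frame / sr))
    return segments

# Toy recording: a second of silence, a second of "whistle", more silence.
sr = 8000
t = np.linspace(0, 1, sr, endpoint=False)
whistle = 0.5 * np.sin(2 * np.pi * 1000 * t)
recording = np.concatenate([np.zeros(sr), whistle, np.zeros(sr)])
print(segment(recording, sr))  # roughly [(1.0, 2.0)]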

The Wild Dolphin Project uses similar algorithms, coupled with underwater keyboards and computers, to decode dolphin communication in the Atlantic. Its founder, Denise Herzing, used an ML system, Cetacean Hearing and Telemetry (CHAT), to recognise signals in dolphin whistles. The algorithm managed to pick out a sound the researchers had trained the dolphins to associate with sargassum seaweed. The researchers believe the dolphins may have assimilated the new “word” and begun using it in the wild.
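
The sketch below is not the CHAT algorithm itself, only a simple illustration of recognising a known signal by template matching: slide a trained whistle over a recording and look for the strongest correlation. The synthesised “sargassum whistle” is a stand-in, not Wild Dolphin Project data.

import numpy as np

SR = 8000

def whistle(freq, seconds=0.5):
    # Synthesise a pure-tone stand-in for a trained whistle.
    t = np.linspace(0, seconds, int(SR * seconds), endpoint=False)
    return np.sin(2 * np.pi * freq * t)

def detect(recording, template):
    # Slide the template over the recording (cross-correlation) and report
    # where the match is strongest.
    corr = np.correlate(recording, template, mode="valid")
    best = int(np.argmax(corr))
    return best / SR, float(corr[best])

rng = np.random.default_rng(2)
sargassum_whistle = whistle(900)  # the trained "word" for sargassum seaweed
recording = np.concatenate([rng.normal(scale=0.1, size=SR),
                            sargassum_whistle,
                            rng.normal(scale=0.1, size=SR)])

when, score = detect(recording, sargassum_whistle)
print(f"best match at {when:.2f} s (score {score:.0f})")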

Marcelo Magnasco, a dolphin researcher at Rockefeller University, highlighted the need for an extensive body of data to determine the dolphins’ signal variation and intent.

He and his collaborator, Diana Reiss, installed an underwater touchscreen with interactive apps to communicate with dolphins. The duo plan to display visual cues to gauge the dolphins’ preferences among two or three alternatives.

At a workshop of the Interspecies Internet project, Roger Payne, a whale-song expert, expanded on what such a communication system could ask, with questions like “Do dolphins fear boats? Are sharks scary? Is your mother afraid of sharks?” According to him, we might even find out whether dolphins regularly lie to each other as humans do.


Current research

Shane Gero, a Canadian biologist, has been tracking sperm whales off a Caribbean island, using underwater recorders to capture codas (the rhythmic series of clicks the whales exchange) from thousands of whales. After hearing a 40-minute back-and-forth of clicks between two whales, Gero said the key to unlocking whale communication would be knowing who the animals are and what they are doing as they make their sounds.
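
The rhythmic structure Gero listens for can be illustrated in a few lines: detect the clicks in a recording and measure the inter-click intervals that define a coda. The sketch below uses a synthetic click train, not real whale data.

import numpy as np
from scipy.signal import find_peaks

def coda_rhythm(waveform, sr):
    # Characterise a coda by its inter-click intervals (ICIs): find click
    # peaks in the envelope and return the gaps between them in seconds.
    peaks, _ = find_peaks(np.abs(waveform), height=0.5, distance=int(0.05 * sr))
    return np.diff(peaks) / sr

# Toy coda: five clicks whose spacing tightens towards the end.
sr = 16_000
signal = np.zeros(2 * sr)
for t in (0.10, 0.40, 0.65, 0.85, 1.00):     # click times in seconds
    signal[int(t * sr)] = 1.0

print(np.round(coda_rhythm(signal, sr), 2))  # -> [0.3  0.25 0.2  0.15]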

[Audio: sperm whale clicks]

In April 2021, a team of scientists embarked on a five-year expedition to build on Gero’s work. Project CETI (Cetacean Translation Initiative) is likely the most significant interspecies communication effort in history. The team includes experts in linguistics, robotics, machine learning and camera engineering, and will lean heavily on advances in NLP, capturing millions of whale codas and analysing them. The hope is to expose the underlying architecture of whale chatter: the foundations of their communication, including grammar, syntax and sentences. The experts will also track how whales behave when making or hearing clicks.
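
What analysing millions of codas might start with can be sketched in a few lines: treat each coda type as a token and count which coda tends to follow which, the crudest possible bigram probe for sequential structure. The exchange below is invented for illustration, not CETI data.

from collections import Counter

# Toy exchange between two whales: each symbol stands for a coda type
# ("1+1+3", "5R1", ...); the sequence is invented for illustration.
exchange = ["1+1+3", "5R1", "1+1+3", "5R1", "5R1", "1+1+3", "5R1", "1+1+3"]

# Count which coda tends to follow which: a bigram model, the simplest
# probe for sequential structure, long before anything like grammar.
bigrams = Counter(zip(exchange, exchange[1:]))
for (prev, nxt), count in bigrams.most_common():
    print(f"{prev} -> {nxt}: {count}")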

The most important thing about using AI to decipher animal language is the possibility of interpreting animal cognition on terms that are meaningful to the animals rather than to humans.
