What if you could talk to your pet dog or cat? Sounds too preposterous to be true?
Recent advancements in artificial intelligence and machine learning suggest that this could indeed be possible, albeit in a restricted manner. In the past, AI has helped us in decoding ancient languages, and now scientists are experimenting with the technology in order to interpret animal vocalisations and facial expressions into a language which can be understood by humans.
Sceptics abound, however. While some question the very depth and sophistication of animal language, others wonder why animal sounds are synthesised into English alone. The first recorded tangible output of such research was a Japanese novelty item, which was awarded the satirical Ig Nobel Prize in 2002.
Lack of conviction notwithstanding, many from the academic and scientific community — buoyed by the emergence of AI — believe that interaction between animals and humans could be a reality in the next decade. And while much of the progress has limited itself to pages in research journals, we have begun to see examples of it being parlayed into actual devices.
One such example is Inupathy, a Japanese company that unveiled its flagship product at the Consumer Electronics Show (CES) 2020. The product, consisting of a harness, a heart rate monitor, and an app, is essentially a vest that helps owners understand what their dogs are feeling.
This is based on a simple premise — when dogs get excited, their heart rates go up. The company claims to estimate a dog’s current emotional state by analysing and interpreting this data, which it displays via an LED panel on top of the harness: red indicates stress, while green conveys that the dog is relaxed.
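In essence, the harness maps heart-rate readings onto a small set of states and colours. The sketch below illustrates that idea only; the thresholds, state names, and the use of a resting-rate baseline are all invented for illustration, since Inupathy's actual analysis is proprietary.

```python
def classify_state(resting_bpm, current_bpm):
    """Map a dog's heart rate to a rough emotional state and LED colour.

    The thresholds below are hypothetical; a real device would presumably
    use calibrated, breed-specific baselines.
    """
    ratio = current_bpm / resting_bpm
    if ratio >= 1.4:          # well above baseline: likely stressed
        return ("stressed", "red")
    if ratio <= 1.1:          # at or near baseline: relaxed
        return ("relaxed", "green")
    return ("alert", "yellow")  # somewhere in between

print(classify_state(80, 120))  # elevated heart rate
print(classify_state(80, 84))   # near resting rate
```

The design choice here is to compare against a per-dog resting rate rather than an absolute figure, since baseline heart rates vary widely between breeds and sizes.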
AI Translation Technology
While there could be several spin-offs in this space, one of the foundational technologies being explored is natural language processing (NLP). Today, NLP systems can understand only human language, but according to some researchers, their scope could be expanded by building a deep learning model and training it on a database of animal vocalisations. Some scientists also believe that the AI systems underpinning Google Translate could take us one step closer to communicating with animals.
To understand this a little better, it helps to consider the AI machinery that corrects our spelling and suggests sentence endings each time we use tools such as Gmail. This technology improves over time as it gets better at understanding our language. Could such a capability extend to human-animal interaction as well?
Decoding Animal Language — Why It Is Worth Our Time
By building empathy for animals, AI could promote their conservation and improve their well-being. Some researchers are working with Silicon Valley-based Conservation Metrics to apply new AI techniques to protect the elephant population in Africa. First, the team collated 900,000 hours of recordings of elephant vocalisations. They then used deep learning to analyse features in the sound data and were able to identify sounds that indicate danger, such as the presence of poachers in the vicinity.
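At a very high level, the pipeline described above — extract features from recorded audio, then flag sounds resembling known alarm calls — can be sketched as a nearest-template match. Everything below (the feature values, the templates, the similarity threshold) is a made-up illustration of that idea, not the Conservation Metrics system, which in practice relies on a trained deep learning model rather than fixed templates.

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical averaged feature vectors for two call types; in reality
# these would be learned from the 900,000 hours of labelled recordings.
ALARM_TEMPLATE = [0.9, 0.1, 0.8, 0.2]
CONTACT_TEMPLATE = [0.2, 0.9, 0.1, 0.7]

def is_alarm_call(features, threshold=0.85):
    """Flag a vocalisation whose features resemble the alarm template."""
    return cosine(features, ALARM_TEMPLATE) >= threshold

print(is_alarm_call([0.85, 0.15, 0.75, 0.25]))  # resembles the alarm template
print(is_alarm_call([0.25, 0.85, 0.15, 0.65]))  # resembles the contact template
```

A deployed detector would replace the hand-written templates with model embeddings, but the thresholding step — "does this sound look enough like danger to raise an alert?" — is the same.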
Another important application for animal language translation lies in animal husbandry. Farmers could leverage this technology to identify sick animals and get them timely medical intervention. One example is AI that learned to spot pain in sheep. Veterinarians had developed a protocol for estimating a sheep's pain from its facial expressions, but applying it remained time-consuming. A team of scientists at the University of Cambridge in the UK, however, automated this task.
They first drew up a Sheep Pain Facial Expression Scale by listing several ‘facial action units’ (AUs) associated with different levels of pain. Using a sample set of 480 photos, they manually labelled these AUs – nostril deformation, ear rotations, shape and form of the eyes, and so on. They then trained a machine-learning algorithm on 90% of this data and tested it on the remaining 10%. The mean accuracy at identifying the AUs was 67%, on par with the average human. If refined further, with more labels, the approach could also work with other farm animals.
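The workflow the Cambridge team followed — label AU features per photo, train on 90% of the samples, test on the held-out 10% — can be mimicked in miniature. The sketch below substitutes synthetic AU feature vectors and a simple nearest-centroid classifier for the team's actual model, which is not public; it only illustrates the train/test split, not their method or their 67% result.

```python
import random

random.seed(0)

def make_sample(pain):
    """Synthetic 3-feature AU vector (nostril, ear, eye shape).

    Entirely made-up data: 'pain' faces cluster around higher values.
    """
    base = 0.8 if pain else 0.2
    return [base + random.uniform(-0.15, 0.15) for _ in range(3)], pain

# 480 photos, mirroring the study's sample size
data = [make_sample(i % 2 == 0) for i in range(480)]
random.shuffle(data)
split = int(len(data) * 0.9)          # 90% train / 10% test, as in the study
train, test = data[:split], data[split:]

def centroid(samples):
    vecs = [v for v, _ in samples]
    return [sum(col) / len(vecs) for col in zip(*vecs)]

pain_centre = centroid([s for s in train if s[1]])
ok_centre = centroid([s for s in train if not s[1]])

def predict(vec):
    dist = lambda c: sum((x - y) ** 2 for x, y in zip(vec, c))
    return dist(pain_centre) < dist(ok_centre)

accuracy = sum(predict(v) == y for v, y in test) / len(test)
print(f"test accuracy: {accuracy:.0%}")
```

Because the synthetic classes here are cleanly separated, accuracy comes out near-perfect; on real photos, with noisy and overlapping AU labels, the study's 67% figure is a far more realistic ceiling.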
The man popularly credited with leading the charge in decoding animal sounds is Dr Constantine Slobodchikoff. Using prairie dogs as his model species, he spent more than 30 years studying their sophisticated system of communication. This led him to start a company in 2018 called Zoolingua, whose objective is to improve the relationship between owners and their pets. His team is currently developing an app that would translate a dog’s body language and sounds into English.
Gavagai AB, a company based out of Sweden, is working on a program that combines AI analysis software with CHAT. CHAT – or Cetacean Hearing and Telemetry – is known as one of the first truly effective animal translators, built to decipher dolphins’ whistles and their meanings. The program has already mastered 40 human languages, and the same technology is now being used to translate the languages of other animals, including rhesus macaques and white-cheeked gibbons.