With AI Translation Technology, You May Soon Be Able To Talk To Your Pets

What if you could talk to your pet dog or cat? Sounds too preposterous to be true?

Recent advancements in artificial intelligence and machine learning suggest that this could indeed be possible, albeit in a restricted manner. AI has already helped us decode ancient languages, and now scientists are experimenting with the technology to translate animal vocalisations and facial expressions into a language humans can understand. 

Sceptics abound, however. While some question the very depth and sophistication of animal language, others wonder why animal sounds are synthesised into English alone. The first recorded tangible output of such research was a Japanese novelty item, which was awarded the satirical Ig Nobel Prize in 2002. 


Lack of conviction notwithstanding, many in the academic and scientific community, buoyed by the emergence of AI, believe that interaction between animals and humans could become a reality in the next decade. And while much of the progress has been confined to the pages of research journals, we have begun to see it translated into actual devices.

One such example comes from Inupathy, a Japanese company that unveiled its flagship product at the Consumer Electronics Show (CES) 2020. The product, consisting of a harness, a heart rate monitor, and an app, is essentially a vest that helps people understand what their dogs are thinking about.


This is based on a simple premise: when dogs get excited, their heart rates go up. The company claims to estimate a dog's current emotional state by analysing and interpreting this data, which it displays on an LED screen on top of the harness. Red indicates stress, while green conveys that the dog is relaxed. 
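The premise above can be sketched in a few lines of code. This is a purely illustrative mock-up, not Inupathy's actual algorithm: the resting rate and the threshold are made-up values chosen for the example.

```python
# Hypothetical sketch of the harness logic: map a heart-rate reading to an
# LED colour by comparing it with a resting baseline. The 1.3x threshold
# and the 90 bpm default are illustrative assumptions, not Inupathy's model.

def mood_colour(heart_rate_bpm: float, resting_bpm: float = 90.0) -> str:
    """Return an LED colour for a rough estimate of the dog's state."""
    elevation = heart_rate_bpm / resting_bpm
    if elevation >= 1.3:   # well above the resting rate: stressed or excited
        return "red"
    return "green"         # close to the resting rate: relaxed

print(mood_colour(130))    # elevated heart rate -> "red"
print(mood_colour(95))     # near baseline -> "green"
```

A real device would of course smooth the signal over time and calibrate the baseline per dog rather than rely on a single fixed threshold.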

AI Translation Technology

While there could be several spin-offs in the larger expanse of innovation in this space, one of the fundamental technologies currently being explored here is natural language processing (NLP). As of today, NLP can enable machines to understand only human language, but according to some researchers, its scope could easily be expanded by building a deep learning model and training it on a database of animal language. Some scientists also believe that the AI systems underpinning Google Translate could take us one step closer to enabling communication with animals. 

To understand this a little better, it helps to know the AI machinery that corrects our spelling and suggests sentence endings each time we use tools such as Gmail. This technology improves over time by getting better at understanding our language. Could such a capability extend to human-animal interaction as well?
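The autocomplete idea can be illustrated with a toy model: count which word tends to follow which in a corpus, then suggest the most frequent continuation. Real systems such as Gmail's suggestions use neural language models; this frequency-count version only sketches the principle, and the tiny corpus here is invented for the example.

```python
# Toy next-word suggestion: count word bigrams in a corpus and propose the
# most frequent continuation of a given word. A sketch of the principle
# behind autocomplete, not how production systems actually work.

from collections import Counter, defaultdict

def build_model(corpus: str) -> dict:
    """Map each word to a Counter of the words that follow it."""
    words = corpus.lower().split()
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def suggest(model: dict, word: str):
    """Return the most frequent continuation, or None if unseen."""
    counts = model.get(word.lower())
    if not counts:
        return None
    return counts.most_common(1)[0][0]

model = build_model("the dog barks the dog sleeps the cat sleeps the dog barks")
print(suggest(model, "the"))    # "dog" follows "the" most often
```

Swapping the word tokens for labelled animal vocalisation segments is, in essence, what researchers hope a much larger learned model could do.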

Decoding Animal Language — Why It Is Worth Our Time

By building empathy for animals, AI could promote their conservation and improve their well-being. Some researchers are working with Silicon Valley-based Conservation Metrics to apply new AI techniques to protecting the elephant population in Africa. The team first collated 900,000 hours of recordings of elephant vocalisations. It then used deep learning to analyse features in the sound data, identifying the calls that indicate danger, such as poachers in the vicinity.

Another important application of animal language translation lies in animal husbandry. Farmers could leverage the technology to identify sick animals and get them timely medical intervention. Consider the example of AI learning to spot pain in sheep. A team of veterinarians had developed a protocol for estimating pain from sheep's facial expressions, but applying it remained a time-consuming activity. A team of scientists at the University of Cambridge in the UK, however, automated this task. 

They first drew up a Sheep Pain Facial Expression Scale by listing several 'facial action units' (AUs) associated with different levels of pain. Using a sample set of 480 photos, they manually labelled these AUs: nostril deformation, ear rotation, the shape and form of the eyes, and so on. They then trained a machine-learning algorithm on 90% of this data and tested it on the remaining 10%. The mean accuracy at identifying the AUs was 67%, on par with the average human. With further refinement and more labels, the approach could also work with other farm animals.
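The shape of that pipeline, labelled feature vectors, a 90/10 train/test split, and an accuracy score, can be sketched as follows. Everything here is synthetic: the AU values are randomly generated, and the toy nearest-centroid rule stands in for the far more capable image-based model the Cambridge team actually trained.

```python
# Hypothetical sketch of the final step of the sheep-pain pipeline: given
# facial action-unit (AU) feature vectors with pain labels, train on 90%
# of the data and measure accuracy on the held-out 10%. Synthetic data and
# a toy nearest-centroid classifier; not the paper's actual system.

import random

def train_centroids(features, labels):
    """Average the feature vectors for each label."""
    sums, counts = {}, {}
    for x, y in zip(features, labels):
        s = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            s[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def predict(centroids, x):
    """Assign the label whose centroid is closest (squared Euclidean)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    return min(centroids, key=lambda y: dist(centroids[y]))

# Synthetic AU vectors: [nostril deformation, ear rotation, eye narrowing]
random.seed(0)
data = [([random.gauss(m, 0.1) for m in mus], label)
        for label, mus in [("no_pain", (0.1, 0.1, 0.1)),
                           ("pain", (0.8, 0.7, 0.9))]
        for _ in range(50)]
random.shuffle(data)

split = int(0.9 * len(data))               # 90/10 train/test split
train, held_out = data[:split], data[split:]
centroids = train_centroids([x for x, _ in train], [y for _, y in train])
accuracy = sum(predict(centroids, x) == y for x, y in held_out) / len(held_out)
print(f"held-out accuracy: {accuracy:.0%}")
```

Because the synthetic clusters are well separated, this toy setup scores far above the paper's 67%; the real difficulty lies in extracting reliable AU features from photographs in the first place.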

Outlook

The man popularly credited with leading the charge in decoding animal sounds is Dr Constantine Slobodchikoff. Using prairie dogs as his model species, he spent more than 30 years studying their sophisticated system of communication. This led him to found a company called Zoolingua in 2018, with the objective of improving the relationship between owners and their pets. His team is currently developing an app that would translate a dog's body language and sounds into English. 

Another company, Sweden-based Gavagai AB, is working on a program that combines AI analysis software with CHAT. CHAT, or Cetacean Hearing and Telemetry, is known as one of the first truly effective animal translators, built to decipher dolphin whistles and their meanings. The program has already mastered 40 human languages, and the same technology is now being used to translate the languages of other animals, including rhesus macaques and white-cheeked gibbons.


Anu Thomas
Anu is a writer who stews in existential angst and actively seeks what’s broken. Lover of avant-garde films and BoJack Horseman fan theories, she has previously worked for Economic Times. Contact: anu.thomas@analyticsindiamag.com
