
What Is AI Called In Your Mother Tongue?


Over the last few years, the conversation around emerging technologies like AI and machine learning has grown massively. However, this conversation remains largely confined to the research and developer community. The general public, which is at the receiving end, is mostly left out. This is mainly because there has been very little effort to give such technologies cultural and linguistic context. For example, most of us might not know what AI is called in our local tongue, or, worse, there might not be a local term for AI to begin with.

Recognising this problem, Asvatha Babu, a PhD candidate at the School of Communication at American University, has been studying the interaction between technology, media, governance, and the public. Analytics India Magazine caught up with Asvatha for a detailed conversation.


Edited excerpts from the interview:

AIM: What are the challenges of translating technological terms/concepts into a local cultural context?

Asvatha Babu: I am a PhD candidate at the School of Communication at American University; I did my master's at the same university, focusing on cybersecurity and technology policy. I worked briefly on blockchain and its use in creating social impact (for example, its possible use in the Aadhaar system). A series of events and deeper research got me interested in facial recognition technology (FRT).

My current dissertation work is focused on FRT and its use in solving large social and humanitarian problems. As part of this work, I have interacted with the police to understand how they use these technologies (currently, in the context of Tamil Nadu). I also try to understand the police's attitude towards this technology and whether it really eases their work or adds to their burden.

Another aspect of my research is to understand society's general attitude towards FRT: what people think and say about it. When we say FRT, we culturally construct multiple things under this umbrella. To understand the cultural implications of FRT, I am studying the media coverage around this tech. For example, I have realised that even though the Tamil Nadu police have been using FRT for the last three years, it doesn't really have a name in the local language (Tamil).

There are many ways to translate the term facial recognition to convey the purpose of the technology, and that is what journalists do when reporting on FRT in the local language. Because of this, there is a tendency to talk about FRT in a functional way (what it does and why it is required). This approach is a double-edged sword. On the one hand, the translation makes FRT appear highly contextualised, which is good. On the other, my study shows that such an approach also leads the media to report on FRT from the police's or authorities' perspective, which means there is less focus on critical analysis of the tool.

AIM: What can be done to improve the understanding of technologies like FRT, especially through proper and contextual translations?

Asvatha Babu: A lot more attention needs to be paid to the process of translation. There are a lot of really good research organisations, civil society organisations and think tanks working on facial recognition and surveillance in India, algorithmic surveillance, digital rights and data privacy. But there is a lack of focus on grassroots-level translation. How the technology gets translated, or how it gets locally constructed within a specific language, is important to study. So, as a first step, I think we need to pay more attention to that.

The second step is more engagement with linguists and language studies scholars, digital rights scholars, and the people who study the impact of these technologies, along with social justice and social rights-focused think tanks and organisations. There needs to be more engagement between them and the engineering community. Currently, translation software is created by large organisations like Google and used by authorities like the police. There is a lack of critical voices. We are missing a whole perspective that needs to be present for people to understand the technology for what it is and for the media to cover it more comprehensively.

AIM: How good is software like Google Translate when it comes to grassroots-level tech translation?

Asvatha Babu: I think that any future where we have to rely on companies like Google to do translation work for us is not a future where I see justice-focused or rights-focused translation, because their translation services are built with the aim of being used more and being more profitable as a product.

To this end, the Māori community in New Zealand is fighting to keep its language alive without the interference of big conglomerates like Google. They understand that once the tech giants gain access to the language, the community will lose all sense of ownership of it. Companies like Google, Microsoft or IBM don't care about the language as such; they are interested in having this language in their arsenal so that more people have to rely on them. So, the community is now building its own process of automation. There are alternative ways of automating the translation process that don't rely on big tech companies motivated solely by profit.


AIM: Given the risks of surveillance and privacy breaches that technologies like FRT pose, some activists believe they should be completely banned. What is your take?

Asvatha Babu: Yes, this is a major concern: weighing the pros against the cons and the actual cost of the benefits.

To begin with, surveillance has always existed; people have always watched other people, especially in the context of, say, a public health issue or in prisons. There has always been this notion of having to watch your fellows, or the people under you, for public safety and security. Today, the availability of information, processing power and the entry of private technology companies trying to profit from it have made surveillance far more advanced.

It is important to consider two perspectives here. The first is that we are already living in a hostile society where there is a power asymmetry between authorities and the public. Developing more advanced tools of surveillance would further exacerbate that asymmetry. From this view, any amount of it is too much.

The second perspective, often adopted by the authorities and those who develop such technologies, is that these tools make life easier, enable the police to serve the public better, and increase public safety and security.

These technologies cannot be eliminated or rolled back. From our end (engineers, developers, the media and authorities), we need to ensure that people are adequately educated about these technologies and their effects in their cultural context.
