Hyderabad Man Bears The Brunt Of Facial Recognition Technology

The Internet Freedom Foundation said the random collection of citizens’ photographs could be linked to a database-building exercise for facial recognition technology (FRT)

Hyderabad-based social activist SQ Masood was in for a rude shock on May 19, when ten police officials stopped him and asked him to remove his mask so they could photograph him. Citing pandemic guidelines, he refused. The police photographed him anyway, over his protest.

Masood later approached the Police Commissioner to raise his concerns about the police collecting citizens’ photographs. On May 31, the Internet Freedom Foundation served a legal notice to the Police Commissioner on Masood’s behalf, pointing out that such activities are illegal and violate the right to privacy guaranteed by the Indian Constitution. The Identification of Prisoners Act, 1920 permits the police to photograph only persons arrested for or convicted of an offence, not innocent citizens.

In its blog post, the Internet Freedom Foundation said the random collection of citizens’ photographs could be linked to a database-building exercise for FRT. “FRT uses algorithms to extract data points from your face to create a digital signature of your face. This signature is then compared with an existing database to find possible matches,” the blog said. FRT is not 100 percent accurate at finding matches and also poses a risk of misidentification. Moreover, the Supreme Court of India held in K.S. Puttaswamy v Union of India, (2017) 10 SCC 1, that any invasion of privacy must satisfy the tests of legality, necessity and proportionality.
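To make the quoted pipeline concrete, here is a minimal sketch of the matching step, assuming faces have already been reduced to fixed-length embedding vectors (the “digital signature”). The vectors, names and threshold below are invented for illustration; the threshold is precisely where the misidentification risk enters.

```python
import numpy as np

# Hypothetical 4-dimensional "digital signatures"; real systems use
# 128- to 512-dimensional embeddings produced by a neural network.
database = {
    "person_a": np.array([0.10, 0.80, 0.30, 0.55]),
    "person_b": np.array([0.90, 0.15, 0.60, 0.20]),
}
probe = np.array([0.12, 0.79, 0.28, 0.57])  # signature of the new photo

def match(probe, database, threshold=0.1):
    # Euclidean distance between signatures: smaller means more similar.
    best_name, best_dist = None, float("inf")
    for name, signature in database.items():
        dist = np.linalg.norm(probe - signature)
        if dist < best_dist:
            best_name, best_dist = name, dist
    # The threshold decides everything: too loose, and innocent people
    # get "matched" (misidentification); too strict, and no one matches.
    return (best_name, best_dist) if best_dist < threshold else (None, best_dist)

print(match(probe, database))  # -> ('person_a', ~0.036)
```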

As per the Internet Freedom Foundation report, Telangana deploys the highest number of FRT systems in India. Of the 32 FRT systems in the country, five are in Telangana, used by the Hyderabad Police, the Telangana Police, the Telangana State Election Commission, Hyderabad Airport, and the State Higher Education Department.

FRT for law enforcement

Telangana is not an isolated case; many states and countries have adopted facial recognition technology for law enforcement. Last year, the Vadodara police installed FRT systems to identify criminals and repeat offenders.

Earlier this year, the Lucknow police announced they would deploy AI-based CCTV cameras in public places to help women in distress. The cameras will try to identify whether a woman is in trouble by reading her facial expressions, and alert the authorities.

Countries such as the US have reported multiple instances of FRT use. According to Steven Feldstein, a policy researcher at the Carnegie Endowment for International Peace in Washington DC, 64 countries were using FRT for surveillance as of 2019.

China has been using FRT systems for mass surveillance. According to Coded Bias, a documentary released last year, the facial recognition systems used as part of the country’s social credit system amount to nothing less than ‘algorithmic obedience training’ under the garb of societal betterment. A 2019 database leak revealed that China’s surveillance system collects as many as 6.8 million records a day from cameras around hotels, parks, tourist spots, and mosques.

Dangers of FRT

A facial recognition system uses biometric software to map a person’s facial features from a photo or video. The system then tries to match the result against information already held in databases to verify identity. Police departments and other law enforcement agencies regularly run FRT searches against such databases to find suspects and witnesses.
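As a rough illustration of that verification step, here is a minimal sketch using the open-source Python face_recognition library. The image file names are placeholders, and this is not a claim about the specific software any police force uses; the 0.6 tolerance is the library’s default, not a legal or forensic standard.

```python
import face_recognition

# Build a small "database" of known faces from a reference photo.
known_image = face_recognition.load_image_file("reference_photo.jpg")  # placeholder path
known_encodings = face_recognition.face_encodings(known_image)         # 128-d vectors, one per face

# Encode the face(s) found in a newly captured photo.
probe_image = face_recognition.load_image_file("street_capture.jpg")   # placeholder path
probe_encodings = face_recognition.face_encodings(probe_image)

for probe in probe_encodings:
    # compare_faces applies a distance threshold (default tolerance=0.6);
    # loosening it raises the false-match rate discussed above.
    matches = face_recognition.compare_faces(known_encodings, probe, tolerance=0.6)
    distances = face_recognition.face_distance(known_encodings, probe)
    print(matches, distances)
```

Even in this toy setup, a single scalar tolerance separates “match” from “no match”, which is why accuracy claims for FRT are so sensitive to how a system is configured.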

Time and again, experts have condemned the use of FRT. Of the many challenges associated with FRT systems, perhaps the most dangerous is their inherent bias. Studies have shown these systems can be biased against marginalised groups, putting them at risk of being falsely implicated in crimes they never committed. Historically, marginalised communities have tended to be the testing ground for such technological experiments.

Most democracies around the world guarantee their citizens liberty. Being recorded and scanned by such systems can instil fear, insecurity, and mistrust, because the system treats everyone as a criminal suspect without probable cause. Moreover, the databases that store such sensitive information risk being hacked, putting the privacy of many citizens at stake.

Shraddha Goled
I am a technology journalist with AIM. I write stories focused on the AI landscape in India and around the world with a special interest in analysing its long term impact on individuals and societies. Reach out to me at shraddha.goled@analyticsindiamag.com.
