Is Facial Recognition A New Form of Gender Discrimination?

  • According to research, facial recognition-based AGR technology is more likely to misgender trans people and non-binary people.

In recent years, much has been said about the dangers of facial recognition, such as mass surveillance and misidentification. However, digital rights advocates fear a far more pernicious use may be slipping under the radar: digital tools that purport to determine a person’s sexual orientation and gender.

We engage with AI systems daily, whether it’s using predictive text on our phones or adding a photo filter on social media apps like Instagram or Snapchat. While some AI-powered systems handle practical tasks, such as reducing manual workloads, they also pose a significant threat to our privacy. In addition to all the information you provide about yourself when you create an account online, many sensitive personal details are captured from your photos, videos, and conversations, such as your voice, facial shape, and skin colour.


Recently, a new initiative was launched in the EU to stop such applications from being made available. Reclaim Your Face, an EU-based NGO, is pushing for a formal ban on biometric mass surveillance within the EU, asking lawmakers to set red lines prohibiting AI applications that violate human rights.


Gender is a broad spectrum, and as society advances and becomes more self-aware, traditionally held notions become obsolete. One would expect technology to advance at the same pace. Unfortunately, advancements in biometric technology have not kept up.

Every year, numerous apps enter the market seeking a wide range of users’ personal data. Many of these systems rely on outdated and restrictive understandings of gender. Facial recognition technology classifies people in binary terms, either male or female, often based on cues such as facial hair or makeup. In other cases, consumers are asked to provide information about their gender, personality, habits, finances, etc., and many trans and non-binary individuals are misgendered in the process.

Follow us on Google News>>

Thankfully, many attempts have been made to alter user interface design to give people more control over their privacy and gender identity. Companies are promoting inclusion through modified designs that give people more freedom in defining their gender identity, with a broader range of terminology like genderqueer, genderfluid, or third gender (as opposed to the traditional male/female binary).

However, automated gender recognition (AGR) still overlooks this. Rather than letting a person state their gender, it collects data about them and infers a gender from it. With this technology, gender identity is reduced to a simple binary based on the available data. Moreover, AGR lacks any objective or scientific understanding of gender and amounts to an act of erasure for transgender and non-binary people. This systematic and mechanical erasure has real-world consequences.
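To make the problem concrete, here is a minimal sketch (entirely hypothetical, not drawn from any specific product) of how a binary AGR classifier is typically structured. The output layer admits exactly two classes, so every face, including a non-binary person’s, is forced into one of them:

```python
# Hypothetical sketch of a binary AGR head: a two-way softmax over a
# face embedding. The output space is closed; no third label can exist.
import numpy as np

def classify_gender(face_embedding: np.ndarray, weights: np.ndarray) -> str:
    """Toy binary classifier: two logits, argmax over exactly two labels."""
    logits = weights @ face_embedding              # shape: (2,)
    probs = np.exp(logits) / np.exp(logits).sum()  # softmax over 2 classes
    return ["male", "female"][int(np.argmax(probs))]

# Any input whatsoever receives one of the two labels.
rng = np.random.default_rng(0)
print(classify_gender(rng.normal(size=128), rng.normal(size=(2, 128))))
```

The limitation is structural rather than a matter of training data: no amount of additional examples can make a two-class output space represent identities outside the binary.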

Poor gender recognition 

According to research, facial recognition-based AGR technology is more likely to misgender trans people and non-binary people. In the research article “The Misgendering Machines: Trans/HCI Implications of Automatic Gender Recognition”, author Os Keyes explores how Human-Computer Interaction (HCI) and AGR research use the word “gender” and how HCI employs gender recognition technology. The analysis reveals that gender is continuously operationalised in a trans-exclusive manner and that, as a result, trans individuals subjected to it are disproportionately at risk.

The paper “How Computers See Gender: An Evaluation of Gender Classification in Commercial Facial Analysis and Image Labeling Services” by Morgan Klaus Scheuerman et al. reports similar results. To understand how gender is concretely conceptualised and encoded into today’s commercial facial analysis (FA) and image labelling technologies, the authors conducted a two-phase study: a review of ten commercial FA and image labelling services, and an evaluation of five FA services using a bespoke dataset of self-labelled Instagram images spanning a variety of genders. They found that binary notions of gender are pervasively formalised into classifiers and data standards. When tested on images of transgender and non-binary individuals, the FA services performed inconsistently and failed to identify non-binary genders. Additionally, they found that gender performance and gender identity were not encoded into the computer vision infrastructure in the same way.
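As a rough illustration of this kind of evaluation (the records, labels, and mapping below are invented for illustration; this is not the authors’ actual code or dataset), one can compare a service’s predicted labels against participants’ self-reported labels and report a per-group match rate:

```python
# Hypothetical sketch: measuring how often an FA service's predicted
# gender label matches a person's self-reported label, per group.
from collections import defaultdict

records = [
    # (self-reported label, label returned by a hypothetical FA service)
    ("woman", "female"), ("man", "male"),
    ("non-binary", "female"), ("non-binary", "male"),
]

# A binary service has no valid target for non-binary people at all.
BINARY_MAP = {"woman": "female", "man": "male"}

hits, totals = defaultdict(int), defaultdict(int)
for self_label, predicted in records:
    totals[self_label] += 1
    if BINARY_MAP.get(self_label) == predicted:
        hits[self_label] += 1

for group in totals:
    print(f"{group}: {hits[group] / totals[group]:.0%} matched self-label")
```

Run on data like this, the non-binary group’s match rate is necessarily zero, which mirrors the paper’s core finding: a binary classifier cannot be correct for people whose gender lies outside its two output classes.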

The issues mentioned aren’t the only challenges to the rights of LGBTQ communities. The research papers give us a brief insight into both the good and bad aspects of AI. They highlight the importance of developing new approaches to automated gender recognition that move beyond conventional binary classification.
