
About 76 per cent of the audience said they had encountered a biased algorithm

Today, the mother's name and profession are essential in a child's school application form, unlike in the past.

Bias in AI algorithms mirrors societal bias that has been ingrained for years. Unless acted upon, these biases will persist, holding back equality and diversity in tech and beyond. Speaking about women in Data Science and AI, Anjali Iyer, Delivery Excellence Business Leader at The Math Company, shared some of her personal experiences at The Rising 2022, the Women in AI conference organised by Analytics India Magazine on Friday.



Recalling her schooling days, Anjali said, “During my days, when my parents were getting me enrolled in a school, I don’t remember seeing an application form where my mother’s name or occupation was asked. It was all about the father’s name and occupation. I know the times are changing, at least now; when I enrolled my daughter in a school, I could see that I was able to spell my name and my occupation, but that wasn’t the case when I was getting into the school. So there is definitely a change that we are getting into, but again how many women are getting into the tech industry and how many are able to get into the leadership roles, there is still a wide gap.”

76% have encountered biased algorithms

Drawing on the experiences of women in Data Science and AI, Anjali asked the audience whether they had ever encountered a biased algorithm. About 76 per cent of the audience at the conference voted “Yes”, which speaks volumes about the extent of existing bias. “The representation of women in AI and tech matters a lot, and we need to change this. But it is not just about representation but also about women getting the right opportunity, as there is a huge difference in the way men and women get similar opportunities while getting into tech. Ten years ago, even though I did not get the opportunity to do my MS as marriage was a priority, things are changing slowly,” she added.

Citing the example of an American multinational company, Anjali spoke about a health app that was criticised for ignoring women’s health issues: it could track every piece of data except women’s natural cycles. The omission stemmed from bias built into the algorithm, itself a consequence of the lack of diversity among those who developed it.

Dismal number of women in STEM 

Women hold only about 28 per cent of STEM jobs, and the picture is even more startling in AI/ML research, where just 15 per cent of practitioners are women. “These figures mean that the organisations will fail to harness the fullest capacity of their digital innovations without including women. These machine learning technologies will be fed with a constant stream of biased data, eventually producing junk results and not giving a holistic picture and eventually causing harm. It will be like one bias leading to another, and we will get into that loop,” Anjali added.

Female AI voice assistants 

In another poll, Anjali asked the audience which voice assistant they would pick for their home; nearly 76 per cent of those at the conference picked a female voice. It is worth noting, she said, that both men and women have expressed higher interest in female synthetic voices, with women reporting an 11.9 per cent preference and men about 14.3 per cent. “Typically, these AI bots and voice assistants reinforce gender bias because as we move along in this digital world, we all know that the world may soon have a higher number of voice assistants than people. These voice assistants, be it as hotel staff, our IVR calls or the childcare providers, have traditionally featured female sounding voices, and female sounding voices projected on these technologies reinforce an impression that women typically hold assistant jobs and should be servile and docile,” she concluded.


Poornima Nataraj
Poornima Nataraj has worked in mainstream media as a journalist for 12 years and is always eager to learn anything new and evolving. Witnessing a revolution in the world of Analytics, she thinks she is in the right place at the right time.

