About 76 percent of the audience encountered a biased algorithm

Today, a mother's name and profession appear on a child's school application form, unlike in the past.

Bias in AI algorithms is a mirror of societal bias that has been ingrained for years. These biases will persist unless acted upon to build an equal and diverse world, in tech and beyond. Speaking about women in Data Science and AI, Anjali Iyer, Delivery Excellence Business Leader at The Math Company, shared some of her personal experiences at The Rising 2022, the Women in AI conference organised by Analytics India Magazine on Friday.


Recalling her schooling days, Anjali said, “During my days, when my parents were getting me enrolled in a school, I don’t remember seeing an application form where my mother’s name or occupation was asked. It was all about the father’s name and occupation. I know the times are changing, at least now; when I enrolled my daughter in a school, I could see that I was able to spell my name and my occupation, but that wasn’t the case when I was getting into the school. So there is definitely a change that we are getting into, but again how many women are getting into the tech industry and how many are able to get into the leadership roles, there is still a wide gap.”

76% have encountered biased algorithms

Drawing on her experiences as a woman in Data Science and AI, Anjali asked the audience if they had ever encountered a biased algorithm. About 76 per cent of the audience at the conference voted “Yes”, which speaks volumes about the existing bias. “The representation of women in AI and tech matters a lot, and we need to change this. But it is not just about representation but also about women getting the right opportunity, as there is a huge difference in the way men and women get similar opportunities while getting into tech. Ten years ago, even though I did not get the opportunity to do my MS as marriage was a priority, things are changing slowly,” she added.

Citing the example of an American multinational company, Anjali spoke about a health app that was criticised for ignoring women’s health issues. The app could track every piece of data except women’s natural cycles. The bias built into the algorithm, she noted, stemmed directly from a lack of diversity on the team that built it.

Dismal number of women in STEM 

Women hold only about 28 per cent of STEM jobs, and the figure is even more startling in AI/ML research, where women make up just 15 per cent of the industry. “These figures mean that the organisations will fail to harness the fullest capacity of their digital innovations without including women. These machine learning technologies will be fed with a constant stream of biased data, eventually producing junk results and not giving a holistic picture and eventually causing harm. It will be like one bias leading to another, and we will get into that loop,” Anjali added.

Female AI voice assistants 

In another poll, Anjali asked the audience which voice assistant they would pick for their home, and nearly 76 per cent of the audience at the conference picked a female voice. Anjali noted that both men and women have expressed a higher interest in female synthetic voices, with women reporting an 11.9 per cent preference and men about 14.3 per cent. “Typically, these AI bots and voice assistants reinforce gender bias because as we move along in this digital world, we all know that the world may soon have a higher number of voice assistants than people. These voice assistants, be it as hotel staff, our IVR calls or the childcare providers, have traditionally featured female-sounding voices, and female-sounding voices projected on these technologies reinforce an impression that women typically hold assistant jobs and should be servile and docile,” she concluded.
