
Strong AI? Industry Majors Are Reeling From Gender Bias In AI Tools

“All animals are equal, but some animals are more equal than others,” wrote George Orwell in Animal Farm. In the world of artificial intelligence, Google’s Gmail product manager Paul Lambert has found a corollary: “Not all screw-ups are equal.” That was his response when the search giant discovered at least one serious defect in the AI software running Gmail: it did not understand gender pronouns.

Benchmarks are broken every now and then. As we wrote recently, Sony announced that it had achieved the industry’s best training speed by using distributed learning.

But at the same time, there are fundamental areas where AI capabilities still need to improve. In the last few months, Gmail introduced Smart Compose, an autocomplete feature that helps users complete their sentences and craft emails faster. The feature no longer suggests “him” or “her”, because Google’s supposedly smart feature fails to understand gender pronouns.

Gender Bias Issues

The change has come from the fear that Gmail’s imperfect AI may suggest the wrong pronoun and offend users. A research scientist working on Gmail found one such example: when he typed, “I am meeting an investor next week,” Smart Compose suggested the follow-up question “Do you want to meet him?”, assuming a male investor instead of offering the female pronoun “her.” Many such examples surfaced through regular use of the product, and Google did not want to risk a PR disaster at a time when gender issues and discussions are on the rise.
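Conceptually, the mitigation amounts to filtering gendered pronouns out of the model’s candidate completions before they are ever shown. Here is a minimal sketch in Python, assuming a hypothetical `generate_candidates` function as a stand-in for the suggestion model (Google has not published Smart Compose’s internals):

```python
# Minimal sketch of a pronoun-blocklist filter over autocomplete
# candidates. `generate_candidates` is hypothetical; it stands in
# for whatever model produces completion suggestions.

GENDERED_PRONOUNS = {"he", "him", "his", "she", "her", "hers"}

def is_safe(suggestion: str) -> bool:
    """Reject any completion that contains a gendered pronoun."""
    return not GENDERED_PRONOUNS.intersection(suggestion.lower().split())

def suggest(prefix: str, generate_candidates) -> list[str]:
    """Return only the candidates that pass the pronoun filter."""
    return [c for c in generate_candidates(prefix) if is_safe(c)]
```

Blocking the entire class of suggestions, rather than trying to predict the right pronoun, trades some usefulness for the certainty of never guessing wrong.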

Google is not the only AI major to face such issues. A team at Amazon built a product to review job applicants’ resumes, using AI to score candidates on their readiness for the role. But within a year, the team realised the system was biased towards male candidates because of the data it was trained on: most of the resumes Amazon had collected came from men, a reflection of male dominance in the industry.
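That failure mode is easy to reproduce on toy data. The sketch below is not Amazon’s system; the resumes and labels are invented for illustration. It trains a simple text classifier on hiring outcomes that skew male and then shows the model assigning weight to gender-correlated words that say nothing about job performance:

```python
# Toy illustration of bias absorbed from skewed training data.
# The resumes and hiring labels are fabricated for demonstration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "captain of the men's chess club, his projects won awards",
    "his experience includes large scale systems engineering",
    "captain of the women's chess club, her projects won awards",
    "her experience includes large scale systems engineering",
]
hired = [1, 1, 0, 0]  # historical outcomes favouring men

vec = CountVectorizer()
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Gendered tokens receive nonzero coefficients even though they
# carry no information about a candidate's ability.
for word, coef in zip(vec.get_feature_names_out(), model.coef_[0]):
    print(f"{word:12s} {coef:+.2f}")
```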

New Tools On Hold

LinkedIn, the top social network for professionals, has also built algorithmic rankings of candidates’ fit for a job. John Jersin, vice president of LinkedIn Talent Solutions, told Reuters that the service is not a replacement for traditional recruiters. “I certainly would not trust any AI system today to make a hiring decision on its own,” he added.

In a world where companies are rushing to introduce new AI tools and services, gender-sensitive features are taking a backseat. It is striking that AI still cannot handle gender pronouns and retains a bias towards men despite various efforts to eradicate it. Strong AI certainly does not seem to be emerging any time soon. When it does, it will know whether a person is a “he” or a “she.”


Abhijeet Katte

A thorough data geek, Abhijeet spends most of his day building and writing about intelligent systems. He also has deep interests in philosophy, economics and literature.
