Understanding Algorithmic Biases & Their Impact On Online Hiring

The last few years have seen a dramatic rise in online job hiring platforms. Popular platforms such as LinkedIn and TaskRabbit play a significant role in employing millions of job seekers. These platforms are also known for another reason: their extensive use of automated tools and machine learning techniques.

These days, ranking algorithms are widely employed by online job platforms to determine how job seekers are presented to potential employers. Hiring platforms use such tools to match job-specific criteria, such as skill requirements and work experience. Since these platforms affect the livelihoods of job seekers, it is important to ensure that the underlying algorithms do not disadvantage underrepresented groups.


However, recent research has found that the ranking algorithms used by several online platforms can amplify unwanted biases present in their training data. The fairness of AI algorithms and models has long been a topic of debate among researchers because of unfair decisions along lines such as gender and race.

Recently, a team of researchers from Harvard University and Technische Universität Berlin studied the fairness of these ranking algorithms, how gender biases manifest in online hiring platforms and how they impact real-world hiring decisions.

The researchers stated that the central purpose behind all fair ranking algorithms is to redistribute user attention across groups and individuals in an even-handed fashion. While fair ranking algorithms seem a beneficial step towards minimising the undesirable biases induced by ranked lists, it was not clear whether these algorithms actually improve real-world outcomes, such as hiring decisions on online portals, for underrepresented groups.
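The idea of redistributing attention can be made concrete with a standard position-bias model, in which the attention a candidate receives decays with rank. The sketch below is illustrative only; the group labels, candidate ids, and logarithmic decay model are assumptions for demonstration, not taken from the study.

```python
import math

def exposure(rank):
    # Standard position-bias assumption: attention decays
    # logarithmically with rank (rank 1 gets the most).
    return 1.0 / math.log2(rank + 1)

def group_exposure(ranking, groups):
    """Total exposure each group receives under a given ranking.

    ranking: list of candidate ids, best rank first.
    groups:  dict mapping candidate id -> group label.
    """
    totals = {}
    for i, candidate in enumerate(ranking, start=1):
        g = groups[candidate]
        totals[g] = totals.get(g, 0.0) + exposure(i)
    return totals

# Hypothetical ranked lists of six candidates.
groups = {"a": "men", "b": "men", "c": "men",
          "d": "women", "e": "women", "f": "women"}
biased = ["a", "b", "c", "d", "e", "f"]       # all men ranked on top
interleaved = ["a", "d", "b", "e", "c", "f"]  # groups alternate

print(group_exposure(biased, groups))
print(group_exposure(interleaved, groups))
# women receive noticeably more exposure in the interleaved ranking
```

Fair ranking redistributes the fixed pool of attention: the total exposure is the same in both rankings, but the interleaved list hands a larger share of it to the underrepresented group.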

They added, “Furthermore, there is little to no research that systematically explores how other factors, such as inherent biases of employers, interact with ranking algorithms and influence the real-world outcomes.”

Understanding Algorithm Biases

Two years ago, Amazon's AI researchers detected a severe issue in the company's AI hiring tool: the system had learned to prefer male candidates while penalising female applicants, meaning that the resumes reaching recruiters were mostly those of men.

In the current research, the team evaluated how gender biases percolate through online hiring platforms and affect real-world hiring decisions. Specifically, they analysed how several sources of gender bias in online hiring platforms, including the type of ranking algorithm and the inherent biases of employers, interact with each other and influence hiring decisions.

To do so, they carried out a large-scale user study with 1,079 participants on Amazon Mechanical Turk, using real-world data from TaskRabbit, a popular American online and mobile marketplace that provides employment opportunities for freelance labour.

They experimented with three ranking algorithms: RandomRanking, which ranks candidates randomly; RabbitRanking, which ranks them by their TaskRabbit relevance scores; and FairDet-Greedy, which uses a fair ranking algorithm called Det-Greedy.
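A simplified sketch of a Det-Greedy-style re-ranker may help illustrate the fair ranking condition. At each position it first serves any group that would otherwise fall below its minimum representation target, and otherwise picks the highest-scoring remaining candidate. The scores, group labels, and 50/50 targets below are invented for illustration, and this sketch omits the maximum-representation constraints of the full algorithm.

```python
import math

def fair_rerank(candidates, scores, groups, targets):
    """Greedy fair re-ranking in the spirit of Det-Greedy.

    candidates: list of candidate ids.
    scores:     dict id -> relevance score.
    groups:     dict id -> group label.
    targets:    dict group -> desired minimum proportion.
    """
    remaining = sorted(candidates, key=lambda c: scores[c], reverse=True)
    counts = {g: 0 for g in targets}
    ranking = []
    for k in range(1, len(candidates) + 1):
        # Groups currently below their minimum quota for a prefix of length k.
        below = [g for g in targets if counts[g] < math.floor(targets[g] * k)]
        # Restrict to those groups if any quota is unmet; else use everyone.
        pool = [c for c in remaining if groups[c] in below] or remaining
        pick = pool[0]  # highest-scoring candidate in the allowed pool
        ranking.append(pick)
        remaining.remove(pick)
        counts[groups[pick]] += 1
    return ranking

scores = {"a": 0.9, "b": 0.8, "c": 0.7, "d": 0.6, "e": 0.5, "f": 0.4}
groups = {"a": "m", "b": "m", "c": "m", "d": "w", "e": "w", "f": "w"}
print(fair_rerank(list(scores), scores, groups, {"m": 0.5, "w": 0.5}))
# → ['a', 'd', 'b', 'e', 'c', 'f']
```

With equal targets, the re-ranker alternates the two groups down the list instead of stacking all the high-scoring men on top, which is exactly the redistribution of visibility the study examines.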

Benefits Of Fair Ranking

According to the researchers, the analysis revealed several critical and surprising insights about gender biases in online hiring. They found that fair ranking algorithms can help increase the number of underrepresented candidates selected, even after controlling for visibility.

They also found that fair ranking is more effective when the features of underrepresented candidates are similar to those of the overrepresented class. However, it is ineffective at increasing representation when employer selections already satisfy demographic parity.
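Demographic parity of employer selections can be checked by comparing per-group selection rates. The candidates and selections below are made up for illustration; equal rates across groups mean there is little headroom for fair ranking to improve representation.

```python
def selection_rates(candidates, selected, groups):
    """Fraction of each group's candidates that employers selected."""
    totals, hits = {}, {}
    for c in candidates:
        g = groups[c]
        totals[g] = totals.get(g, 0) + 1
        if c in selected:
            hits[g] = hits.get(g, 0) + 1
    return {g: hits.get(g, 0) / totals[g] for g in totals}

groups = {"a": "m", "b": "m", "c": "w", "d": "w"}
rates = selection_rates(list(groups), {"a", "c"}, groups)
print(rates)  # → {'m': 0.5, 'w': 0.5} — equal rates, parity holds
```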

Wrapping Up

The pandemic has brought various challenges: businesses have been compelled to work remotely, and employees have faced layoffs and pay cuts. Moreover, with restrictions on social gatherings, candidates have been unable to attend in-person interviews. As per a blog post, organisations have been making greater use of algorithmic hiring tools to screen a flood of job applicants during the coronavirus pandemic.

Ambika Choudhury
A Technical Journalist who loves writing about Machine Learning and Artificial Intelligence. A lover of music, writing and learning something out of the box.
