The last few years have seen a dramatic rise in online job hiring platforms. Popular platforms such as LinkedIn and TaskRabbit have played a significant role in connecting millions of job seekers with employers. These platforms are also well known for another reason: their extensive use of automated tools and machine learning techniques.
These days, ranking algorithms are widely employed by online job platforms to determine how job seekers are presented to potential employers. Hiring platforms use such tools to match job-specific criteria such as skill requirements and work experience. Since these platforms affect the livelihoods of job seekers, it is important to ensure that the underlying algorithms do not disadvantage underrepresented groups.
However, recent research has found that the ranking algorithms used by several online platforms can amplify unwanted biases present in their training data. The fairness of AI algorithms and models has long been a topic of debate and discussion among researchers because of unfair decisions along lines such as gender and race.
Recently, a team of researchers from Harvard University and Technische Universität Berlin studied the fairness of these ranking algorithms, how gender biases manifest in online hiring platforms and how they impact real-world hiring decisions.
The researchers stated that the central purpose behind fair ranking algorithms is to redistribute user attention across various groups and individuals in an evenhanded fashion. While fair ranking algorithms appear to be a beneficial step towards minimising the undesirable biases induced by ranked lists, it was not clear whether these algorithms actually improve real-world outcomes, such as hiring decisions on online portals, for underrepresented groups.
They added, “Furthermore, there is little to no research that systematically explores how other factors, such as inherent biases of employers, interact with ranking algorithms and influence the real-world outcomes.”
Understanding Algorithm Biases
Two years back, Amazon researchers detected a severe issue in the company's AI hiring tool: the system had learned to prefer male job candidates while penalising female applicants. As a result, the resumes reaching organisations were predominantly those of male candidates.
In the current research, the team evaluated how gender biases percolate through online hiring platforms and how they impact real-world hiring decisions. More specifically, they analysed how several sources of gender bias on these platforms, including the type of ranking algorithm and the inherent biases of employers, interact with each other and affect hiring decisions.
To do so, they carried out a large-scale user study with 1,079 participants on Amazon Mechanical Turk, using real-world data from TaskRabbit, a popular American online and mobile marketplace that provides employment opportunities to freelance workers.
They experimented with three different ranking algorithms: RandomRanking, which ranks candidates randomly; RabbitRanking, which ranks them by their TaskRabbit relevance scores; and FairDet-Greedy, which uses a fair ranking algorithm called Det-Greedy. A simplified sketch of these strategies is shown below.
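The following Python snippet is a minimal, illustrative sketch of the three strategies, not the study's actual implementation. The candidate records, the field names (`score`, `group`) and the simplified greedy re-ranking condition are assumptions made for the example; the real FairDet-Greedy approach follows the full Det-Greedy formulation.

```python
import random

def random_ranking(candidates, seed=None):
    """RandomRanking: shuffle candidates uniformly at random."""
    ranked = list(candidates)
    random.Random(seed).shuffle(ranked)
    return ranked

def score_ranking(candidates):
    """RabbitRanking (sketch): sort by platform relevance score, descending."""
    return sorted(candidates, key=lambda c: c["score"], reverse=True)

def greedy_fair_ranking(candidates, target_share):
    """Simplified Det-Greedy-style re-ranking (assumed, illustrative).

    At every prefix of length k, each group should hold at least
    floor(target_share[group] * k) positions; subject to that constraint,
    the highest-scoring remaining candidate is chosen.
    """
    remaining = sorted(candidates, key=lambda c: c["score"], reverse=True)
    ranked, counts = [], {g: 0 for g in target_share}
    while remaining:
        k = len(ranked) + 1
        # Groups that would fall below their minimum quota at position k.
        behind = [g for g in target_share
                  if counts[g] < int(target_share[g] * k)
                  and any(c["group"] == g for c in remaining)]
        pool = [c for c in remaining if c["group"] in behind] or remaining
        pick = max(pool, key=lambda c: c["score"])
        remaining.remove(pick)
        counts[pick["group"]] += 1
        ranked.append(pick)
    return ranked

if __name__ == "__main__":
    candidates = [
        {"name": "A", "group": "men",   "score": 0.9},
        {"name": "B", "group": "men",   "score": 0.8},
        {"name": "C", "group": "women", "score": 0.7},
        {"name": "D", "group": "men",   "score": 0.6},
        {"name": "E", "group": "women", "score": 0.5},
    ]
    fair = greedy_fair_ranking(candidates, {"men": 0.5, "women": 0.5})
    print([c["name"] for c in fair])  # e.g. ['A', 'C', 'B', 'E', 'D']
```

With equal target shares, the fair re-ranking interleaves the groups even though the raw relevance scores would place all male candidates first, which is the kind of attention redistribution the study examines.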
Benefits Of Fair Ranking
According to the researchers, the analysis revealed several critical and surprising insights about gender biases in online hiring. They found that fair ranking algorithms can help increase the number of underrepresented candidates selected, even after controlling for visibility.
They also found that fair ranking is more effective when the features of underrepresented candidates are similar to those of the overrepresented class. However, it is ineffective at increasing representation when employer selections already satisfy demographic parity, i.e. when each group's share of selected candidates already matches its share of the candidate pool.
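As a rough illustration of that last condition, the hedged sketch below checks whether a set of employer selections satisfies demographic parity under the assumed data layout from the earlier example; the `group` field name and the tolerance are illustrative choices, not part of the study.

```python
from collections import Counter

def satisfies_demographic_parity(candidates, selected, group_key="group", tol=0.05):
    """Return True if each group's share of selections roughly matches
    its share of the candidate pool (within the given tolerance)."""
    pool_counts = Counter(c[group_key] for c in candidates)
    sel_counts = Counter(c[group_key] for c in selected)
    for group, n in pool_counts.items():
        pool_share = n / len(candidates)
        sel_share = sel_counts.get(group, 0) / max(len(selected), 1)
        if abs(sel_share - pool_share) > tol:
            return False
    return True
```

When this check already passes for an employer's selections, re-ranking the list more fairly has little additional representation to gain, which matches the researchers' observation.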
Wrapping Up
The pandemic has brought various challenges: businesses have been compelled to work remotely, employees have been laid off, and pay cuts have become common. Moreover, with restrictions on social gatherings, candidates have been unable to attend in-person interviews. As per a blog post, organisations have been making greater use of algorithmic hiring tools to screen the flood of job applicants during the coronavirus pandemic.