Ever since technologies like artificial intelligence and machine learning became mainstream, the world has witnessed the resulting gender bias and racism across many sectors, from workplaces to beauty contests.
This time, the bias allegations concern credit limits.
Is Apple Card Sexist?
Created by Apple and issued by Goldman Sachs, the Apple Card was supposed to be a game-changer for consumer credit: it was meant to be available to people who might otherwise struggle to access credit. But that promise already looks shaky, as the Apple Card is courting controversy.
Recently, several Apple Card customers have complained that women are being given significantly lower credit limits than men. While many blame Apple, others say Goldman Sachs is responsible for the gender bias.
Goldman Sachs comes into the picture because, even though the Apple Card is branded and marketed by Apple, the credit decisions behind the card are handled by the bank.
As the allegations mounted, tech entrepreneur David Heinemeier Hansson took to Twitter last Saturday, saying the card had offered him twenty times the credit limit of his wife.
In fact, lower credit limits have even been reported by people who share all assets and accounts with their spouses. Apple co-founder Steve Wozniak tweeted that “the same thing happened to us (10x) despite not having any separate assets or accounts.”
To spread the word, Heinemeier Hansson has since been on a tweet spree, highlighting many other people reporting the same experience.
While consumer-specific factors can legitimately produce different credit limits, experts say the disparity could also stem from a flawed algorithm. And to an extent, this could be true: with the rise of technologies like AI, more and more decisions about our lives involve algorithms.
Over the years, we have all watched algorithms consume data and deliver results. In this case, the bias might be a product of the data the model was fed. These algorithms are also made by humans: the code is written by humans, and humans are naturally biased. So it would not be surprising if the same algorithms reproduce human biases, intentionally or not.
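To see how this can happen even when gender is never an input, consider a minimal sketch with entirely fabricated data. The model below only ever sees a proxy feature (imagine a spending-category score that happens to correlate with gender in the historical records), yet it reproduces the historical gap anyway:

```python
# Toy illustration with made-up numbers: a "model" that never sees gender
# can still inherit a historical gap through a correlated proxy feature.

# Historical records: (proxy_feature, approved_limit). In this fabricated
# dataset, proxy "A" happens to correlate with male applicants and "B"
# with female applicants.
history = [
    ("A", 20000), ("A", 22000), ("A", 18000),
    ("B", 2000),  ("B", 2500),  ("B", 1500),
]

def predict_limit(proxy):
    """Predict a limit as the average historical limit for this proxy value.
    Gender is never used, yet the historical disparity carries over."""
    limits = [lim for p, lim in history if p == proxy]
    return sum(limits) / len(limits)

# Two applicants with identical finances but different proxy values:
print(predict_limit("A"))  # 20000.0
print(predict_limit("B"))  # 2000.0 -> a 10x gap, learned purely from data
```

This is only a caricature of how real underwriting models work, but it shows why "we don't use gender as an input" does not by itself rule out a gendered outcome.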
Black-Boxing Of Algorithms
The opacity of algorithms is a huge problem across many industries. As more controversies emerge, it is becoming imperative for organisations to make their algorithms transparent and to disclose their purpose, structure and actions. For example, someone given a low credit score can appeal to see the criteria used to determine it, but they cannot appeal to check whether the algorithm itself made a mistake.
Credit Decisions Based On Consumer Behaviour
Reports suggest that income is another factor behind the controversy. If a woman earns less than the man in the household, that alone can lower the credit limit she is offered, since income weighs heavily in credit decisions.
Another factor could be the consumer’s spending behaviour. For instance, if the man spends more on his credit card while maintaining a good credit score, that too could produce a difference in credit limits.
What The Bank Says
While Apple has been under greater pressure, Goldman Sachs is also taking its share of the blame, since the bank handles the credit side of the card.
In response, Goldman Sachs has denied the allegations, tweeting that “We have not and never will make decisions based on factors like gender.”
Goldman Sachs said in a statement:
“With Apple Card, your account is individual to you; your credit line is yours and you establish your own direct credit history. As with any other individual credit card, your application is evaluated independently. We look at an individual’s income and an individual’s creditworthiness, which includes factors like personal credit scores, how much debt you have, and how that debt has been managed. Based on these factors, it is possible for two family members to receive significantly different credit decisions. In all cases, we have not and will not make decisions based on factors like gender.”
Now, reports suggest that the New York Department of Financial Services has launched an investigation into the bank’s credit card practices.