A pricing algorithm that vacation rental company Airbnb billed as money-making has widened the revenue gap between black and white hosts.
Airbnb introduced a ‘Smart Pricing’ algorithm in 2015 to suggest optimal nightly rates and, in the process, reduce the revenue gap between black and white hosts. Under US federal law, algorithms must exclude discriminatory factors, such as race, from their recommendations, and Airbnb’s algorithm did so. In many respects, it was similar to the pricing algorithms used by Amazon and eBay.
However, research by academics at Harvard, the University of Toronto and Carnegie Mellon suggests otherwise. The researchers collected and analysed data from 9,396 randomly selected properties across 324 zip codes between July 2015 and August 2017, treating hosts’ voluntary adoption of the algorithm as a quasi-natural experiment. The results and recommendations are, to say the least, shocking.
After adopting the algorithm, hosts saw their average nightly rate dip by 5.7 percent while their average daily revenue rose by 8.6 percent. Before adoption, white hosts were earning $12.16 more than black hosts; this revenue gap closed by 71.3 percent post adoption. Those results look highly desirable, but for hosts who did not adopt the algorithm, disproportionately black hosts, the effect was the opposite: the gap increased.
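A lower nightly rate alongside higher revenue only works if bookings rise enough to offset the price cut. A back-of-the-envelope calculation using the study's two reported averages shows the implied increase in booked nights (this arithmetic is our own illustration, not a figure from the study):

```python
# Daily revenue = nightly rate x nights booked, so the change in bookings
# is implied by the two reported changes.
price_change = -0.057     # average nightly rate fell 5.7%
revenue_change = 0.086    # average daily revenue rose 8.6%

# revenue factor = price factor x quantity factor
quantity_factor = (1 + revenue_change) / (1 + price_change)
print(f"Implied change in nights booked: {quantity_factor - 1:.1%}")
# Implied change in nights booked: 15.2%
```

In other words, adopters had to book roughly 15 percent more nights to turn a 5.7 percent price cut into an 8.6 percent revenue gain.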
How does the algorithm work?
When a host turns the algorithm on, it adjusts the property’s nightly rate based on factors such as the property’s characteristics, the season, the prices of neighbouring properties, and other variables that influence demand for the property.
The algorithm is expected to outperform human price setting because it draws on the large amounts of data held on Airbnb’s servers. However, its opacity makes it difficult to assess, so adopting it does not guarantee a benefit to the host.
Why wasn’t the gap mitigated?
The algorithm’s ability to mitigate Airbnb’s revenue gap depends on its adoption rate among black hosts.
Black hosts were 41 percent less likely to adopt the algorithm, and at such low adoption rates the algorithm’s effect reverses. Because the data it learns from is skewed towards the white demand curve, it generates sub-optimal prices, set lower than ideal, for black hosts relative to their white counterparts.
The algorithm recommends the same price for comparable black- and white-owned properties; race is not a factor in its pricing. Still, demand for black hosts’ properties is more responsive to price changes, so the downward price correction produced greater occupancy for black hosts who adopted it.
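The mechanism here is price elasticity: when demand is more responsive to price, the same price cut wins proportionally more bookings and therefore more revenue. A toy sketch with two hypothetical linear demand curves (all numbers invented for illustration) makes the point:

```python
def occupancy(price, a, b):
    """Toy linear demand: nights booked per period at a given price."""
    return max(a - b * price, 0.0)

def revenue(price, a, b):
    return price * occupancy(price, a, b)

# Two hypothetical demand curves; the second is more price-responsive
# (larger b), mirroring the study's finding for black hosts' listings.
less_elastic = dict(a=100, b=2.0)
more_elastic = dict(a=100, b=3.0)

for name, curve in [("less elastic", less_elastic), ("more elastic", more_elastic)]:
    before = revenue(30, **curve)   # revenue at the old, higher price
    after = revenue(25, **curve)    # revenue after a downward correction
    print(f"{name}: revenue {before:.0f} -> {after:.0f}")
```

Both curves gain from the price cut, but the more elastic one gains far more, which is why the downward correction benefited black adopters disproportionately.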
Black hosts are a minority at both the neighbourhood and the city level, so the algorithm is trained on data that represents the white hosts’ demand curve far more than the black hosts’ demand curve. The race-blind algorithm therefore sets prices that lean towards the optimum of the white demand curve.
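The pooling effect can be sketched in a few lines: a single race-blind model effectively averages the two groups' demand curves, weighted by their share of the data, so its recommended price lands near the majority group's optimum. All demand parameters below are hypothetical, and the weighted-average fit is a stylised simplification of how a real model would behave:

```python
# Linear demand q = a - b*p; with zero marginal cost, the
# revenue-maximising price is p* = a / (2b).

def optimal_price(a, b):
    return a / (2 * b)

white = dict(a=100, b=2.0)   # majority of listings in the training data
black = dict(a=80, b=2.5)    # fewer listings, more price-sensitive demand

share_white = 0.8            # majority share of the data

# A race-blind model effectively fits the weighted-average curve.
a_pooled = share_white * white["a"] + (1 - share_white) * black["a"]
b_pooled = share_white * white["b"] + (1 - share_white) * black["b"]

p_white = optimal_price(**white)
p_black = optimal_price(**black)
p_pooled = optimal_price(a_pooled, b_pooled)

print(p_white, p_black, round(p_pooled, 2))
# 25.0 16.0 22.86
```

The pooled recommendation (about 22.86) sits far closer to the majority optimum (25) than to the minority optimum (16), which is the skew the researchers describe.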
There was also a difference in demand between properties owned by white hosts and those owned by black hosts. Many reasons could explain this, but the primary suggested ones are racial bias among guests, differences in education, and black hosts’ lesser access to resources. The study also indicates that the revenue gap will be difficult to bridge while racial discrimination remains ingrained in society.
Recommendations of the study
The study maintains that a race-blind algorithm in a society with racism is not the best way to approach business, and that perhaps no algorithm can close the gap. It outright rejects hiding the host’s race from the guest, arguing that this would lead to further unwanted incidents or to guests avoiding neighbourhoods about which they hold stereotypes.
Instead, the study suggests developing an algorithm that considers a host’s race and background, such as their socioeconomic situation, before ‘Smart Pricing’ their property. Since an individual’s race is protected under US law and cannot be used as a determining factor in business decisions, the researchers ask policy-makers to reconsider the law.
The study concludes that a race-blind algorithm in a racially divided society doesn’t bridge any gap, economic or ethical, but only widens it.
I am a journalism undergrad who loves playing basketball and writing about finance and technology. I believe in the power of words.