
Airbnb’s Smart Pricing: All-Inclusive Algorithm Leads To Racial Disparity

Analysis of Airbnb data shows that low adoption of its Smart Pricing algorithm among black hosts has widened the racial revenue gap.

An algorithm meant to boost hosts’ earnings on vacation rental platform Airbnb has instead widened the revenue divide between black and white hosts.

Airbnb introduced its ‘Smart Pricing’ algorithm in 2015 to suggest optimal prices and, in part, to narrow the revenue gap between black and white hosts. Under US federal law, such algorithms must exclude discriminatory factors like race from their recommendations, and Airbnb’s algorithm complies. In many respects, it resembles the pricing algorithms used by Amazon and eBay.

However, research by academics at Harvard, the University of Toronto and Carnegie Mellon suggests the outcome was more complicated. The researchers collected and analysed data from 9,396 randomly selected properties across 324 zip codes between July 2015 and August 2017, treating hosts’ voluntary adoption of the algorithm as a quasi-natural experiment. The results, and the study’s recommendations, are striking, to say the least.


After adoption, the average nightly rate dipped by 5.7 percent while average daily revenue increased by 8.6 percent. Before adoption, white hosts earned $12.16 more per day than black hosts; this gap closed by 71.3 percent after adoption. These results look highly desirable, but for hosts who did not adopt the algorithm, most of whom were black, the effect was the opposite: the gap widened.
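Applying the study’s reported figures directly, the remaining daily gap for adopters can be worked out in a couple of lines:

```python
# The reported numbers, applied directly: a 71.3 percent reduction
# of the pre-adoption $12.16 daily revenue gap.
gap_before = 12.16           # white hosts' extra daily revenue, in dollars
reduction = 0.713            # fraction of the gap closed after adoption
gap_after = gap_before * (1 - reduction)
print(round(gap_after, 2))   # ≈ $3.49 per day remains
```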

How does the algorithm work?

When a host turns the algorithm on, it adjusts the property’s nightly rate based on the property’s characteristics, the season, the prices of neighbouring properties, and other factors that influence demand.
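The core idea can be sketched in a few lines. This is a minimal, hypothetical illustration of demand-based price suggestion, not Airbnb’s actual implementation; the function names, the linear demand model, and all the numbers are assumptions for illustration only.

```python
# Hypothetical sketch: pick the nightly rate that maximises expected
# revenue given an estimated chance of being booked at each price.

def booking_probability(price, base_demand, sensitivity):
    """Estimated chance a night is booked at a given price.
    In a real system this estimate would come from a model trained on
    property features, season and neighbouring listings."""
    return max(0.0, min(1.0, base_demand - sensitivity * price))

def suggest_price(base_demand, sensitivity, candidates):
    """Return the candidate price maximising expected nightly revenue
    (price x probability of being booked)."""
    return max(candidates,
               key=lambda p: p * booking_probability(p, base_demand, sensitivity))

# Example: a listing whose demand falls linearly as price rises.
candidates = range(40, 201, 5)
best = suggest_price(base_demand=1.2, sensitivity=0.006, candidates=candidates)
print(best)  # 100: below this, occupancy gains no longer offset the lower rate
```

The key design point is that the suggested price depends entirely on the estimated demand curve: whatever data shaped that estimate shapes the recommendation.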


The algorithm is expected to outperform human price-setting because it draws on the large volumes of data held on Airbnb’s servers. However, its opacity makes it difficult for hosts to assess how it works, and it does not guarantee a benefit to every host.

Why wasn’t the gap mitigated? 

The algorithm’s ability to close the revenue gap on Airbnb depends on its adoption rate among black hosts.

Black hosts were 41 percent less likely to adopt the algorithm, and at such low adoption rates its effect reverses. Because the algorithm’s recommendations are skewed towards the white demand curve, it generates lower, sub-optimal prices for black hosts than for their white counterparts.

The algorithm recommends the same price for comparable black- and white-owned properties; race is not a factor in its pricing. Yet demand for black hosts’ properties is more responsive to price changes, so the downward price corrections the algorithm made for black hosts led to greater occupancy.

Black hosts are a minority at both the neighbourhood and city level, so the algorithm is trained on data that represents the demand curve of white hosts far more than that of black hosts. As a result, the race-blind algorithm sets prices that lean towards the optimal price on the white demand curve.
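A toy simulation shows how this skew arises. The linear demand curves, the 90/10 data split, and the pooling by weighted average below are all illustrative assumptions, not figures from the study; they simply demonstrate that a single race-blind demand estimate lands near the majority group’s optimum.

```python
# Illustrative sketch: one group dominates the training data, so the
# pooled (race-blind) demand curve yields a price near that group's
# optimum and far from the minority group's optimum.

def optimal_price(a, b):
    """Revenue p * (a - b*p) is maximised at p = a / (2b)."""
    return a / (2 * b)

# Hypothetical linear demand curves: demand(p) = a - b*p
a_major, b_major = 1.5, 0.006   # majority of listings in the data
a_minor, b_minor = 1.0, 0.008   # minority listings, more price-sensitive

# Pooled curve weighted by data share (90% majority listings).
w = 0.9
a_pool = w * a_major + (1 - w) * a_minor
b_pool = w * b_major + (1 - w) * b_minor

p_major = optimal_price(a_major, b_major)  # 125.0
p_minor = optimal_price(a_minor, b_minor)  # 62.5
p_pool = optimal_price(a_pool, b_pool)     # ~117: close to 125, far from 62.5
```

Under these assumptions, the pooled price sits within a few dollars of the majority optimum but nearly double the minority optimum, which is the mechanism the study describes.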

There was also a difference in demand for properties owned by white and black hosts. The study suggests several possible reasons, chiefly racial bias among guests, differences in education, and black hosts’ poorer access to other resources. It also indicates the revenue gap will be difficult to bridge while racial discrimination remains ingrained in society.

Recommendations of the study

The study maintains that a race-blind algorithm in a society with racism is not the best approach, and that perhaps no algorithm can close the gap. It rejects outright the idea of hiding a host’s race from guests, arguing this would lead to unwanted incidents or to guests avoiding neighbourhoods about which they hold stereotypes.

Instead, the study suggests developing an algorithm that considers a host’s race and background, including socioeconomic factors, before pricing the property. Since an individual’s race is protected under US law and cannot be used as a determining factor in business decisions, the researchers ask policy-makers to consider changing the law.

The study concludes that a race-blind algorithm in a racially divided society does not bridge any gap, economic or ethical, but only widens it.

Meenal Sharma
I am a journalism undergrad who loves playing basketball and writing about finance and technology. I believe in the power of words.
