
Airbnb’s Smart Pricing: All-Inclusive Algorithm Leads To Racial Disparity

Analysis of Airbnb data reveals low adoption of its Smart Pricing algorithm among black hosts, widening the racial revenue gap.



A supposed money-making algorithm from vacation rental company Airbnb has widened the revenue divide between black and white hosts.

Airbnb introduced its property ‘Smart Pricing’ algorithm in 2015 to suggest optimal prices and help close the revenue gap between black and white hosts. Under US federal law, algorithms must exclude discriminatory factors such as race from their recommendations, and Airbnb’s algorithm does so. In many respects, it is similar to the pricing algorithms used by Amazon and eBay.

However, research conducted at Harvard University, the University of Toronto and Carnegie Mellon University suggests otherwise. The study collected and analysed data from 9,396 randomly selected properties across 324 zip codes between July 2015 and August 2017, treating hosts’ voluntary adoption of the algorithm as a quasi-natural experiment. The results and the recommendations are, to say the least, shocking.

After hosts adopted the algorithm, their average nightly rate dipped by 5.7 percent while their average daily revenue increased by 8.6 percent. Before adoption, white hosts earned $12.16 more than black hosts; this revenue gap closed by 71.3 percent after adoption. The numbers look highly desirable, but the algorithm’s effect on hosts who did not adopt it, disproportionately black hosts, was the opposite: the gap widened.

How does the algorithm work?

When a host turns the algorithm on, it adjusts the property’s nightly rate based on the property’s characteristics, the season, the prices of neighbouring properties, and other factors that influence demand for the property.

The algorithm is expected to outperform human price-setting because it draws on the large amounts of data held on Airbnb’s servers. However, the algorithm’s opacity makes it difficult to assess, and adopting it does not guarantee a benefit to the host.
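
To make the mechanics concrete, here is a minimal sketch of a demand-based pricing rule in Python. It is only an illustration of the general approach described above, not Airbnb’s actual model: the weights, the seasonal adjustment and the input fields are all assumptions made for the example.

```python
# Illustrative sketch of a demand-based nightly-rate adjustment.
# NOT Airbnb's Smart Pricing model: the weights and inputs below are
# hypothetical and only show the general idea described in the article:
# start from a base rate, then nudge it using seasonality, nearby
# listing prices, and recent demand for the property.

from statistics import median


def suggest_nightly_rate(base_rate: float,
                         month: int,
                         neighbour_rates: list[float],
                         recent_occupancy: float) -> float:
    """Return a suggested nightly rate for one property.

    base_rate        -- host's current nightly price
    month            -- 1-12, used for a crude seasonal adjustment
    neighbour_rates  -- nightly prices of comparable nearby listings
    recent_occupancy -- share of the last 30 nights booked (0-1)
    """
    # Crude seasonality: charge more in peak summer months (assumption).
    seasonal_multiplier = 1.15 if month in (6, 7, 8) else 1.0

    # Pull the price towards the local market median.
    market_rate = median(neighbour_rates) if neighbour_rates else base_rate
    blended = 0.6 * base_rate + 0.4 * market_rate

    # Rarely booked listings get a discount; nearly full ones a premium.
    if recent_occupancy < 0.4:
        demand_multiplier = 0.9
    elif recent_occupancy > 0.8:
        demand_multiplier = 1.1
    else:
        demand_multiplier = 1.0

    return round(blended * seasonal_multiplier * demand_multiplier, 2)


print(suggest_nightly_rate(base_rate=100.0, month=7,
                           neighbour_rates=[90, 110, 120],
                           recent_occupancy=0.35))  # e.g. 107.64
```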

Why wasn’t the gap mitigated? 

The algorithm’s ability to mitigate the revenue gap in Airbnb is dependent on its adoption rate among the black community. 

Black hosts were 41 percent less likely to adopt the algorithm, and at such low adoption rates the algorithm’s effect reverses. Its recommendations skew towards the white demand curve, generating lower, sub-optimal prices for black hosts compared with their white counterparts.

The algorithm arrives at the same price for comparable black- and white-owned properties; race is not a factor in its pricing. Still, demand for black hosts’ properties is more responsive to price changes, so the downward price correction led to greater occupancy for black hosts.

The black community is a minority at both the neighbourhood and city level, so the algorithm is trained on data that represents the demand curve of white hosts more than that of black hosts. As a result, the race-blind algorithm sets prices that lean towards the optimal price for the white demand curve.
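
A small numerical sketch shows why a single price fitted to pooled demand drifts towards the majority group’s optimum. The linear demand curves and group sizes below are invented for illustration; they are not figures from the study.

```python
# Toy illustration (not data from the study): with linear demand
# q(p) = a - b*p, the revenue-maximising price is p* = a / (2b).
# When one group dominates the data, the single price that maximises
# pooled revenue lands close to that group's optimum and away from
# the smaller, more price-sensitive group's optimum.

def optimal_price(a: float, b: float) -> float:
    """Revenue-maximising price for linear demand q = a - b*p."""
    return a / (2 * b)

# Hypothetical demand parameters: the majority group has more listings
# and less price-sensitive demand; the minority group has fewer
# listings and more price-sensitive demand.
a_major, b_major = 900.0, 3.0   # majority-group demand
a_minor, b_minor = 100.0, 1.0   # minority-group demand

# Separate optima for each group.
p_major = optimal_price(a_major, b_major)   # 150.0
p_minor = optimal_price(a_minor, b_minor)   # 50.0

# A race-blind model effectively fits one price to the pooled demand
# (a_major + a_minor) - (b_major + b_minor) * p.
p_pooled = optimal_price(a_major + a_minor, b_major + b_minor)  # 125.0

print(f"majority optimum: {p_major}, minority optimum: {p_minor}, "
      f"pooled price: {p_pooled}")
# The pooled price (125) sits far closer to the majority optimum (150)
# than to the minority optimum (50), so minority hosts receive prices
# further from their own revenue-maximising level.
```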

There was also a difference in demand between properties owned by white hosts and those owned by black hosts. Many factors could explain this, but the main suggested reasons are racial bias, differences in education, and black hosts’ poorer access to other resources. The study also indicates that the revenue gap will be difficult to bridge as long as racial discrimination remains ingrained in society.

Recommendations of the study

The study maintains that a racially blind algorithm in a society marked by racism is not the best way to approach business, and that perhaps no algorithm can close the gap on its own. It outright rejects the idea of hiding the host’s race from the guest, arguing that this would lead to unwanted incidents or to guests avoiding neighbourhoods about which they hold stereotypes.

The study suggests developing an algorithm that takes a host’s race and background, such as their socioeconomic situation and other factors, into account before ‘Smart Pricing’ their property. Because an individual’s race is protected under US law and cannot be used as a determining factor in business decisions, the researchers ask policymakers to reflect on whether the law should change.

The study concludes that a racially blind algorithm for a racially divided society doesn’t bridge any gap, economic or ethical, but only widens it.

Meenal Sharma

I am a journalism undergrad who loves playing basketball and writing about finance and technology. I believe in the power of words.