According to a recent report published by the International Labour Organization (ILO), digital labour platforms erode workers’ rights and quality of life.
The ILO report analysed the role of digital labour platforms in transforming the nature of work, highlighting the challenges gig economy workers face.
The number of digital labour platforms increased fivefold between 2010 and 2020, and 8% of them are based in India. The platforms received a further boost during the pandemic, as millions who were laid off turned to them to make ends meet.
Below, we look at the issues gig workers face in an algorithm-driven world.
Lack Of Transparency
The data collected by digital labour platforms helps organisations enhance operations, maximise performance, and accelerate decision-making, ultimately improving ROI.
However, this data has created a power imbalance between the companies collecting it and the gig workers generating it. The generated data is treated entirely as the property of the platforms. The lack of transparency and restricted access to personal data leave gig workers with no leverage against these platforms in case of disputes. In sum, the house always wins.
Regulations like the GDPR in Europe allow gig workers to obtain a copy of their data. Taxi drivers in the UK have filed a case against Uber for withholding data in violation of the GDPR. However, other countries lack such strong laws to uphold workers’ rights.
The accrual of data also leads to monopolies, or ‘data-opolies’. For instance, Uber has acquired a number of its competitors, such as Careem, Cornershop, and Postmates. Uber now has exclusive access to the data from these companies, raising concerns over the privacy rights of the gig workers who worked for them.
The lack of transparency with regard to the source code of these algorithms is another pressing concern. Workers are in the dark about the underlying decision-making processes used to rate them. The odds are stacked against workers when an algorithm misfires and shows them in a bad light, since they lack the resources to make their case.
Meanwhile, accessing the underlying source code is problematic because it is protected under trade secrecy laws and intellectual property rules under the WTO. The EU has found itself in a tricky situation, as the WTO e-commerce deal poses future challenges to its digital policy objectives, such as adopting rules that mandate external audits for AI systems.
Loss Of Autonomy
This opaque ‘algorithmic management’, built on the data the workers themselves generate, defines their everyday work experience, performance, and achievement. There is a long history of companies using selective data to get off the hook in court.
The ILO survey uncovers the issues gig economy workers face: platforms asking them to install specific software, clients tracking their working hours, pressure to be available at specific times, and, in some instances, being monitored while working.
Taxi and delivery apps monitor data to the point of using GPS to define the optimal routes drivers should take, while carefully tracking the time spent on each ride or delivery.
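The time-tracking described above can be sketched, purely illustratively, as a comparison of a driver's actual trip duration against the platform's own estimate for the optimal route. The function name and tolerance value are assumptions for the sketch, not any real platform's logic:

```python
# Hypothetical sketch of ride-time monitoring: the platform compares the
# driver's actual duration against its estimate for the "optimal" route
# and flags trips that run well over it. Threshold is illustrative.

def flag_deviation(actual_min: float, estimated_min: float,
                   tolerance: float = 0.25) -> bool:
    """Flag the trip if it ran more than `tolerance` over the estimate."""
    return actual_min > estimated_min * (1 + tolerance)

print(flag_deviation(28, 25))  # within 25% of the estimate -> False
print(flag_deviation(35, 25))  # 40% over the estimate -> True, flagged
```

Even a rule this simple shifts discretion away from the driver: any detour, whatever the reason, registers as a deviation.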
Secondly, the gig economy’s major attraction is supposedly flexible working hours that allow workers to decline orders or requests for valid reasons such as exhaustion, safety, or personal commitments.
However, the ILO survey showed that a sizable portion of app-based taxi and delivery workers could not cancel or refuse orders because it might affect their ratings. Respondents even mentioned that they get very little time to decide whether to accept or decline an order. Uber drivers, for instance, have only 15 to 40 seconds to accept a ride, based on limited information.
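The short acceptance window can be sketched roughly as follows. This is an illustrative model of the behaviour respondents describe, assuming the low end of the 15-40 second range; the class and field names are invented for the sketch, not Uber's actual implementation:

```python
from dataclasses import dataclass

# Hypothetical sketch of a short acceptance window: an offer expires after
# `window_s` seconds, and a late response is treated as a decline.

@dataclass
class RideOffer:
    offered_at: float       # seconds since epoch when the offer was pushed
    window_s: float = 15.0  # 15-40 s per the survey; assume the low end

    def response_counts(self, responded_at: float) -> bool:
        """A response only counts if it arrives inside the window."""
        return responded_at - self.offered_at <= self.window_s

offer = RideOffer(offered_at=1000.0)
print(offer.response_counts(1010.0))  # 10 s later -> True, still in time
print(offer.response_counts(1020.0))  # 20 s later -> False, counts as declined
```

Under such a rule, hesitating for even a few seconds over limited information is indistinguishable from refusing the job.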
Thirdly, these algorithms are over-reliant on ratings received from the clients. This is especially true when using online web-based platforms.
Sometimes, when workers do not receive a rating, the platforms log the job as incomplete work. At other times, gig workers receive unfair or fraudulent ratings unrelated to their actual work. Nevertheless, these ratings are factored into the algorithms.
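Both behaviours can be sketched in a few lines. This is an illustrative model of what the report describes, not any real platform's code; the record fields and function name are assumptions:

```python
from typing import Optional

# Hypothetical sketch of the two rating behaviours: a missing client rating
# is logged as incomplete work, and every submitted rating, fair or not,
# is counted toward the worker's record with no fairness check.

def update_worker_record(record: dict, rating: Optional[float]) -> dict:
    if rating is None:
        # No rating received: the platform counts the job as incomplete.
        record["incomplete_jobs"] += 1
    else:
        # Every submitted rating is factored in, however unfair.
        record["ratings"].append(rating)
    return record

record = {"ratings": [], "incomplete_jobs": 0}
update_worker_record(record, 5.0)
update_worker_record(record, None)  # client never rated -> "incomplete"
update_worker_record(record, 1.0)   # possibly fraudulent, still counted
print(record)  # {'ratings': [5.0, 1.0], 'incomplete_jobs': 1}
```

The asymmetry is the point: the worker bears the cost both of a client's silence and of a client's bad faith.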
Fourthly, rejection of work is also common on these platforms. According to the ILO survey, the high rate of unfair rejections indicates that work tends to be supervised by algorithms rather than by humans. Algorithms can also be designed so that tasks are approved based on the majority of responses, regardless of the correct response, leading to unfair rejections. These rejections can have implications for future work opportunities and can even lead to the deactivation of a worker’s account.
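The majority-based approval described above can be sketched as follows. This is a generic illustration of the mechanism, with invented names and data, showing how a correct minority answer gets rejected simply for disagreeing with the crowd:

```python
from collections import Counter

# Hypothetical sketch of majority-vote task approval: a microtask answer is
# approved only if it matches the most common answer, with no check of
# which answer is actually correct.

def approve_by_majority(responses: list) -> dict:
    """Map each distinct answer to whether the majority rule approves it."""
    majority, _ = Counter(responses).most_common(1)[0]
    return {answer: (answer == majority) for answer in set(responses)}

# Three workers answer "cat" (incorrect), two answer "dog" (correct):
responses = ["cat", "cat", "cat", "dog", "dog"]
verdicts = approve_by_majority(responses)
print(verdicts["dog"])  # False: the correct answer is unfairly rejected
print(verdicts["cat"])  # True: the incorrect majority answer is approved
```

Since the rule never consults ground truth, a systematic error shared by the majority is enough to penalise every worker who got the task right.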
Lastly, the study observed that even the redressal mechanisms for contesting ratings or unfair rejections did not work smoothly. Fewer than half of the respondents knew such mechanisms existed, and among them, fewer than a third had used the platform to contest or appeal a rating. Almost one in five of these said their rating was not reversed, even when their performance had been affected by factors beyond their control.
The ratings, rejections, and inability to resolve conflicts have led to reduced work, lost bonuses, penalties, and even deactivation of workers’ accounts.
Transparency in the data and algorithms of digital labour platforms is crucial. The gig workers generating the data should benefit from it and have access to the personal data collected. Moreover, a well-thought-out data law for gig economy workers is the need of the hour to hold digital platforms accountable.