
Why China’s Recent Crackdown on Tech Algorithms Should Bother Us

Chinese IT giants have shared details of their 'prized algorithms' with the Cyberspace Administration of China.

Recently, Chinese technology giants shared details of their ‘prized algorithms’ with the Cyberspace Administration of China (CAC). The move comes with its own set of drawbacks and adds to the ongoing worldwide debate around privacy and algorithmic politics. So far, companies in the West have kept regulators at bay by arguing that their algorithms are trade secrets. But for how long?

The CAC released a list of 30 algorithms from companies including Alibaba and Tencent, alongside brief descriptions of their purpose. Earlier in 2022, China brought in a law governing the way tech companies use recommendation algorithms.

Data is attractive. It can be sold, used to observe ongoing trends and even to decide whom to buy out next. It is a vital tool for business. Unfortunately, in the battle between IT companies and governments to commodify as much data as possible, user privacy gets caught in the crossfire.

Tech companies’ secret ingredient: Algorithms

ByteDance’s algorithm for ‘Douyin’, the Chinese version of TikTok, recommends graphics, videos, products and services that may interest users, based on behavioural data such as clicks and likes. The algorithm for ‘Taobao’, Alibaba’s Chinese marketplace, recommends content on the app according to a user’s search history.

These AI-driven recommendation algorithms are highly valuable trade secrets that have come to govern everyday life in China and elsewhere, determining what people watch and buy, and even the routes delivery workers take.

The privacy concerns of IT firms handing information to governments go far beyond Alibaba or China. Once governments gain access to data owned by companies, they can leverage it in several ways: combining data from multiple sources to better understand and target individuals, and to better interpret the inter- and intrapersonal dynamics between people.

What about India? 

In February 2022, the government released a draft policy on data accessibility and use. Among other things, it proposed that all central and state government bodies be required to share citizens’ data and create a searchable database. Several industry bodies and activists raised concerns about the draft policy, arguing that it would put people’s private data at risk.

But this wasn’t the initial plan. In 2019, the government wanted to make it compulsory for IT companies to sell the data they collect to anyone in the country, including the government and private entities.

Founded after the 26/11 attacks and operational since 2020, the National Intelligence Grid (NATGRID), an attached office of the Ministry of Home Affairs, facilitates systematic government access by giving authorised agencies the ability to connect 21 databases from government and private sector organisations.

NATGRID complicates the picture of governmental access to information because it does not operate under any legislation and claims only to connect databases to enable tracking. Since its regulations and procedures have not been made public, intelligence and law enforcement agencies could potentially access any information held by a private sector company without authorisation or notification.

In 2010, the Indian government threatened to ban RIM’s BlackBerry services unless it was given real-time, direct access to BlackBerry’s communication traffic. The long-running dispute ended in 2013 after BlackBerry agreed to provide the authorities with a way to intercept consumers’ messages exchanged on its platform.

Systematic access is growing in India’s mobile and IT sectors. It is justified by the logic of national security and crime detection but, in practice, there is a disconnect between policy and its implementation.

The government’s growing demand for disclosure and access to private-sector data can be observed in the aforementioned cases. 

The West

The US and the European Union have yet to introduce anything similar to China’s law, although several bills, including ones in California and New York City, propose a ‘solution’: forcing tech companies to share some of the data they collect.

The New York Times’ ‘Privacy Project’ looked into all the different ways people are losing their privacy. It’s a thorough, often bewildering, study.

It is noteworthy that government bodies across the world have been eyeing IT majors for their data under the label of security. An in-depth study of the subject reveals how deep the concerns around user privacy, control and algorithmic politics run, even as policies and regulations remain in a constant state of flux.

Tasmia Ansari

Tasmia is a tech journalist at AIM, looking to bring a fresh perspective to emerging technologies and trends in data science, analytics, and artificial intelligence.
