
Facebook’s Content Moderators A Miss?

  • An investigative report suggests that Facebook has been used to recruit, train and pay hit-men by a Mexican drug cartel.

In January this year, a Facebook investigator revealed that a Mexican drug cartel was using the platform to recruit, train and pay hitmen. Although this was a clear violation of Facebook’s policies, the tech giant did little to stop it. The cartel posted on both Facebook and its photo-sharing social media platform, Instagram.

The Wall Street Journal reviewed Facebook’s internal documents and found that the company’s employees had been raising alarms about the menacing use of the platform in some developing countries. Yet, even though Facebook’s user base in these countries is huge and constantly expanding, the company did little to stop the abuse.


An investigation of posts on Facebook and Instagram, along with private messages on these platforms, revealed that they were actively used to recruit teenagers from developing countries for hitman training camps. Facebook hosted multiple pages under the name CJNG (short for Cártel Jalisco Nueva Generación), featuring photos of guns and crime scenes. Under the company’s policies, the accounts should have been removed automatically, as the cartel was internally labelled ‘Dangerous’. Yet the pages remained active for at least five months before they were taken down.

Rising concerns

The Mexican drug cartel is just one of the many harmful operations that Facebook is breeding on its platform. Internal documents further revealed that employees have flagged Middle East-based human traffickers using the platform to lure women into abusive employment situations. The women are promised work but are instead treated like slaves or forced into sex work.

Similarly, armed groups in Ethiopia have been leveraging Facebook to incite violence against the country’s ethnic minorities. Furthermore, Facebook employees have warned about people across the world using the platform to sell pornography and trade organs on the black market, and about governments using it to act against political dissent.

Much of this illicit use persists because, in some of the countries where Facebook operates, few or none of its moderators speak the local languages and dialects needed to identify harmful or dangerous content. The internal documents reveal that while Facebook removes some of these pages from its platform, others operate openly. The social media giant has taken down the offensive posts, but it is yet to fix the systems that could prevent similar pages from forming in the future. Facebook’s priorities continue to be retaining users and helping business partners. The documents also reveal that Facebook is sometimes busy pleasing authoritarian governments whose borders it needs to operate within.

Content moderators: A miss 

Facebook has hundreds of millions more users in developing countries than in the USA. Ninety per cent of its monthly users reside outside the USA and Canada, and the scenario is similar in Europe, where its user growth has stalled. In most developing countries, Facebook serves as the primary online channel for communication and a major source of news. Betting on this growth in developing countries, the social media giant plans to introduce technologies such as satellite internet and expanded WiFi.

Facebook claims to have deployed a strategy that includes employing global teams who cover more than 50 languages, investing in educational resources and partnering with local experts and third-party fact-checkers to keep its platform safe. 

But that isn’t enough. 

Facebook does not employ people who speak the relevant languages to monitor situations like these. For some of these languages, it has also failed to build automated systems or classifiers that could spot abuse. The AI systems that form the backbone of Facebook’s enforcement do not support most of the languages used on the platform.


To add to this, Facebook does not publish its community standards in all the languages used on its platform. Users therefore never learn the rules they are supposed to follow, a gap that contributed to the platform’s abuse in Ethiopia.

Similarly, Facebook’s content reviewers covering Arabic-speaking countries mostly speak Moroccan Arabic, so they are often unable to spot abusive or violent content published in other dialects, or end up pulling down inoffensive content. Facebook’s enforcement algorithms are likewise incapable of handling different dialects.

Closer to home, India, which houses more than 300 million Facebook users, faces similar nightmares. When the company’s researchers set up a test account as an Indian female user, things quickly went south: the user’s feed was bombarded with polarised nationalist content, misinformation and violence.

Summing up 

Last week, Facebook said it was increasing the review capacity of its content moderation systems, expanding them to cover various Ethiopian languages to prevent the posting of harmful content. It also claims to have deployed a dedicated team to reduce the risks in Ethiopia.

After Apple warned that it would remove Facebook and its other apps from the App Store, Facebook has sped things up. However, its focus remains on developed, English-speaking countries. Brian Boland, a former Vice President at Facebook, calls this callous behaviour ‘the cost of doing business’: Facebook uses developing and often poorer countries to grow its user base, while focusing its safety efforts on rich markets and those with powerful governments.
