A New AI Tool To Detect & Remove Caste-Based Abuse From Social Media Platforms

Amid increasing online harms, from cyberbullying to fake news, Social Media Matters has partnered with Spectrum Labs to launch a Behaviour Identification Model that detects caste discrimination within online communities.

According to the company, users’ social media feeds carry a spectrum of information, from personal and mundane updates to political opinions and community building. This has made online communities a fertile ground for groups organised around ethnicity or caste, which is why Social Media Matters designed the Behaviour Identification Model to detect such discrimination.

The company stated in its release that the model has been made available through Spectrum’s behaviour identification solution. It is designed to be flexible and fit into any workflow, with deployment options that include a SaaS offering or an on-premise binary.

According to the company, the model is currently trained to detect caste discrimination within all forms of text data, including users’ status updates, messages, tweets and comments. Both deployments offer a streaming API or a batch mode for data processing.
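Spectrum’s interface is not documented in the release, but the batch-versus-streaming distinction can be sketched roughly in Python. Everything below, the endpoint URL, payload fields and the classify_batch/classify_stream helpers, is a hypothetical illustration under those assumptions, not Spectrum Labs’ actual API.

    # Hypothetical sketch only: the endpoint, payload fields and response shape
    # are assumptions for illustration, not Spectrum Labs' documented API.
    import requests

    API_URL = "https://moderation.example.com/v1/classify"  # placeholder endpoint
    API_KEY = "YOUR_API_KEY"                                 # placeholder credential

    def classify_batch(texts):
        """Send a list of comments, tweets or messages in one request (batch mode)."""
        resp = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            json={"behaviour": "caste_discrimination", "items": texts},
            timeout=30,
        )
        resp.raise_for_status()
        # Assume the service returns one label and confidence score per input item.
        return resp.json()["results"]

    def classify_stream(event):
        """Classify a single status update or comment as it arrives (streaming mode)."""
        return classify_batch([event["text"]])[0]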

Spectrum’s behaviour models are all designed to be updated continuously over time. The company works with customers to iterate on the baseline model on a regular cadence, ensuring it flags the content each customer needs. This process customises the results for every customer and helps them build trust in the model.

Some of the critical features of the Behaviour Identification Model for detecting caste discrimination include real-time recognition of and response to toxicity before it evolves into a bigger problem; multi-language detection, which lets the tool scale across regions; and secure deployment, which offers the power to understand the community while maintaining data privacy requirements.

The company stated that the results produced by the model are surfaced for customers in a way that plugs into existing moderation efforts. This includes webhooks into internal systems that let customers manage users and content, send alerts, route items for moderator review, and even pipe data into analytics platforms to see trends over time.
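To make the webhook idea concrete, the sketch below shows one way a platform could receive such callbacks and route them into its own moderation workflow. The payload fields, the 0.9 threshold and the helper functions are illustrative assumptions, not Spectrum’s documented webhook schema.

    # Hypothetical webhook receiver: payload fields, threshold and helpers are
    # illustrative assumptions, not Spectrum Labs' documented webhook schema.
    from flask import Flask, request, jsonify

    app = Flask(__name__)

    def hide_content(content_id):
        print(f"hiding content {content_id}")

    def send_alert(content_id, confidence):
        print(f"alerting moderators about {content_id} ({confidence:.2f})")

    def queue_for_review(content_id, confidence):
        print(f"queued {content_id} for human review ({confidence:.2f})")

    def log_to_analytics(event):
        print(f"analytics event: {event}")

    @app.route("/webhooks/moderation", methods=["POST"])
    def handle_flagged_content():
        event = request.get_json(force=True)
        behaviour = event.get("behaviour")              # e.g. "caste_discrimination"
        confidence = float(event.get("confidence", 0))
        content_id = event.get("content_id")

        if behaviour == "caste_discrimination" and confidence >= 0.9:
            # High confidence: act on the content and alert the team immediately.
            hide_content(content_id)
            send_alert(content_id, confidence)
        elif behaviour == "caste_discrimination":
            # Lower confidence: send to the moderation queue for human review.
            queue_for_review(content_id, confidence)

        log_to_analytics(event)                         # track trends over time
        return jsonify({"status": "received"}), 200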

According to Amitabh Kumar, founder of Social Media Matters, caste discrimination is one of the oldest evils still present in Indian society and, sadly, it is also reflected in cyberspace. That is why Social Media Matters, together with Spectrum, has created an artificial intelligence tool to help social media platforms such as Facebook, TikTok, Twitter and Instagram detect and remove caste-based abuse from their platforms.

“It will decrease the time taken for detection, and also decrease the constant stress that human moderators have to go through constantly dealing with abuse. Initially, the model is trained to work with several languages: English, Hindi, and a Hindi-English mix, and we’ll continue to upgrade it further,” said Kumar.

Spectrum is also offering a set of moderator tools through a UI called Guardian, which includes four main features: a moderation queue, an automation builder, retraining, and analytics. With this tool, the model’s results can either be routed into the Spectrum offering or plugged into a platform’s existing moderation efforts.

Speaking on the collaboration, Justin Davis, CEO of Spectrum Labs, said that the hardest part of building an AI model that effectively detects caste discrimination online is defining and understanding what caste discrimination is, and also what it is not.

“The team of Social Media Matters have taken bold strides in raising awareness about injustice and discrimination in many forms, so we could not have asked for better partners. Spectrum’s insights and expertise helped navigate the nuances, history, and politics of caste discrimination, to build a tool that can combat it effectively and inclusively. We were humbled and honoured to work with them,” concluded Davis.

Sejuti Das
Sejuti currently works as Associate Editor at Analytics India Magazine (AIM). Reach out at sejuti.das@analyticsindiamag.com
