ShareChat’s Philosophy of Utilising Cloud-Based Systems

“Shifting from Node.js to GoLang has given us incredible savings; we have cut down on 90% of the infrastructure on many of our services”, says Bhatia.

The Indian government’s 2020 ban on TikTok may have displeased millions of users, but it was good news for its competitors. Home-grown short-video platforms have grabbed every ounce of attention from the same user base, and ShareChat is firmly among them.

Analytics India Magazine spoke to Gaurav Bhatia, SVP of Engineering at ShareChat and Moj, to learn more about how ShareChat has built its cloud-first technology stack.


AIM: How has ShareChat built its technology stack in a cloud-first fashion?

Gaurav: ShareChat (Mohalla Tech Pvt Ltd) is India’s largest homegrown social media company, with 400+ million MAUs across both its platforms—ShareChat and Moj. We witness over 280 billion views per day, with over 165 million content pieces uploaded daily across both our platforms combined.

To manage and scale our services to serve hundreds of millions of users every month, we have built our technology stack in a cloud-first fashion. We run on Google Cloud Platform (GCP) and follow a microservices architecture, where business functionality and features are broken down into smaller components that are developed, deployed, and scaled independently. We build and run the microservices on a Kubernetes (k8s) cluster hosted in Google Cloud. Reliance on the Platform-as-a-Service (PaaS) model allows us to have central functions that provide best-in-class infrastructure to teams building product functionality.
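A minimal sketch of what one such independently deployable service could look like in Go, the language ShareChat says it is migrating to later in this interview. The `/feed` route, `FeedItem` schema, and `buildFeed` stub are hypothetical illustrations, not ShareChat’s actual code:

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

// FeedItem is an illustrative response record; the field names are
// invented, not ShareChat's real schema.
type FeedItem struct {
	VideoID string  `json:"videoId"`
	Score   float64 `json:"score"`
}

// buildFeed stands in for the real work: calling user, ranking and
// metadata services and assembling a personalised list.
func buildFeed(userID string) []FeedItem {
	if userID == "" {
		return nil
	}
	return []FeedItem{{VideoID: "v123", Score: 0.92}}
}

func feedHandler(w http.ResponseWriter, r *http.Request) {
	items := buildFeed(r.URL.Query().Get("userId"))
	if items == nil {
		http.Error(w, "missing userId", http.StatusBadRequest)
		return
	}
	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(items)
}

func main() {
	// Each such service is packaged in its own container and scaled
	// independently on Kubernetes.
	http.HandleFunc("/feed", feedHandler)
	fmt.Println(buildFeed("u42"))
	// Blocking call; uncomment to actually serve traffic:
	// http.ListenAndServe(":8080", nil)
}
```

Because each service owns one narrow responsibility, a traffic spike on the feed path scales only this deployment, not the upload or moderation services.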

AIM: Could you provide more details on how it uses Google Cloud to follow a microservices architecture?

Gaurav: All the services that power our applications can be broken down into small components. For example, on opening ShareChat or Moj, our Android application contacts our Video Feed Service, which determines the best-personalised list of videos to show the user. The Video Feed Service in turn relies on many internal services to fetch user details, the list of candidate videos and their metadata (e.g., likes, comments, shares), the A/B test experiment buckets the user is part of, and more.

When a user interacts with the content in the form of a successful watch, like, or skip, the events are recorded and sent to an Events service, which can then use these signals to better personalise the user’s Video Feed. Along similar lines, when a user uploads a video, it passes through multiple small services, including the Upload service, Content Moderation, and Encoding pipelines. Each one runs in its own Docker container on k8s and can be scaled up or down based on need.
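The feedback loop described here can be sketched as a simple scoring function over recorded events. This is an illustrative reduction: the real pipeline publishes events to Pub/Sub and processes them downstream, and the action weights below are invented:

```go
package main

import "fmt"

// Event is an illustrative engagement signal; in the real system such
// events flow through a message queue rather than an in-memory slice.
type Event struct {
	UserID string
	Action string // "watch", "like", "skip"
}

// engagementScore sketches how recorded events could feed back into
// personalisation: positive actions raise a user's affinity signal,
// skips lower it. The weights are made up for illustration.
func engagementScore(events []Event, userID string) int {
	score := 0
	for _, e := range events {
		if e.UserID != userID {
			continue
		}
		switch e.Action {
		case "watch":
			score += 1
		case "like":
			score += 3
		case "skip":
			score -= 1
		}
	}
	return score
}

func main() {
	events := []Event{
		{UserID: "u1", Action: "watch"},
		{UserID: "u1", Action: "like"},
		{UserID: "u1", Action: "skip"},
		{UserID: "u2", Action: "watch"},
	}
	fmt.Println("u1 score:", engagementScore(events, "u1")) // 1 + 3 - 1 = 3
}
```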

When new services are built or existing services are updated, they can be rolled out initially to a small percentage of users (in the form of a canary deployment) so that any unexpected errors or scenarios are caught and fixed quickly with minimal impact. The services running in the Kubernetes cluster rely on Google PaaS services including Pub/Sub, Dataflow, Spanner, and Bigtable. We perform data analytics using BigQuery.
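A canary rollout boils down to routing a small, stable fraction of users to the new version. In practice this is usually handled at the ingress or service-mesh layer rather than in application code, but the core idea can be sketched with a hash-based split (the function name and percentages are illustrative):

```go
package main

import (
	"fmt"
	"hash/fnv"
)

// inCanary decides, deterministically per user, whether a request should
// be routed to the canary release. Hashing the user ID keeps each user's
// assignment stable across requests, so one user sees one version.
func inCanary(userID string, percent uint32) bool {
	h := fnv.New32a()
	h.Write([]byte(userID))
	return h.Sum32()%100 < percent
}

func main() {
	// Route roughly 10% of users to the canary.
	for _, u := range []string{"alice", "bob", "carol"} {
		fmt.Printf("%s -> canary=%v\n", u, inCanary(u, 10))
	}
}
```

If error rates on the canary cohort stay flat, the percentage is raised until the new version takes all traffic; otherwise only that small cohort was affected.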

AIM: How does ShareChat provide best-in-class infrastructure to teams building product functionality?

Gaurav: At ShareChat, we have an internal platform engineering team that, among other things, builds abstractions on top of Google services, such as our Database, Queue, and Cache Drivers. Services are built and deployed in a CI/CD model, and we have also built internal tools, such as Atlas, that allow developers to log in and harness the power of Google Cloud in a few clicks while ensuring there are enough guardrails to prevent inadvertent errors. We invest a great deal in test automation and have a device lab with many physical devices that are hosted and can be used remotely to run automated tests for our Android and iOS apps.
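One common way such driver abstractions are structured in Go is an interface that product teams code against, with the platform team swapping in implementations backed by Spanner, Bigtable, or an in-memory store for tests. The `KV` interface below is a hypothetical sketch, not ShareChat’s actual driver API:

```go
package main

import (
	"errors"
	"fmt"
)

// KV is an illustrative storage abstraction: product code depends only on
// this interface, while the platform team owns the concrete drivers.
type KV interface {
	Get(key string) (string, error)
	Set(key, value string) error
}

// memKV is an in-memory implementation, useful as a stand-in for tests.
type memKV struct{ data map[string]string }

func newMemKV() *memKV { return &memKV{data: make(map[string]string)} }

func (m *memKV) Get(key string) (string, error) {
	v, ok := m.data[key]
	if !ok {
		return "", errors.New("not found")
	}
	return v, nil
}

func (m *memKV) Set(key, value string) error {
	m.data[key] = value
	return nil
}

func main() {
	var store KV = newMemKV() // a production driver would satisfy KV too
	store.Set("user:1", "gaurav")
	v, _ := store.Get("user:1")
	fmt.Println(v)
}
```

The guardrail value comes from the narrow surface: teams cannot reach past the interface into backend-specific features, so the platform team can tune, migrate, or rate-limit the backing store centrally.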

AIM: Has shifting your services to GoLang been beneficial for ShareChat? What are some of the services that you’d like to work on in the future?

Gaurav: Shifting from Node.js to GoLang has given us incredible savings; we have cut down on 90% of the infrastructure on many of our services. This has led not just to server cost optimisation but also to easier monitoring and maintainability, with quick scale-up and scale-down. We are currently on a journey to rewrite our entire product stack in GoLang. We are constantly evaluating the new services Google is offering, and we are excited about the security offerings as well as the observability and disaster recovery features Google Cloud is adding. In Kubernetes, we are thrilled about eBPF and are actively looking at service mesh. We are also very impressed with the early results we are seeing from the adoption of ScyllaDB, which is giving us excellent low latency and high throughput.
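One commonly cited reason Go services run on fewer machines is that goroutines make concurrent fan-out to backends very cheap while using all CPU cores. A sketch under those assumptions (the backend names and latencies are simulated, not measured figures from ShareChat):

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// fanOut runs several (simulated) backend fetches concurrently and
// collects their results. Goroutines cost kilobytes of stack, so a
// single server can multiplex very many in-flight requests.
func fanOut(fetchers map[string]func() string) map[string]string {
	var mu sync.Mutex
	var wg sync.WaitGroup
	out := make(map[string]string, len(fetchers))
	for name, fetch := range fetchers {
		wg.Add(1)
		go func(name string, fetch func() string) {
			defer wg.Done()
			v := fetch()
			mu.Lock()
			out[name] = v
			mu.Unlock()
		}(name, fetch)
	}
	wg.Wait()
	return out
}

func main() {
	start := time.Now()
	res := fanOut(map[string]func() string{
		"user":   func() string { time.Sleep(50 * time.Millisecond); return "profile" },
		"videos": func() string { time.Sleep(50 * time.Millisecond); return "candidates" },
	})
	// The two simulated 50ms calls overlap instead of running back to back.
	fmt.Println(res, "in", time.Since(start))
}
```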

AIM: What is the technology used behind ShareChat’s recommendation systems? How would it stand out from its competitors? 

Gaurav: At ShareChat we have many different kinds of content, including short videos, long-format videos, images, GIFs, microblog posts, and news. We have invested heavily in building ML infrastructure for rapid personalisation within a session, applied across all content surfaces. Having a common feature store that can be used for quick experimentation across each content surface, each of which has its own requirements, is a very powerful way to personalise. Additionally, building our Ranker as a service for different contextual needs has served us well. For instance, during festivals users look for content suited to sharing, as opposed to devotional content meant for personal consumption on other days.
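The idea of one ranker service with context-dependent behaviour can be sketched as a score blend whose weights shift with context. The fields and weights below are invented for illustration; a production ranker would use many learned features served from the feature store:

```go
package main

import "fmt"

// Content carries two illustrative model scores, not real features.
type Content struct {
	ID            string
	Shareability  float64 // predicted propensity to be shared
	PersonalScore float64 // predicted personal watch affinity
}

// rank blends the two scores with context-dependent weights, mirroring
// the idea of one ranker serving different contextual needs, e.g.
// boosting shareable content during festivals. Weights are made up.
func rank(c Content, festival bool) float64 {
	if festival {
		return 0.7*c.Shareability + 0.3*c.PersonalScore
	}
	return 0.3*c.Shareability + 0.7*c.PersonalScore
}

func main() {
	c := Content{ID: "v1", Shareability: 0.9, PersonalScore: 0.2}
	fmt.Printf("festival=%.2f everyday=%.2f\n", rank(c, true), rank(c, false))
}
```

Keeping the context switch inside one service means every content surface gets the same experimentation machinery, while only the weighting policy changes per context.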


Bhuvana Kamath

I am fascinated by technology and AI’s implementation in today’s dynamic world. Being a technophile, I am keen on exploring the ever-evolving trends around applied science and innovation.