ShareChat’s Philosophy of Utilising Cloud-Based Systems

“Shifting from Node.js to GoLang has given us incredible savings; we have cut down on 90% of the infrastructure on many of our services,” says Bhatia.

The 2020 Indian government ban on TikTok may have displeased millions of users, but it was definitely good news for its competitors. Home-grown short video platforms have grabbed every ounce of attention from the same user base, with ShareChat among those joining the fray. 

Analytics India Magazine spoke to Gaurav Bhatia, SVP of Engineering, ShareChat and Moj, to learn more about how ShareChat has built its cloud-first technology stack. 



AIM: How has ShareChat built its technology stack in a cloud-first fashion?

Gaurav: ShareChat (Mohalla Tech Pvt Ltd) is India’s largest homegrown social media company, with 400+ million MAUs across both its platforms—ShareChat and Moj. We witness over 280 billion views per day, with over 165 million content pieces uploaded daily across both our platforms combined.

To manage and scale our services to serve hundreds of millions of users every month, we have built our technology stack in a cloud-first fashion. We run on Google Cloud Platform (GCP) and follow a microservices architecture, where business functionality and features are broken down into smaller components that are developed, deployed, and scaled independently. We build and run the microservices on a Kubernetes (k8s) cluster hosted in the Google Cloud. Reliance on the Platform-as-a-Service (PaaS) model allows us to have central functions that provide best-in-class infrastructure to teams building product functionality.
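To make the microservice idea concrete, here is a minimal sketch of what one such independently deployable service might look like: a small Go HTTP service exposing a health endpoint of the kind Kubernetes probes poll. The service name, route, and response shape are illustrative assumptions, not ShareChat's actual contracts.

```go
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"net/http/httptest"
)

// healthPayload is what a Kubernetes liveness/readiness probe would read.
// The shape is illustrative, not ShareChat's real contract.
type healthPayload struct {
	Status string `json:"status"`
}

// healthBody builds the probe response; kept as a plain function so it can
// be exercised without starting a server.
func healthBody() ([]byte, error) {
	return json.Marshal(healthPayload{Status: "ok"})
}

// healthHandler serves the (hypothetical) /healthz route.
func healthHandler(w http.ResponseWriter, r *http.Request) {
	body, err := healthBody()
	if err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}
	w.Header().Set("Content-Type", "application/json")
	w.Write(body)
}

func main() {
	// An in-process test server stands in for the containerised deployment.
	srv := httptest.NewServer(http.HandlerFunc(healthHandler))
	defer srv.Close()

	resp, err := http.Get(srv.URL + "/healthz")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	b, _ := io.ReadAll(resp.Body)
	fmt.Println(string(b))
}
```

Each service built this way can be containerised and given its own Deployment, which is what lets the components scale independently.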


AIM: Could you provide more details on how ShareChat uses Google Cloud to follow a microservices architecture?

Gaurav: All the services that power our applications can be broken down into small components. For example, when a user opens ShareChat or Moj, our Android application contacts our Video Feed Service, which determines the best-personalised list of videos to show the user. The Video Feed Service in turn relies on many services internally to fetch the user details, a list of videos and metadata (e.g., likes, comments, shares), which A/B experiment buckets the user is part of, and more. 
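The aggregation pattern described above can be sketched in Go with a concurrent fan-out to the internal dependencies. The fetchers below are stubs standing in for RPC calls to hypothetical user-profile, video-candidate, and experiment services; the real service boundaries are not public.

```go
package main

import (
	"fmt"
	"sync"
)

// FeedResponse aggregates what the (hypothetical) internal services return.
type FeedResponse struct {
	UserName string
	Videos   []string
	ABBucket string
}

// Stand-ins for RPC calls to internal microservices: user profile,
// video candidates, and A/B experiment assignment.
func fetchUser(userID string) string     { return "user:" + userID }
func fetchVideos(userID string) []string { return []string{"v1", "v2", "v3"} }
func fetchABBucket(userID string) string { return "feed-ranker-v2" }

// buildFeed fans out to the three services concurrently, mirroring how an
// aggregator keeps latency close to that of its slowest single dependency.
func buildFeed(userID string) FeedResponse {
	var resp FeedResponse
	var wg sync.WaitGroup
	wg.Add(3)
	go func() { defer wg.Done(); resp.UserName = fetchUser(userID) }()
	go func() { defer wg.Done(); resp.Videos = fetchVideos(userID) }()
	go func() { defer wg.Done(); resp.ABBucket = fetchABBucket(userID) }()
	wg.Wait()
	return resp
}

func main() {
	fmt.Printf("%+v\n", buildFeed("42"))
}
```

Writing to distinct struct fields from separate goroutines is safe here because each goroutine touches its own field and the WaitGroup provides the synchronisation point.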

When a user interacts with the content in the form of a successful watch, like, or skip, the events are recorded and sent to an Events service, which can then use these signals to better personalise the user's Video Feed. Along similar lines, when a user uploads a video, it goes through multiple small services, including the Upload service, Content Moderation, and Encoding pipelines. Each one runs in its own Docker container on k8s and can be scaled up or down based on needs and functionality. 

When new services are built or existing services are updated, they can be rolled out initially to a smaller percentage of users (in the form of a canary deployment) so that any unexpected errors or scenarios can be caught and fixed quickly with minimal impact. The services running in the Kubernetes cluster rely on Google PaaS services including Pub/Sub, Dataflow, Spanner, and BigTable. We perform data analytics using BigQuery. 
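One common way to implement the canary routing described here is to hash a stable identifier into a percentage bucket, so the same user always lands in the same cohort. This is a generic sketch of the technique, not ShareChat's rollout mechanism.

```go
package main

import (
	"fmt"
	"hash/fnv"
)

// inCanary deterministically routes roughly `percent` percent of users to
// the new build. Because the hash of a given user ID is stable, a user
// stays in the same cohort for the whole rollout, so errors surface
// consistently and a rollback affects only that small slice of traffic.
func inCanary(userID string, percent uint32) bool {
	h := fnv.New32a()
	h.Write([]byte(userID))
	return h.Sum32()%100 < percent
}

func main() {
	canary := 0
	for i := 0; i < 1000; i++ {
		if inCanary(fmt.Sprintf("user-%d", i), 5) {
			canary++
		}
	}
	fmt.Printf("%d of 1000 users routed to canary\n", canary)
}
```

In practice the same decision can also be made at the infrastructure layer (e.g. weighted traffic splitting between two Kubernetes Deployments), but a hash-based check keeps the cohort stable per user.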

AIM: How does ShareChat provide best-in-class infrastructure to teams building product functionality?

Gaurav: At ShareChat, we have an internal platform engineering team that performs several functions, including building abstractions on top of Google services, such as our Database, Queue, and Cache drivers. Services are built and deployed in a CI/CD model, and we have also built internal tools, such as Atlas, that allow developers to log in and harness the power of the Google Cloud with a few clicks while ensuring that we have enough guardrails to prevent any inadvertent errors. We invest a great deal in test automation and have a device lab with many physical devices, hosted so they can be used remotely to run automation tests for our Android and iOS apps. 
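The driver abstractions mentioned here typically take the form of a thin interface that product teams code against, with the managed backend swappable underneath. The interface and in-memory implementation below are hypothetical, shown only to illustrate the pattern.

```go
package main

import (
	"errors"
	"fmt"
	"sync"
)

// Cache is the kind of thin abstraction a platform team might put in front
// of a managed service; product teams depend on the interface, so the
// backing store can change without touching their code.
type Cache interface {
	Get(key string) (string, error)
	Set(key, value string) error
}

// ErrMiss signals an absent key in a store-agnostic way.
var ErrMiss = errors.New("cache miss")

// memCache is an in-memory stand-in used so this sketch runs anywhere.
type memCache struct {
	mu sync.RWMutex
	m  map[string]string
}

func newMemCache() *memCache { return &memCache{m: make(map[string]string)} }

func (c *memCache) Get(key string) (string, error) {
	c.mu.RLock()
	defer c.mu.RUnlock()
	v, ok := c.m[key]
	if !ok {
		return "", ErrMiss
	}
	return v, nil
}

func (c *memCache) Set(key, value string) error {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.m[key] = value
	return nil
}

func main() {
	var c Cache = newMemCache()
	c.Set("feed:42", "v1,v2,v3")
	v, _ := c.Get("feed:42")
	fmt.Println(v)
}
```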

AIM: Has shifting your services to GoLang been beneficial for ShareChat? What are some of the services that you’d like to work on in the future? 

Gaurav: Shifting from Node.js to GoLang has given us incredible savings; we have cut down on 90% of the infrastructure on many of our services. This has led to not just server cost optimisation but also easier monitoring and maintainability with quick scale-up/scale-down. We are currently on a journey to rewrite our entire product stack using GoLang. We are constantly evaluating the new services Google is offering, and we are excited about the security offerings as well as the observability and disaster recovery features Google Cloud is adding. In Kubernetes, we are thrilled about eBPF and are actively looking at service mesh. We are also very impressed with the early results we are seeing with the adoption of ScyllaDB, which is giving us excellent low latency and high throughput.
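One reason a Go rewrite can shrink a fleet so dramatically is that goroutines cost only a few kilobytes of stack, so a single instance can work on many requests concurrently. The worker-pool sketch below illustrates the idiom; the workload function is a hypothetical stand-in for per-request work.

```go
package main

import (
	"fmt"
	"sync"
)

// process stands in for per-request work (e.g. scoring one video).
func process(n int) int { return n * n }

// runPool fans work out over a fixed number of goroutines reading from a
// shared channel. Each worker writes to its own slot in the output slice,
// so no extra locking is needed.
func runPool(inputs []int, workers int) []int {
	jobs := make(chan int)
	out := make([]int, len(inputs))
	var wg sync.WaitGroup
	for w := 0; w < workers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for i := range jobs {
				out[i] = process(inputs[i])
			}
		}()
	}
	for i := range inputs {
		jobs <- i
	}
	close(jobs)
	wg.Wait()
	return out
}

func main() {
	fmt.Println(runPool([]int{1, 2, 3, 4}, 2))
}
```

The same pattern scales to thousands of goroutines per instance, which is part of why a single-threaded runtime like Node.js often needs more replicas for the same traffic.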

AIM: What is the technology used behind ShareChat’s recommendation systems? How would it stand out from its competitors? 

Gaurav: At ShareChat we have many different kinds of content, including short videos, long-format videos, images, GIFs, microblog posts, and news. We have invested heavily in building ML infrastructure for rapid personalisation during a session that applies across all content surfaces. Having a common feature store that can be used for quick experimentation across each content surface—which has its own requirements—is a very powerful way to personalise. Additionally, building our Ranker as a service for different contextual needs has served us well. For instance, during festivals users look for content suited to sharing, whereas devotional content is consumed personally on other days. 
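The Ranker-as-a-service idea can be sketched as a scoring function whose weights depend on context. The feature names, weights, and festival signal below are invented for illustration; they are not ShareChat's actual ranking features.

```go
package main

import (
	"fmt"
	"sort"
)

// Features is what a (hypothetical) feature store might return per video.
type Features struct {
	ID         string
	Shareable  float64 // how well the item spreads when shared
	Devotional float64 // fit for personal consumption
}

// rank scores candidates with context-dependent weights, echoing the idea
// of a ranker serving different contextual needs: during festivals,
// shareability is upweighted; on other days, personal-consumption fit wins.
func rank(cands []Features, festival bool) []string {
	wShare, wDev := 0.3, 0.7
	if festival {
		wShare, wDev = 0.7, 0.3
	}
	sort.SliceStable(cands, func(i, j int) bool {
		si := wShare*cands[i].Shareable + wDev*cands[i].Devotional
		sj := wShare*cands[j].Shareable + wDev*cands[j].Devotional
		return si > sj
	})
	ids := make([]string, len(cands))
	for i, c := range cands {
		ids[i] = c.ID
	}
	return ids
}

func main() {
	cands := []Features{{"a", 0.9, 0.1}, {"b", 0.1, 0.9}}
	fmt.Println(rank(cands, true))  // festival context favours "a"
	fmt.Println(rank(cands, false)) // other days favour "b"
}
```

In a production ranker the weights would come from a learned model rather than constants, but the shape — shared features, context-dependent scoring — is the same.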


Bhuvana Kamath


