According to a Stack Overflow developer survey, while TensorFlow is the most in-demand library, PyTorch is the more loved one. Data from Papers With Code suggests PyTorch is the favourite library among researchers. When it comes to frameworks (repositories are classified by framework by inspecting the contents of every GitHub repository and checking for imports in the code), a search from January 2020 to December 2021 shows PyTorch accounts for 75% of implementations, far higher than TensorFlow (below 25%), JAX, MXNet, PaddlePaddle, Torch, Caffe2, and MindSpore.
PyTorch’s soaring popularity
University of Geneva professor Francois Fleuret held a poll on Twitter earlier this month and asked developers to vote for their main framework and 72% voted for PyTorch.
Back in 2020, Elon Musk replied to Pune-based engineer Pranay Pothole in a tweet thread about the AI capabilities at Tesla. To Pranay's question, "PyTorch for building NN?", Musk said PyTorch is "the most frequently used external tool set/library". The response set off a debate on the microblogging site, and Pranay was later seen defending PyTorch, saying, "Pytorch is as fast and sometimes faster than Tensorflow because the execution in pytorch is by default, asynchronous."
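The asynchrony Pranay refers to applies to GPU work: PyTorch enqueues CUDA kernels and returns control to Python immediately, so naive wall-clock timing can be misleading. A minimal sketch (falling back to CPU when no GPU is present; the matrix size is an arbitrary choice for illustration):

```python
import time
import torch

# On a CUDA device, kernel launches are asynchronous: the Python call
# returns as soon as the work is queued. torch.cuda.synchronize()
# blocks until all queued work has actually finished.
device = "cuda" if torch.cuda.is_available() else "cpu"
x = torch.randn(1024, 1024, device=device)

start = time.perf_counter()
y = x @ x  # on CUDA this only enqueues the matmul kernel
if device == "cuda":
    torch.cuda.synchronize()  # wait for the matmul to complete before timing
elapsed = time.perf_counter() - start

print(f"device={device}, matmul took {elapsed:.4f}s, shape={tuple(y.shape)}")
```

Without the synchronize call, the measured time on a GPU reflects only the launch overhead, not the computation itself.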
PyTorch’s API has become a runaway hit among developers. While some users stick to JAX and NumPy for numerical work, they are open to PyTorch for deep learning.
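Part of that appeal is how closely PyTorch's tensor API tracks NumPy's. A small sketch of the overlap (the zero-copy `.numpy()` bridge is the interop path the two libraries document):

```python
import numpy as np
import torch

# PyTorch mirrors NumPy's array API: same reductions, same
# broadcasting semantics, same matmul conventions.
a_np = np.arange(6.0).reshape(2, 3)
a_pt = torch.arange(6.0).reshape(2, 3)

same_sum = a_np.sum() == a_pt.sum().item()
same_shape = (a_np.T @ a_np).shape == tuple((a_pt.T @ a_pt).shape)

# Interop: a CPU tensor converts to a NumPy array sharing the same memory.
back_to_np = a_pt.numpy()
values_match = np.array_equal(back_to_np, a_np)

print(same_sum, same_shape, values_match)
```

For a NumPy user, the main additions to learn are device placement and autograd, not a new array vocabulary.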
Many PyTorch users have actually switched over from TensorFlow and find PyTorch much more user-friendly.
Tony Petrov, Co-Founder and AI Engineer at Netlyt, wrote, “Personally for NLP tasks I use PyTorch. The reason for this is not that Keras or Tensorflow are inferior in terms of performance, it’s rather their lack of supporting libraries.”
Mukhtab Mayank, co-founder of Parallel Dots, said: “PyTorch is the most productive and easy-to-use framework according to me. It is very easy to deploy in production for medium-sized deployments in the form of the PyTorch library we know.”
PyTorch is an open-source machine learning library based on the Torch library. It was primarily developed by Facebook’s AI Research lab (FAIR), where Adam Paszke, Soumith Chintala, Sam Gross, and Gregory Chanan built the system. The project now has core maintainers and a broader set of developers who directly merge pull requests and own various parts of the core codebase.
Many researchers and users who don’t focus on production-level code love PyTorch because coding in it is straightforward. With TorchScript, they get high flexibility in transitioning between eager mode and graph mode, and can use both.
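The eager/graph transition can be sketched in a few lines: the same Python function runs op by op in eager mode, or is compiled by `torch.jit.script` into a TorchScript function that the graph-mode runtime executes. The activation function below is an arbitrary illustration:

```python
import torch

# An ordinary Python function using PyTorch ops (eager mode by default).
def gelu_like(x):
    return 0.5 * x * (1.0 + torch.tanh(x))

# Compile the same function to TorchScript (graph mode).
scripted = torch.jit.script(gelu_like)

x = torch.linspace(-2.0, 2.0, steps=5)
eager_out = gelu_like(x)   # executed op by op in Python
graph_out = scripted(x)    # executed by the TorchScript runtime

print(torch.allclose(eager_out, graph_out))
```

The scripted version can also be serialized and loaded in C++ without a Python dependency, which is what makes the eager-to-graph path useful for deployment.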
PyTorch also has the backing of a thriving community. PyTorch has its own forum, but discussion extends to Reddit, Quora, Twitter, and GitHub. From tutorials and discussions to opinions and advice, the PyTorch community is flourishing at a rapid pace. “PyTorch adopts a governance structure with a small set of maintainers driving the overall project direction with a strong bias towards PyTorch’s design philosophy where design and code contributions are valued,” according to the official website.
A Google Trends report shows PyTorch has attracted more search interest than TensorFlow over the last 12 months.
Important chunks of deep learning software are built on top of PyTorch, including Tesla Autopilot, Hugging Face’s Transformers, and Uber’s Pyro. The future of PyTorch looks bright.