Developed by Facebook AI Research (FAIR), PyTorch is one of the most widely used open-source machine learning libraries for deep learning applications. First released in 2016, PyTorch has since been gaining popularity among researchers and developers, often at the expense of TensorFlow.
At the NeurIPS conference in 2019, PyTorch appeared in 166 papers, whereas TensorFlow appeared in 74 papers. This year, NVIDIA GTC 2021 hosted over 50 different sessions related to PyTorch. Plus, Facebook has plans to migrate all its AI systems to PyTorch.
In this article, we will list free resources and tools to help you learn deep learning with PyTorch:
Courses & modules
Microsoft recently launched a free course called ‘PyTorch Fundamentals.’ This beginner-friendly module introduces you to key concepts to build machine learning models in multiple domains, including speech, vision, and natural language processing (NLP).
Course prerequisites include:
- Basic Python knowledge
- Basic understanding of how to use Jupyter Notebooks
- Machine learning basics
The official PyTorch tutorials help you learn PyTorch from scratch. The quick-start guide shows you how to load data, build deep neural networks, and train and save your models. Plus, it includes PyTorch Recipes, a collection of bite-sized, ready-to-deploy PyTorch code examples, and a PyTorch Cheat Sheet. You can access the tutorials on GitHub and run them on Google Colab.
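The load-build-train-save workflow the quick-start guide walks through can be sketched roughly as follows (the layer sizes, optimiser settings, and dummy batch here are illustrative, not taken from the tutorial):

```python
import torch
from torch import nn

# A small fully connected classifier (sizes are illustrative)
model = nn.Sequential(
    nn.Linear(28 * 28, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

# One training step on a dummy batch standing in for real loaded data
x = torch.randn(8, 28 * 28)          # batch of 8 flattened "images"
y = torch.randint(0, 10, (8,))       # dummy class labels

pred = model(x)                       # forward pass
loss = loss_fn(pred, y)
optimizer.zero_grad()
loss.backward()                       # backward pass
optimizer.step()                      # update weights

# Save the trained weights, then reload them
torch.save(model.state_dict(), "model.pt")
model.load_state_dict(torch.load("model.pt"))
```

In a real run, the dummy batch would be replaced by a `DataLoader` iterating over a dataset, with the training step repeated over many batches and epochs.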
Curated by Facebook AI in partnership with Udacity, this free course lets you learn the basics of deep learning and build your own deep neural networks using PyTorch. It also helps you get practical experience with PyTorch through coding exercises and projects implementing SOTA AI applications such as style transfer and text generation.
Check out the free course here.
In this tutorial, you will learn PyTorch basics (Torch and NumPy), how to build your first neural network (regression, classification, optimisers, etc.), advanced neural network architectures (CNN, RNN-classification, RNN-regression, AutoEncoder, etc.) and other concepts (training on GPU, Dropout, Batch Normalization, etc.). It is available on GitHub.
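The "Torch and NumPy" basics covered at the start of the tutorial come down to converting between the two: a minimal sketch of the interop, which the tutorial covers in more depth:

```python
import numpy as np
import torch

# NumPy array -> PyTorch tensor (the two share the same memory)
a = np.ones((2, 3))
t = torch.from_numpy(a)

# PyTorch tensor -> NumPy array (also shares memory)
b = t.numpy()

# Because memory is shared, an in-place edit on the NumPy side
# is visible from the tensor (and vice versa)
a[0, 0] = 5.0
```

This zero-copy sharing is why converting between NumPy and CPU tensors is cheap, but it also means in-place edits on one side silently affect the other.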
You will learn everything about neural network programming and PyTorch in detail in this course. Curated by DEEPLIZARD, it covers the basics of PyTorch and CUDA, alongside understanding why neural networks use GPUs.
The series touches upon the tensor and deep learning fundamentals, training a network, analysing results, tuning hyperparameters, and using TensorBoard with PyTorch for visual analytics. Start your PyTorch journey here.
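The "why neural networks use GPUs" part of the course boils down to the device-placement pattern below, a minimal sketch that falls back to the CPU when CUDA is unavailable (layer sizes are illustrative):

```python
import torch
from torch import nn

# Use the GPU when CUDA is available, otherwise fall back to the CPU
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Linear(4, 2).to(device)    # move the model's parameters to the device
x = torch.randn(3, 4, device=device)  # create the input on the same device
out = model(x)                        # computation runs on that device
```

Model and data must live on the same device; mixing a CPU tensor with a GPU model raises a runtime error, which is why the `.to(device)` pattern appears at the top of most PyTorch scripts.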
Check out their YouTube videos for more content around PyTorch here.
In this tutorial series, machine learning expert Harrison Kinsley, aka Sentdex, has curated in-depth video content around PyTorch, covering everything from the basics to advanced topics, including building your own deep neural networks, training models, model analysis and more. Check out the complete playlist here.
Co-authored by Eli Stevens, Luca Antiga and Thomas Viehmann, ‘Deep Learning With PyTorch’ covers the basics and abstractions of PyTorch in great detail, and explains the underpinnings of data structures like tensors and neural networks, ensuring you understand their implementation.
The book also covers advanced concepts such as JIT and deployment to production. Plus, it covers applications, taking you through the steps of using neural networks to help solve a complex medical problem. Check out the PDF version here. The book is also available on Amazon. The code for the ‘Deep Learning with PyTorch’ book is available on GitHub.
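As a taste of the JIT material the book covers, a plain Python function can be compiled with TorchScript into a serialisable graph; this is a minimal sketch (the function name and logic are illustrative, not from the book):

```python
import torch

@torch.jit.script
def scaled_relu(x: torch.Tensor, alpha: float) -> torch.Tensor:
    # TorchScript compiles this function ahead of time, so it can
    # run outside the Python interpreter (e.g. in a C++ deployment)
    return torch.relu(x) * alpha

out = scaled_relu(torch.tensor([-1.0, 2.0]), 3.0)
```

A scripted function or module can be saved with `torch.jit.save` and loaded from C++ via LibTorch, which is the deployment path the book's JIT chapter builds towards.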
Written by Aston Zhang, Zachary C. Lipton, Mu Li, and Alexander J. Smola, ‘Dive into Deep Learning’ offers an interactive deep learning experience with code, math and discussions across multiple frameworks, including NumPy/MXNet, PyTorch, and TensorFlow.
Currently, the book has been adopted at 175 universities across 40 countries. It teaches deep learning concepts from scratch. Check out the PDF version of the book here.