Earlier this month, Facebook AI launched an open-source machine learning library called Flashlight that lets developers and researchers execute AI/ML applications seamlessly via a C++ API. The library is currently available on GitHub.
Facebook AI said its machine learning library is intuitive and simple to use, as it contains only the most basic building blocks needed for research. Further, it claimed that the entire library and its machine learning training pipelines take only seconds to rebuild.
“Deep and ML frameworks are good at what they do — but altering the internals of these frameworks has traditionally proved difficult. Finding the right code to change is time-consuming and error-prone, as low-level internals can be unintentionally obfuscated, closed-source, or hand-tuned for particular purposes. And once you have made changes, recompiling the framework afterwards is both time- and compute-intensive,” said Facebook Artificial Intelligence Research (FAIR) in its blog post.
Like dlib, mlpack and Shogun, Flashlight is written in modern C++. Because modern C++ enables parallelism and speed, the library carries remarkably low framework overhead. It also provides simple bridges for integrating code from low-level domain-specific languages and libraries.
Other C++-supported packages include TensorFlow and Microsoft Cognitive Toolkit (CNTK) for deep learning; OpenCV for computer vision; and DyNet and FANN for neural networks.
Tech experts believe C++ has its limitations, as it is heavily syntax-oriented. In comparison, Python and R are beginner-friendly and have plenty of library support.
Modern C++ eliminates the need for manual tasks like memory management while providing powerful tools for functional programming. Flashlight claims to support research in C++ with no external fixtures or bindings needed to perform tasks such as threading, memory mapping, or interoperating with low-level hardware. As a result, integrating fast, parallel code becomes direct and straightforward.
“We are open-sourcing Flashlight to make it easier for the AI community to tinker with the low-level code underpinning deep and ML frameworks, taking better advantage of the hardware at hand and pushing the limits of performance,” said Jacob Kahn, a research engineer at FAIR.
Under the hood
Facebook AI said Flashlight is developed on top of a shallow stack of basic abstractions that are modular and easy to use. For this, it uses the ArrayFire tensor library, which supports dynamic tensor shapes and types, removing the need for rigid compile-time specifications and C++ templates. ArrayFire also helps optimise operations with an efficient just-in-time compiler.
Flashlight also includes custom, tunable memory managers and APIs for distributed and mixed-precision training. In addition, Flashlight features modular abstractions for working with data and training at scale, combined with a fast, lightweight Autograd. “These components are built to support general research directions, whether in deep learning or elsewhere,” said Kahn.
Flashlight-based lightweight domain applications support research across various modalities, including speech recognition, language modelling, image classification and image segmentation, all in one codebase.
Amit Raja Naik is a senior writer at Analytics India Magazine, where he dives deep into the latest technology innovations. He is also a professional bass player.