This week, open-source tool releases focused mostly on making models lighter and more explainable. OpenAI, in particular, came up with an interesting tool to promote the interpretability of ML models, while TensorFlow made it even simpler for developers to profile and deploy their models. Let us take a look at the top AI news for developers from this week.
OpenAI Microscope
OpenAI Microscope is a collection of visualisations of every significant layer and neuron of eight vision ‘model organisms’ that are often studied in interpretability research. Microscope makes it easier to visualise and analyse the features within these neural networks: the visualisations are systematic across popular vision models, and every neuron is linkable.
Microscope’s initial release covers these eight popular vision models, and the OpenAI team promises to expand to more models and techniques soon.
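To make the idea of neuron-level feature visualisation concrete, here is a toy sketch in plain Python. This is not Microscope’s API or method (Microscope uses optimisation-based feature visualisation on real networks); it only illustrates the underlying question — which input most strongly activates a given neuron?

```python
def activation(filter_weights, patch):
    """Neuron activation modelled as a dot product between a filter and an input patch."""
    return sum(w * x for w, x in zip(filter_weights, patch))

def max_activating_patch(filter_weights, patches):
    """Return the candidate patch that most strongly activates the neuron."""
    return max(patches, key=lambda p: activation(filter_weights, p))

# A toy neuron that responds to a horizontal edge (bright top row, dark bottom row).
# Patches are flattened 2x2 grids: [top-left, top-right, bottom-left, bottom-right].
edge_neuron = [1.0, 1.0, -1.0, -1.0]
candidates = [
    [1, 1, 1, 1],   # uniformly bright patch
    [1, 1, 0, 0],   # horizontal edge
    [1, 0, 1, 0],   # vertical edge
]
best = max_activating_patch(edge_neuron, candidates)
print(best)  # [1, 1, 0, 0] -- the horizontal edge wins
```

Microscope presents the real-network analogue of `best` for every neuron it catalogues, which is what makes the features browsable and linkable.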
Facebook Open Sources Nevergrad
Nevergrad is an open-source library for derivative-free and evolutionary optimisation. The Facebook AI team released Nevergrad this week to enable researchers and engineers to work with multi-objective optimisation or with constraints, both of which are crucial for NLP applications. For example, with Nevergrad, language translation tasks can be benchmarked on multiple metrics simultaneously.
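“Derivative-free” means the optimiser only needs objective values, never gradients. Below is a minimal random-search sketch of that idea in plain Python — a deliberately simple stand-in, not Nevergrad’s API or its (far smarter) algorithms:

```python
import random

def random_search(objective, dim, budget=500, low=-5.0, high=5.0, seed=0):
    """Minimal derivative-free optimisation: sample candidate points at random
    and keep the best objective value seen. No gradients are ever computed."""
    rng = random.Random(seed)
    best_x, best_val = None, float("inf")
    for _ in range(budget):
        x = [rng.uniform(low, high) for _ in range(dim)]
        val = objective(x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

# Minimise a simple quadratic whose optimum is at (1, 1).
sphere = lambda x: sum((xi - 1.0) ** 2 for xi in x)
x, val = random_search(sphere, dim=2)
print(val)  # close to 0
```

Nevergrad packages many such budget-based, gradient-free strategies (evolution strategies, bandit methods, and more) behind one interface, and this week’s release extends them to multiple objectives and constraints.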
EfficientDet Open-Sourced
EfficientDet Architecture
Google AI open-sourced its “EfficientDet” model, which is designed to make object detection more efficient at scale. The work has also been accepted at CVPR 2020. EfficientDet comprises a new family of scalable and efficient object detectors built as an extension of the popular EfficientNet network. The team also incorporated a novel bi-directional feature pyramid network (BiFPN) and new compound scaling rules, which help EfficientDet achieve state-of-the-art accuracy while using significantly less computation and being up to nine times smaller than prior state-of-the-art detectors.
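A key ingredient of BiFPN is weighted feature fusion: each input feature map gets a learnable, non-negative weight that is normalised before the maps are summed (the paper calls this “fast normalized fusion”). A minimal sketch on flattened lists rather than real tensors, based on that formulation:

```python
def fast_normalized_fusion(features, weights, eps=1e-4):
    """BiFPN-style weighted fusion of same-sized feature maps: clamp each
    learnable weight to be non-negative (ReLU), normalise by their sum, and
    take the weighted combination so the output stays bounded."""
    w = [max(0.0, wi) for wi in weights]   # ReLU keeps weights >= 0
    total = sum(w) + eps                   # eps avoids division by zero
    fused = [0.0] * len(features[0])
    for wi, feat in zip(w, features):
        for i, v in enumerate(feat):
            fused[i] += (wi / total) * v
    return fused

# Fuse two (flattened) feature maps where the second has learned 3x the weight.
out = fast_normalized_fusion([[2.0, 4.0], [6.0, 8.0]], weights=[1.0, 3.0])
print(out)  # approximately [5.0, 7.0]
```

In the real network the weights are trained end-to-end, so the detector learns how much each resolution level should contribute at every fusion node.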
TensorFlow Profiler
TensorFlow’s Profiler has been released for developers to measure the training performance and resource consumption of their TensorFlow models. The Profiler ships with a suite of tools and has also been integrated into TensorBoard.
The Profiler offers the following:
- A top-level view of model performance, with recommendations to optimise it
- Analysis of the model’s data input pipeline for bottlenecks, with suggestions to improve performance
- Performance statistics for every TensorFlow operation executed during the profiling session
- Performance statistics and the originating operation for every GPU-accelerated kernel
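To illustrate what “per-operation performance statistics” means, here is a toy profiler in plain Python that times named operations and aggregates the results. This is only a conceptual sketch, not the TensorFlow Profiler’s API (which hooks into the runtime and GPU kernels rather than wrapping calls):

```python
import time
from collections import defaultdict

class ToyProfiler:
    """Toy sketch of op-level profiling: record wall-clock time per named
    operation, then report call counts and total time per operation."""
    def __init__(self):
        self.records = defaultdict(list)

    def profile(self, name, fn, *args):
        """Run fn(*args), recording its duration under the given op name."""
        start = time.perf_counter()
        result = fn(*args)
        self.records[name].append(time.perf_counter() - start)
        return result

    def stats(self):
        """Aggregate statistics per operation name."""
        return {name: {"calls": len(ts), "total_s": sum(ts)}
                for name, ts in self.records.items()}

prof = ToyProfiler()
prof.profile("square", lambda x: x * x, 3)
prof.profile("square", lambda x: x * x, 4)
print(prof.stats()["square"]["calls"])  # 2
```

The real Profiler gathers this kind of data automatically for every op in the graph and renders it in TensorBoard alongside input-pipeline analysis and tuning recommendations.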
Uber Open-Sources Fiber
Uber has open-sourced its Fiber framework to help researchers and developers streamline their large-scale parallel scientific computation. Fiber is a Python-based distributed computing framework for modern computer clusters. Initially, Uber built Fiber to support complex projects like POET and similar projects that required distributed computing, but today, it has open-sourced the framework for the larger community.
With Fiber, users are not limited to programming for a single desktop or laptop; they can program for the whole computer cluster.
Unlike other distributed machine learning tools, Fiber introduces a new concept called ‘job-backed processes’, or ‘Fiber processes’. Although the API is similar to Python’s multiprocessing library, Fiber is more flexible: apart from running locally, its processes can also execute remotely on different machines.
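Because Fiber mirrors the multiprocessing API, standard-library code like the example below is the mental model; under Fiber, the same pattern would dispatch workers as jobs across the cluster rather than as local processes. This sketch uses only the stdlib, not Fiber itself:

```python
from multiprocessing import Pool

def square(x):
    """A worker function: multiprocessing runs it in local processes;
    Fiber's matching API would run it as job-backed processes on a cluster."""
    return x * x

if __name__ == "__main__":
    # Fan four tasks out to two worker processes and collect the results.
    with Pool(processes=2) as pool:
        print(pool.map(square, range(5)))  # [0, 1, 4, 9, 16]
```

Keeping this familiar interface is the point: code written and debugged on a laptop with multiprocessing needs minimal changes to scale out with Fiber.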
TFLite Model Maker
There has been a lot of interest lately in on-device ML, and TensorFlow, with its packages designed to work on edge devices, has become especially popular in this segment. This week, it released one more tool: TFLite Model Maker. Model Maker is a library that simplifies the process of adapting and converting a TensorFlow neural-network model to particular input data when deploying the model for on-device ML applications.
Here is a snippet showing how you can run end-to-end image classification with just four lines of code using Model Maker:
# ImageClassifierDataLoader and image_classifier are imported from the
# TFLite Model Maker library.
# Load input data specific to an on-device ML app.
data = ImageClassifierDataLoader.from_folder('flower_photos/')
# Customise the TensorFlow model.
model = image_classifier.create(data)
# Evaluate the model.
loss, accuracy = model.evaluate()
# Export to a TensorFlow Lite model.
model.export('flower_classifier.tflite', 'flower_label.txt')