Last week, Google released the latest version of its popular open-source software library, TensorFlow 2.10. The launch comes three months after the introduction of TensorFlow 2.9. In 2022 alone, the TensorFlow team has released three versions – 2.8, 2.9, and 2.10.
The highlights of TensorFlow 2.10 include user-friendly updates to Keras to help users develop transformers, updates to the optimizer API, deterministic and stateless initializers, and even a new tool to load audio data. Furthermore, as in TensorFlow 2.9, this version brings further oneDNN enhancements. The TensorFlow team is also introducing Decision Forests 1.0 with the latest release.
Updates to Keras
With v2.9, TensorFlow had released a new experimental version of the Keras Optimizer API, offering a unified and expanded catalogue of built-in and customised optimizers. That experimental version will replace the current namespace in the next release (TensorFlow v2.11). To prepare for the formal switch of the optimizer namespace to the new API, the team has begun exporting the current Keras optimizers under tf.keras.optimizers.legacy in v2.10.
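The migration path can be sketched as follows – a minimal example, assuming TensorFlow 2.10 or later is installed (the learning rate and model here are arbitrary):

```python
import tensorflow as tf

# As of 2.10, the existing Keras optimizers are also exported under
# tf.keras.optimizers.legacy, ahead of the namespace switch in 2.11.
legacy_adam = tf.keras.optimizers.legacy.Adam(learning_rate=1e-3)

# The new experimental API (introduced in 2.9) lives under
# tf.keras.optimizers.experimental until the switch.
new_adam = tf.keras.optimizers.experimental.Adam(learning_rate=1e-3)

# Either optimizer can be passed to compile() as usual.
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
model.compile(optimizer=legacy_adam, loss="mse")
```

Code that needs the old behaviour after the switch can pin the legacy namespace explicitly rather than relying on the default export.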
It may be noted that TensorFlow had introduced an API to make ops deterministic in v2.8; in v2.9, the team worked to improve its performance. Building on top of stateless TF random ops, the team has now made Keras initializers stateless and deterministic. From the new version, both seeded and unseeded initializers will generate the same values every time they are called. With stateless initializers, Keras will be able to support new features like multi-client model training with DTensor (a TensorFlow extension for synchronous distributed computing).
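A quick illustration of the new stateless behaviour – the seed and tensor shape below are arbitrary, and TensorFlow 2.10 or later is assumed:

```python
import tensorflow as tf

# A seeded Keras initializer; in 2.10 it is stateless, so repeated
# calls with the same shape yield identical values.
init = tf.keras.initializers.GlorotUniform(seed=42)

a = init(shape=(2, 2))
b = init(shape=(2, 2))
# a and b are now exactly equal, call after call.
```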
The team has expanded and unified mask handling for Keras attention layers like tf.keras.layers.Attention, tf.keras.layers.AdditiveAttention, and tf.keras.layers.MultiHeadAttention. In addition to that, two features have been added – causal attention and implicit masking. These two features, when combined, simplify the implementation of a Transformer-style model – solving the tricky problem of getting the mask right.
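The causal-attention feature can be sketched in a few lines; this assumes the use_causal_mask argument added to the attention layers in 2.10, with arbitrary layer sizes:

```python
import tensorflow as tf

# A toy transformer-style self-attention block.
mha = tf.keras.layers.MultiHeadAttention(num_heads=2, key_dim=4)

# (batch, sequence, features) – random stand-in input.
x = tf.random.normal((1, 5, 8))

# use_causal_mask=True prevents each position from attending to
# later positions, without building the triangular mask by hand.
out = mha(query=x, value=x, use_causal_mask=True)
```

Previously, getting this mask right typically meant constructing and passing a lower-triangular attention_mask tensor manually.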
In v2.9, TensorFlow integrated the oneDNN performance library to achieve better performance on Intel CPUs. In this version, oneDNN optimisations are turned on by default on Linux x86 packages and for CPUs with neural-network-focused hardware features, such as those found on Intel Cascade Lake and newer CPUs.
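The default can be overridden explicitly via the TF_ENABLE_ONEDNN_OPTS environment variable, which must be set before TensorFlow is imported:

```python
import os

# Toggle oneDNN optimisations explicitly; "0" disables them, "1"
# enables them. This must happen before the tensorflow import.
os.environ["TF_ENABLE_ONEDNN_OPTS"] = "1"

import tensorflow as tf
```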
Moving forward, in v2.10, TensorFlow has collaborated with Arm, AWS, and Linaro to integrate the Compute Library for the Arm Architecture through oneDNN. This integration accelerates performance on aarch64 CPUs.
TensorFlow Decision Forests
TensorFlow Decision Forests is a collection of state-of-the-art algorithms for the training and interpretation of Decision Forest models. The library provides Keras models that support classification, ranking, and regression.
TensorFlow Decision Forests 1.0 improves the performance of oblique splits, which recursively divide the feature space by making splits based on linear combinations of attributes. They allow decision trees to express complex patterns by conditioning them on multiple features at the same time. It has been shown that oblique splits outperform axis-aligned splits on a majority of datasets.