Top 8 JAX Libraries for Data Scientists in 2021

  • EMLP
  • FedJAX
  • PIX
  • RLax
  • BRAX
  • EFAX
  • Jax-unirep
  • Sklearn-jax-kernels
JAX Libraries

JAX is a Python library designed for high-performance numerical computing. Developed by Google researchers and launched in 2018, it is presently used by Alphabet subsidiary DeepMind. It is very similar to NumPy, the standard numerical computing library for Python; in fact, its API for numerical functions is based on NumPy. 

It is also a framework for automatic differentiation, comparable in that respect to TensorFlow or PyTorch. 

Why JAX? 

JAX is promising for machine learning scientists in the sense that it makes machine learning programming intuitive, structured and clean. It includes an extensible system of composable function transformations that helps machine learning researchers with the following (see the sketch after this list): 

  • Differentiation: JAX supports both forward- and reverse-mode automatic differentiation of arbitrary numerical functions. 
  • Vectorisation: JAX provides automatic vectorisation via the vmap transformation, and supports large-scale data parallelism via the related pmap transformation. 
  • JIT-compilation: Just-in-time (JIT) compilation, together with JAX's NumPy-consistent API, lets researchers scale their models to one or many accelerators. 
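A minimal sketch of these three transformations in action (the function and data below are our own toy example):

import jax
import jax.numpy as jnp

def loss(w, x):
    # an arbitrary numerical function of parameters w and input x
    return jnp.sum((x @ w) ** 2)

grad_loss = jax.grad(loss)                        # differentiation w.r.t. w
batched_loss = jax.vmap(loss, in_axes=(None, 0))  # vectorisation over a batch of x
fast_loss = jax.jit(loss)                         # JIT-compilation via XLA

w = jnp.ones(3)
xs = jnp.ones((8, 3))                             # a batch of 8 inputs
print(grad_loss(w, xs[0]))                        # gradient at a single input
print(batched_loss(w, xs))                        # per-example losses, no Python loop
print(fast_loss(w, xs[0]))                        # compiled on first call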

Today, we take a look at some of the recent JAX libraries:

EMLP

Equivariant-MLP is a JAX library for the automated construction of equivariant layers in deep learning via constraint solving. It is based on the ICML 2021 paper A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups. 

It is intended as a tool for building larger models in which EMLP layers are just one component of a larger system. 

To install the package, one needs to use: 

pip install emlp
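As a sketch of how such a layer is constructed (the names below follow the pattern in the project's README, so verify them against the repo):

from emlp.groups import SO
from emlp.reps import V
import emlp.nn as nn

G = SO(3)          # the symmetry group: rotations in 3D
rep_in = 4 * V(G)  # input representation: four 3-vectors
rep_out = V(G)     # output representation: one 3-vector
# an MLP whose layers are equivariant to G, found by constraint solving
model = nn.EMLP(rep_in, rep_out, group=G)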

For more information, click here

FedJAX 

The FedJAX library is used for developing custom federated learning algorithms in JAX. It prioritises ease of use for anyone with knowledge of NumPy. 

FedJAX is built around the following components: 

  • Federated datasets: clients and a dataset for each client 
  • Models: CNN, ResNet 
  • Optimisers: SGD, Momentum 
  • Federated algorithms: Client updates and server aggregation 

FedJAX uses lightweight wrappers and containers to work with existing implementations such as Haiku, Stax, and Optax. 
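To make the "client updates and server aggregation" component concrete, here is a minimal federated-averaging sketch written in plain JAX rather than the FedJAX API (all names are our own):

import jax
import jax.numpy as jnp

def client_update(params, data, lr=0.1):
    # one local gradient step on a single client's data
    grads = jax.grad(lambda p: jnp.mean((data @ p) ** 2))(params)
    return params - lr * grads

def server_aggregate(client_params):
    # FedAvg-style server step: average the client models
    return jnp.mean(jnp.stack(client_params), axis=0)

params = jnp.ones(3)
clients = [jnp.ones((4, 3)), 2 * jnp.ones((4, 3))]  # toy per-client datasets
params = server_aggregate([client_update(params, d) for d in clients])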

To install FedJAX, one requires Python 3.6+ and a working JAX installation. For a CPU-only JAX build: 

pip install --upgrade pip

pip install --upgrade jax jaxlib  

For GPU or TPU, follow JAX's installation instructions for those backends instead. Then install FedJAX itself: 

pip install fedjax

FedJAX is still in its early stages and is yet to be officially supported by Google. 

For more information, click here

PIX 

PIX is an image processing library in JAX and for JAX. While it is written in pure Python, PIX depends on C++ code via JAX.

Built on top of JAX, PIX provides image processing functions and tools that can be optimised and parallelised. 

To install PIX use: 

$ pip install git+https://github.com/deepmind/dm_pix
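A minimal usage sketch (flip_left_right and adjust_brightness are functions we recall from the PIX README; the toy image is our own):

import jax
import jax.numpy as jnp
import dm_pix as pix

key = jax.random.PRNGKey(0)
image = jax.random.uniform(key, (128, 128, 3))  # a toy HWC float image

flipped = pix.flip_left_right(image)            # horizontal flip
brighter = pix.adjust_brightness(image, 0.2)    # add a brightness delta
batch_flip = jax.vmap(pix.flip_left_right)      # the same op, parallelised over a batch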

For more information, click here

RLax 

Built on top of JAX, RLax exposes building blocks for implementing reinforcement learning agents. 

It can be installed with pip directly from github using: 

pip install git+git://github.com/deepmind/rlax.git 

Or from PyPI: 

pip install rlax
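As an example of such a building block, a TD-learning sketch (the td_learning signature below is as we recall it from the RLax docs, so treat it as an assumption to verify):

import jax.numpy as jnp
import rlax

v_tm1 = jnp.array(1.0)       # value estimate at time t-1
r_t = jnp.array(0.5)         # reward received at time t
discount_t = jnp.array(0.9)  # discount factor
v_t = jnp.array(1.2)         # value estimate at time t

# TD error: r_t + discount_t * v_t - v_tm1
td_error = rlax.td_learning(v_tm1, r_t, discount_t, v_t)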

For more information, click here

BRAX 

BRAX is a differentiable physics engine that simulates environments made of rigid bodies, joints and actuators. Written in JAX and designed for use on acceleration hardware, BRAX also includes a suite of learning algorithms to train agents to operate in these environments. 

BRAX is not only efficient for single-core training but is also scalable to massively parallel simulation. 

One can install BRAX from source using: 

git clone https://github.com/google/brax.git

cd brax

python3 -m venv env

source env/bin/activate

pip install --upgrade pip

pip install -e .

To train a model, use the provided learn entry point. 
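A minimal environment-stepping sketch (envs.create, reset and step follow the BRAX API as released in 2021 and may differ in later versions, so verify against the repo):

import jax
import jax.numpy as jnp
from brax import envs

env = envs.create(env_name='ant')          # a rigid-body locomotion environment
state = env.reset(jax.random.PRNGKey(0))   # initial simulator state
step = jax.jit(env.step)                   # compile the physics step with XLA
for _ in range(100):
    action = jnp.zeros(env.action_size)    # placeholder zero-action policy
    state = step(state, action)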

For more information, click here

EFAX

EFAX provides tools for working with the exponential family, a class of probability distributions that includes the normal, gamma, beta, exponential, Poisson, binomial and Bernoulli distributions, in JAX. 

EFAX exposes both the natural and expectation parametrisations of each distribution, which is a common reason to prefer it over a library like TensorFlow Probability. For a Bernoulli distribution, for instance, the expectation parameter is the mean p, while the natural parameter is the log-odds log(p/(1-p)). 

For more information, click here

Jax-unirep

Jax-unirep is a performant reimplementation, in JAX, of the UniRep protein featurisation model originally developed in George Church's lab. It is a self-contained and easily customisable version of the UniRep model, with additional utility APIs that support protein engineering workflows. 

To install Jax-unirep, one has to ensure that their compute environment can run JAX code. Additionally, a modern Linux or macOS with glibc >= 2.23 is necessary. 

Jax-unirep can be installed from PyPI using:

pip install jax-unirep

Or directly from the source using: 

pip install git+https://github.com/ElArkk/jax-unirep.git
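A minimal featurisation sketch (get_reps is the entry point shown in the project's README; the sequence is a toy example of our own):

from jax_unirep import get_reps

# featurise a list of protein sequences into UniRep embeddings
h_avg, h_final, c_final = get_reps(["MKTAYIAKQR"])
print(h_avg.shape)  # one 1900-dimensional average hidden state per sequence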

For more information, click here

Sklearn-jax-kernels 

Sklearn-jax-kernels reimplements scikit-learn kernels on top of JAX, enabling accelerated kernel computations through XLA optimisation, execution on GPUs, and the computation of gradients through kernels. 

It provides the same flexibility and ease of use as scikit-learn kernels while improving speed and allowing faster design of new kernels through automatic differentiation. 

To install via pip use: 

pip install sklearn-jax-kernels
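A usage sketch in the familiar scikit-learn style (RBF is the kernel we recall from the project's README; treat the exact import as an assumption to verify):

from sklearn.datasets import make_classification
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn_jax_kernels import RBF

X, y = make_classification(n_samples=50, n_features=4, random_state=0)
kernel = 1.0 * RBF(1.0)  # JAX-backed RBF, a drop-in for sklearn's own kernel
clf = GaussianProcessClassifier(kernel=kernel).fit(X, y)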

For more information, click here.
