
KotlinDL 0.2 From JetBrains Is Here

  • KotlinDL is built on top of the TensorFlow Java API.

Version 0.2 of KotlinDL, JetBrains' deep learning library, has been released with new layers and a Functional API. It is now easy to build and train models such as ResNet or MobileNet.

The release also includes a Kotlin-idiomatic DSL for image preprocessing, covering operations such as load, crop, resize, rotate, rescale, sharpen, and save.
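For illustration, here is a minimal sketch of that DSL, modeled on the pipeline shown in the 0.2 announcement; the image directory is a placeholder, and the exact parameter names and import paths are assumptions that may vary between releases (sharpen and save follow the same block style):

```kotlin
import org.jetbrains.kotlinx.dl.dataset.preprocessor.*
import org.jetbrains.kotlinx.dl.dataset.preprocessor.image.*
import java.io.File

val preprocessing: Preprocessing = preprocess {
    transformImage {
        load {
            // Placeholder path to a directory of training images.
            pathToData = File("src/main/resources/images")
            imageShape = ImageShape(224, 224, 3)
        }
        rotate {
            degrees = 30f
        }
        crop {
            left = 12
            right = 12
            top = 12
            bottom = 12
        }
        resize {
            outputWidth = 400
            outputHeight = 400
        }
    }
    transformTensor {
        // Map pixel values from [0, 255] into [0, 1].
        rescale {
            scalingCoefficient = 255f
        }
    }
}
```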


KotlinDL is a high-level deep learning API written in Kotlin and inspired by Keras. Under the hood, it uses the TensorFlow Java API, and its surface is close to that of Keras and other high-level frameworks such as Sonnet, PyTorch Lightning, and Catalyst.

According to JetBrains, it offers simple APIs for training deep learning models from scratch, importing existing Keras models for inference, and leveraging transfer learning to tailor existing pre-trained models to your tasks.
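For instance, importing a model exported from Keras looks roughly like this; a minimal sketch, assuming placeholder file names and the 0.2-era loading API (Sequential.loadModelConfiguration plus HDF5 weights read via the jhdf library):

```kotlin
import io.jhdf.HdfFile
import org.jetbrains.kotlinx.dl.api.core.Sequential
import org.jetbrains.kotlinx.dl.api.core.loss.Losses
import org.jetbrains.kotlinx.dl.api.core.metric.Metrics
import org.jetbrains.kotlinx.dl.api.core.optimizer.Adam
import java.io.File

fun main() {
    // "model.json" and "weights.h5" are placeholders for files exported from Keras.
    val model = Sequential.loadModelConfiguration(File("model.json"))

    model.use {
        it.compile(
            optimizer = Adam(),
            loss = Losses.SOFT_MAX_CROSS_ENTROPY_WITH_LOGITS,
            metric = Metrics.ACCURACY
        )
        // Load the pre-trained Keras weights from HDF5.
        it.loadWeights(HdfFile(File("weights.h5")))
        // The model is now ready for inference or fine-tuning (transfer learning).
    }
}
```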


Overall, this project aims to make Deep Learning easier for JVM developers and simplify deploying deep learning models in JVM production environments.

Here’s an example of what the classic convolutional neural network LeNet looks like in KotlinDL:
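The snippet below is a sketch against the 0.2-era Sequential API; the layer import paths and the Long/Int array parameter types are assumptions that changed in later releases.

```kotlin
import org.jetbrains.kotlinx.dl.api.core.Sequential
import org.jetbrains.kotlinx.dl.api.core.activation.Activations
import org.jetbrains.kotlinx.dl.api.core.layer.convolutional.Conv2D
import org.jetbrains.kotlinx.dl.api.core.layer.convolutional.ConvPadding
import org.jetbrains.kotlinx.dl.api.core.layer.core.Dense
import org.jetbrains.kotlinx.dl.api.core.layer.core.Input
import org.jetbrains.kotlinx.dl.api.core.layer.pooling.AvgPool2D
import org.jetbrains.kotlinx.dl.api.core.layer.reshaping.Flatten

// LeNet-5 for 28x28 grayscale inputs (e.g. MNIST), 10 output classes.
val lenet5 = Sequential.of(
    Input(28, 28, 1),
    Conv2D(filters = 6, kernelSize = longArrayOf(5, 5), strides = longArrayOf(1, 1, 1, 1),
           activation = Activations.Tanh, padding = ConvPadding.SAME),
    AvgPool2D(poolSize = intArrayOf(1, 2, 2, 1), strides = intArrayOf(1, 2, 2, 1)),
    Conv2D(filters = 16, kernelSize = longArrayOf(5, 5), strides = longArrayOf(1, 1, 1, 1),
           activation = Activations.Tanh, padding = ConvPadding.SAME),
    AvgPool2D(poolSize = intArrayOf(1, 2, 2, 1), strides = intArrayOf(1, 2, 2, 1)),
    Flatten(),
    Dense(outputSize = 120, activation = Activations.Tanh),
    Dense(outputSize = 84, activation = Activations.Tanh),
    Dense(outputSize = 10, activation = Activations.Linear)
)
```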


“In version 0.2, we’ve added a new Functional API that makes it possible for you to build models such as ResNet or MobileNet. The Functional API provides a way to create models that are more flexible than the Sequential API. The Functional API can handle models with non-linear topology, shared layers, and even multiple inputs or outputs,” said Alexey Zinoviev, Machine Learning Engineer at JetBrains.

He also said that with the previous version of the library, you could only use the Sequential API to describe your model.

“Using the Sequential.of(..) method call, it has been possible to build a sequence of layers to describe models in a style similar to VGG. Since 2014, many new architectures have addressed the disadvantages inherent in simple layer sequences, such as vanishing gradients or the degradation (accuracy saturation) problem. The famous residual neural networks (ResNet) use skip connections or shortcuts to jump over layers,” he said.
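To make the contrast concrete, here is a hedged sketch of a tiny residual block written with the new Functional API; the layer-invocation style and Functional.fromOutput follow the 0.2 announcement, but the exact signatures and import paths should be treated as assumptions:

```kotlin
import org.jetbrains.kotlinx.dl.api.core.Functional
import org.jetbrains.kotlinx.dl.api.core.activation.Activations
import org.jetbrains.kotlinx.dl.api.core.layer.convolutional.Conv2D
import org.jetbrains.kotlinx.dl.api.core.layer.convolutional.ConvPadding
import org.jetbrains.kotlinx.dl.api.core.layer.core.Dense
import org.jetbrains.kotlinx.dl.api.core.layer.core.Input
import org.jetbrains.kotlinx.dl.api.core.layer.merge.Add
import org.jetbrains.kotlinx.dl.api.core.layer.reshaping.Flatten

val input = Input(28, 28, 1)
val conv1 = Conv2D(filters = 32, kernelSize = longArrayOf(3, 3),
                   activation = Activations.Relu, padding = ConvPadding.SAME)(input)
val conv2 = Conv2D(filters = 32, kernelSize = longArrayOf(3, 3),
                   activation = Activations.Relu, padding = ConvPadding.SAME)(conv1)
// The skip connection: the block's input is added back to its output,
// letting gradients bypass the stacked convolutions, as in ResNet.
val residual = Add()(conv2, conv1)
val flatten = Flatten()(residual)
val output = Dense(outputSize = 10, activation = Activations.Linear)(flatten)

// A non-linear topology like this cannot be expressed with Sequential.of(..).
val model = Functional.fromOutput(output)
```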


