Top 10 Open-Source Hyperparameter Optimisation Libraries For ML Models

The tools for optimising hyperparameters are multiplying as the complexity of deep learning models increases. Generally, there are two types of toolkits for hyperparameter optimisation (HPO): open-source tools and services that depend on cloud computing resources.

Below, we have put together the top ten open-source hyperparameter optimisation libraries for ML models.

(The list is in alphabetical order)

1| BayesianOptimization

About: BayesianOptimization is a Python implementation of Bayesian global optimisation, built on Bayesian inference and Gaussian processes, that aims to find the maximum value of an unknown function in as few iterations as possible. The technique is particularly suited to optimising high-cost functions, where the balance between exploration and exploitation is critical.

Know more here.

2| GPyOpt

About: GPyOpt is a Python open-source library for Bayesian Optimisation. It is based on GPy, a Python framework for Gaussian process modelling. The library is used to automatically configure models and machine learning algorithms, design wet-lab experiments, etc.

Know more here.

3| Hyperopt

About: Hyperopt is a Python library for serial and parallel optimisation over search spaces, which may include real-valued, discrete, and conditional dimensions. This library has been designed to accommodate Bayesian optimisation algorithms based on Gaussian processes and regression trees. It provides algorithms and parallelisation infrastructure for performing hyperparameter optimisation (model selection) in Python.

Know more here.

4| Keras Tuner

About: Keras Tuner is a library that helps find optimal hyperparameters for machine learning models. The library includes pre-made tunable applications, HyperResNet and HyperXception, which are ready-to-use hypermodels for computer vision.

Know more here.

5| Metric Optimisation Engine (MOE)

About: Metric Optimisation Engine (MOE) is an open-source, black-box Bayesian global optimisation engine for optimal experimental design. MOE is an efficient way to optimise a system’s parameters when evaluating them is time-consuming or expensive. It can help tackle problems such as optimising a system’s click-through or conversion rate via A/B testing, tuning the parameters of a machine learning prediction method or expensive batch job, designing an engineering system, or finding the optimal parameters of a real-world experiment.

Know more here.

6| Optuna

About: Optuna is an automatic hyperparameter optimisation software framework designed for machine learning. It has an imperative, define-by-run style user API that constructs the search space for the hyperparameters dynamically. The framework offers Pythonic search spaces, a platform-agnostic architecture, easy parallelisation, etc.

Know more here.

7| Ray Tune

About: Ray Tune is a hyperparameter optimisation framework for long-running tasks such as reinforcement learning and deep learning training. The framework includes scalable implementations of search algorithms such as Population Based Training (PBT), the Median Stopping Rule and HyperBand, as well as flexible trial variant generation, including grid search, random search and conditional parameter distributions.

Know more here.

8| SmartML

About: SmartML is a meta-learning-based framework for automated algorithm selection and hyperparameter tuning for machine learning. For any new dataset, SmartML automatically extracts its meta-features and searches its knowledge base for the best-performing algorithm to start the optimisation process. It can be embedded in any programming language through its available REST APIs.

Know more here.

9| SigOpt

About: SigOpt is a black-box hyperparameter optimisation solution that automates model tuning to accelerate the model development process and amplify the impact of models in production at scale. SigOpt increases computational efficiency with an ensemble of Bayesian and global optimisation algorithms designed to explore and exploit any parameter space.

Know more here.

10| Talos

About: Talos is a hyperparameter optimisation framework for Keras, TensorFlow and PyTorch. The framework changes the ordinary Keras workflow by fully automating hyperparameter tuning and model evaluation. The key features of Talos include automated hyperparameter optimisation, a model generalisation evaluator, support for a man-machine cooperative optimisation strategy, etc.

Know more here.

Ambika Choudhury
A Technical Journalist who loves writing about Machine Learning and Artificial Intelligence. A lover of music, writing and learning something out of the box.
