$ pip install deephyper

Black-Box Optimization

Optimize the parameters of a system to maximize performance.

Neural Architecture Search

Explore the connections and operations of neural network architectures.

Uncertainty Quantification

Quantify uncertainty from diverse model exploration.

Scaling for HPC

Scale to thousands of parallel workers with our centralized and fully distributed search algorithms.
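The idea of evaluating many configurations with a pool of parallel workers can be sketched with nothing but the Python standard library. This is not DeepHyper's API, just an illustrative example: `evaluate` stands in for an expensive black-box evaluation, and results are gathered asynchronously as each worker finishes.

```python
import concurrent.futures
import random

def evaluate(config):
    # Stand-in for one expensive black-box evaluation
    # (hypothetical objective with its maximum at x = 0.3).
    x = config["x"]
    return -(x - 0.3) ** 2

random.seed(42)
configs = [{"x": random.random()} for _ in range(16)]

results = []
# A pool of workers evaluates configurations concurrently;
# each result is collected as soon as its worker finishes,
# not in submission order.
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    futures = {pool.submit(evaluate, c): c for c in configs}
    for fut in concurrent.futures.as_completed(futures):
        results.append((futures[fut], fut.result()))

best_config, best_score = max(results, key=lambda r: r[1])
```

At HPC scale the same pattern holds, with the thread pool replaced by workers spread across many nodes and the search deciding which configurations to submit next.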


The DeepHyper project is led by Prasanna Balaprakash (pbalapra[at]anl[dot]gov) and co-led by Romain Egele (romainegele[at]gmail[dot]com).

Asynchronous Distributed Bayesian Optimization at HPC Scale

Bayesian optimization (BO) is a widely used approach for computationally expensive black-box optimization problems such as simulator calibration and hyperparameter optimization of deep learning methods. In BO, a dynamically updated, computationally cheap surrogate model is employed to learn the input-output relationship of the black-box function; this surrogate model is used to... [Read More]
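The surrogate-model loop described above can be sketched in plain Python. This is a deliberately simplified stand-in, not DeepHyper's implementation: the surrogate is a 1-nearest-neighbor predictor rather than a probabilistic model, and the acquisition function adds a distance-based exploration bonus (a UCB-like heuristic) to trade off exploring unseen regions against exploiting good ones.

```python
import random

def expensive_black_box(x):
    # Stand-in for an expensive simulation; its maximum is at x = 0.3.
    return -(x - 0.3) ** 2

def surrogate(x, history):
    # Computationally cheap surrogate: predict the value of the
    # nearest already-evaluated point.
    _, y_nearest = min(history, key=lambda p: abs(p[0] - x))
    return y_nearest

def acquisition(x, history, kappa=0.5):
    # Predicted value plus an exploration bonus for points far
    # from anything already evaluated.
    dist = min(abs(x - xo) for xo, _ in history)
    return surrogate(x, history) + kappa * dist

random.seed(0)
# Initial evaluations of the true (expensive) function.
history = [(x, expensive_black_box(x)) for x in (0.0, 1.0)]

for _ in range(30):
    # Optimize the cheap acquisition function over random candidates,
    # then spend one expensive evaluation on the winner and update
    # the surrogate's training data.
    candidates = [random.random() for _ in range(100)]
    x = max(candidates, key=lambda c: acquisition(c, history))
    history.append((x, expensive_black_box(x)))

best_x, best_y = max(history, key=lambda p: p[1])
```

The loop spends expensive evaluations only where the cheap surrogate, updated after every result, predicts the most promise; production BO replaces the nearest-neighbor surrogate with a model that also quantifies its own uncertainty.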