Optuna: A Define-by-Run Hyperparameter Optimization Framework

Abstract: 

In this workshop, we introduce Optuna, a next-generation hyperparameter optimization framework with new design criteria: (1) a define-by-run API that allows users to concisely construct dynamic, nested, or conditional search spaces, (2) efficient implementations of both sampling and early-stopping strategies, and (3) an easy-to-set-up, versatile architecture that can be deployed for various purposes, ranging from scalable distributed computing to lightweight experiments run on a local laptop. Our software is available under the MIT license.

Bio: 

Crissman has worked at Preferred Networks on the Deep Learning Open Source Software team for over two years, improving the documentation for the deep learning framework Chainer and giving presentations at Open Data Science Conferences (ODSC), SciPy, PyCon, GTC, and other venues. His ODSC West workshop on Chainer was selected as one of the top 10 workshops for learning machine learning.