
TOKYO, Jan. 14, 2020 /Kyodo JBN/ --

Preferred Networks, Inc.

Preferred Networks Releases Optuna v1.0, Open-source Hyperparameter Optimization Framework for Machine Learning

Preferred Networks, Inc. (PFN, Head Office: Tokyo, President & CEO: Toru Nishikawa) has released Optuna (TM) v1.0, the first major version of the open-source hyperparameter optimization framework for machine learning. Projects using the existing beta version can be updated to Optuna v1.0 with minimal changes to the code.

(Logo: https://kyodonewsprwire.jp/prwfile/release/M105870/202001065387/_prw_PI2lg_QDnpjE8Q.jpg)

In machine learning and deep learning, it is critical to optimize hyperparameters (*1), which control the behavior of an algorithm during training, in order to obtain a trained model with higher accuracy.

Optuna automates the trial-and-error process of optimizing hyperparameters, finding hyperparameter values that enable the algorithm to perform well. Since its beta release as open-source software (OSS) in December 2018, Optuna has received development support from numerous contributors and added a number of new features based on feedback from the OSS community as well as from within the company.

Main features of Optuna v1.0 include:

- Efficient hyperparameter tuning with state-of-the-art optimization algorithms

- Support for various machine learning libraries including PyTorch, TensorFlow, Keras, FastAI, scikit-learn, LightGBM, and XGBoost

- Support for parallel execution across multiple computing machines to significantly reduce the optimization time

- Search spaces described with ordinary Python control statements (see the sketch after this list)

- Various visualization techniques that allow users to conduct diverse analyses of the optimization results
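
The way search spaces are described with Python control statements is easiest to see in code. The following is a minimal sketch using the v1.0 API, assuming scikit-learn is installed; the classifier choices and parameter ranges are illustrative examples, not part of this announcement.

import optuna
import sklearn.datasets
import sklearn.ensemble
import sklearn.model_selection
import sklearn.svm


def objective(trial):
    # The search space is built with ordinary Python control statements:
    # which parameters are suggested depends on the branch taken.
    x, y = sklearn.datasets.load_iris(return_X_y=True)

    classifier_name = trial.suggest_categorical("classifier", ["SVC", "RandomForest"])
    if classifier_name == "SVC":
        svc_c = trial.suggest_loguniform("svc_c", 1e-10, 1e10)
        classifier = sklearn.svm.SVC(C=svc_c, gamma="auto")
    else:
        rf_max_depth = trial.suggest_int("rf_max_depth", 2, 32)
        classifier = sklearn.ensemble.RandomForestClassifier(
            max_depth=rf_max_depth, n_estimators=10
        )

    # Return the value to be optimized; Optuna maximizes it because the
    # study below is created with direction="maximize".
    return sklearn.model_selection.cross_val_score(classifier, x, y, cv=3).mean()


if __name__ == "__main__":
    study = optuna.create_study(direction="maximize")
    study.optimize(objective, n_trials=100)
    print(study.best_params)

For the parallel-execution feature, the same study can be shared across multiple machines by passing a study_name and a storage URL (for example, a relational database) to optuna.create_study and running the optimization script on each machine; the sketch above keeps everything in memory for brevity.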

Official website of Optuna: https://optuna.org/

Optuna has received many contributions from external developers. PFN will continue to quickly incorporate the results of the latest machine learning research into the development of Optuna and work with the OSS community to promote the use of Optuna.

(*1) Hyperparameters include learning rate, batch size, number of training iterations, number of neural network layers, and number of channels.

About Optuna (TM), the hyperparameter optimization framework for machine learning

Optuna was open-sourced by PFN in December 2018 as a hyperparameter optimization framework written in Python. Optuna automates the trial-and-error process of finding hyperparameters that deliver good performance. Optuna is used in many PFN projects and was an important factor in the PFDet team's award-winning performance in the first Kaggle Open Images object detection competition.

About Preferred Networks (PFN)

PFN was founded in March 2014 with the aim of promoting business utilization of deep learning and robotics technologies. PFN aims to drive innovations mainly in the three priority business areas of transportation systems, manufacturing, and bio/healthcare in collaboration with leading companies and organizations.

PFN developed the open-source deep learning framework Chainer (TM) in 2015 and demonstrated a fully autonomous tidying-up robot system at CEATEC 2018. In 2020, PFN plans to operate a supercomputer equipped with MN-Core (TM), a deep learning processor developed by PFN. PFN has expanded the range of its deep learning applications to areas such as personal robots, plant optimization, material search, sports analysis, and entertainment.

https://www.preferred.jp/en/

*Optuna (TM), Chainer (TM), and MN-Core (TM) are trademarks or registered trademarks of Preferred Networks, Inc. in Japan and elsewhere.
