
Scalable Hyperparameter Optimization with Lazy Gaussian Processes

Most machine learning methods require careful selection of hyper-parameters in order to train a high-performing model with good generalization abilities. Hence, several automatic selection algorithms have been introduced to overcome tedious manual (trial and error) tuning of these parameters. Due to its very high sample efficiency, Bayesian Optimization over a Gaussian Process model of the parameter space has become the method of choice. Unfortunately, this approach suffers from a cubic compute complexity due to the underlying Cholesky factorization, which makes it very hard to scale beyond a small number of sampling steps. In this paper, we present a novel, highly accurate approximation of the underlying Gaussian Process. Reducing its computational complexity from cubic to quadratic allows efficient strong scaling of Bayesian Optimization while outperforming the previous approach in optimization accuracy. First experiments show speedups of a factor of 162 on a single node and a further speedup by a factor of 5 in a parallel environment.
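For context on the cubic cost the abstract refers to: exact Gaussian Process inference factorizes the n × n kernel matrix, and that Cholesky step is O(n³). The following NumPy sketch illustrates standard exact GP posterior computation only; it is not the paper's lazy approximation, and the kernel choice and function names are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0):
    # Squared-exponential (RBF) kernel between two point sets.
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior(X_train, y_train, X_test, noise=1e-6):
    # Exact GP posterior mean and variance. The Cholesky factorization of
    # the n x n kernel matrix costs O(n^3) -- the scaling bottleneck that
    # motivates approximations like the one proposed in the paper.
    n = len(X_train)
    K = rbf_kernel(X_train, X_train) + noise * np.eye(n)
    L = np.linalg.cholesky(K)                              # O(n^3) step
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    K_s = rbf_kernel(X_train, X_test)
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = rbf_kernel(X_test, X_test).diagonal() - np.sum(v**2, axis=0)
    return mean, var
```

With near-zero observation noise, the posterior mean interpolates the training targets, which is what makes the GP a sample-efficient surrogate for Bayesian Optimization.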

Metadata
Document Type:Conference Proceeding
Conference Type:Conference article
Citation link: https://opus.hs-offenburg.de/3952
Bibliographic Information
Title (English):Scalable Hyperparameter Optimization with Lazy Gaussian Processes
Conference:IEEE/ACM Workshop on Machine Learning in High Performance Computing Environments (MLHPC), 18 November 2019, Denver, Colorado, USA
Author:Raju Ram, Sabine Müller, Franz-Josef Pfreundt, Nicolas R. Gauger, Janis Keuper
Year of Publication:2019
Publisher:IEEE
First Page:56
Last Page:65
Parent Title (English):Proceedings of MLHPC 2019: 5th Workshop on Machine Learning in HPC Environments
ISBN:978-1-7281-5985-0 (Online)
ISBN:978-1-7281-5986-7 (Print on Demand)
DOI:https://doi.org/10.1109/MLHPC49564.2019.00011
Language:English
Content Information
Institutes:Fakultät Elektrotechnik, Medizintechnik und Informatik (EMI) (since 04/2019)
Institutes:Bibliography
DDC classes:000 Generalities, computer science, information science
Formal Details
Open Access: Closed Access
Licence (German):Copyright protected (Urheberrechtlich geschützt)
Comment:Conference held in conjunction with SC19: The International Conference for High Performance Computing, Networking, Storage and Analysis, Denver, Colorado, November 17-22, 2019