Hyperparameter optimization for large-scale machine learning

Palacios Cuesta, Aitor (2018). Hyperparameter optimization for large-scale machine learning. Thesis (Master thesis), E.T.S. de Ingenieros Informáticos (UPM).


Title: Hyperparameter optimization for large-scale machine learning
Authors:
  • Palacios Cuesta, Aitor
  • Markl, Volker
  • Derakhshan, Behrouz
Item Type: Thesis (Master thesis)
Masters title: Data Science
Date: 2018
Faculty: E.T.S. de Ingenieros Informáticos (UPM)
Department: Other
Creative Commons License: Attribution - NonCommercial - NoDerivatives (CC BY-NC-ND)

Full text

PDF (759 kB)


Hyperparameter optimization is a crucial task affecting the final performance of machine learning solutions. This thesis analyzes the properties of different hyperparameter optimization algorithms for machine learning problems with large datasets. In hyperparameter optimization, we repeatedly propose a hyperparameter configuration, train a machine learning model using that configuration, and validate the performance of the resulting model. In order to scale in these scenarios, this thesis focuses on the data-parallel approach to evaluating hyperparameter configurations. We utilize Apache Spark, a data-parallel processing system, to implement a selection of hyperparameter optimization algorithms. Moreover, this thesis proposes a novel hyperparameter optimization method, Bayesian Geometric Halving (BGH). It is designed to benefit from the characteristics of data-parallel systems and the properties of existing hyperparameter optimization algorithms. BGH enhances Bayesian methods by combining them with adaptive evaluation. At the same time, its execution is time-efficient independently of the number of configurations evaluated. Various experiments compare the performance of several hyperparameter optimization algorithms. These experiments extend their theoretical comparison in order to understand under which conditions some algorithms perform better than others. The experiments show that BGH improves on the results of both Bayesian and adaptive evaluation methods. Indeed, one of the variants of BGH achieves the best result in every experiment where they are compared.
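The propose-train-validate loop and the adaptive-evaluation idea the abstract describes can be illustrated with a generic successive-halving sketch (this is not the thesis's BGH algorithm or its Spark implementation; the configuration space, `evaluate` stand-in, and halving factor `eta` are illustrative assumptions — in BGH the configurations would come from a Bayesian proposer rather than random sampling):

```python
import random

def evaluate(config, budget, rng):
    # Stand-in for training a model with `config` under a resource
    # `budget` (e.g. a subsample of a large dataset) and returning a
    # validation loss; larger budgets give less noisy estimates.
    x = config["learning_rate"]
    return (x - 0.1) ** 2 + rng.uniform(0.0, 1.0 / budget)

def successive_halving(n_configs=16, min_budget=1, eta=2, seed=0):
    """Evaluate many configurations on a small budget, keep the best
    1/eta fraction, grow the budget geometrically, and repeat until a
    single configuration survives."""
    rng = random.Random(seed)
    configs = [{"learning_rate": rng.uniform(1e-4, 1.0)}
               for _ in range(n_configs)]
    budget = min_budget
    while len(configs) > 1:
        scored = [(evaluate(c, budget, rng), c) for c in configs]
        scored.sort(key=lambda s: s[0])           # lower loss is better
        keep = max(1, len(configs) // eta)
        configs = [c for _, c in scored[:keep]]   # discard poor configs early
        budget *= eta                             # geometric budget growth
    return configs[0]

best = successive_halving()
print(best)
```

Because cheap low-budget evaluations of many configurations are independent, each round maps naturally onto a data-parallel system such as Spark, which is the property the thesis exploits.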

More information

Item ID: 57815
DC Identifier: http://oa.upm.es/57815/
OAI Identifier: oai:oa.upm.es:57815
Deposited by: Biblioteca Facultad de Informatica
Deposited on: 31 Jan 2020 08:43
Last Modified: 31 Jan 2020 08:43