Correcting Boundary Over-Exploration Deficiencies In Bayesian Optimization With Virtual Derivative Sign Observations

Eero Siivola, Aki Vehtari, Jarno Vanhatalo, Javier Gonzalez, Michael Andersen

Research output: Chapter in Book/Report/Conference proceeding › Conference article › Scientific › peer-reviewed

Abstract

Bayesian optimization (BO) is a global optimization strategy designed to find the minimum of an expensive black-box function, typically defined on a compact subset of ℝ^d, by using a Gaussian process (GP) as a surrogate model for the objective. Although currently available acquisition functions address this goal with different degrees of success, an over-exploration effect of the contour of the search space is typically observed. However, in problems like the configuration of machine learning algorithms, the function domain is conservatively large, and with high probability the global minimum does not sit on the boundary of the domain. We propose a method to incorporate this knowledge into the search process by adding virtual derivative observations to the GP at the boundary of the search space. We use the properties of GPs to impose conditions on the partial derivatives of the objective. The method is applicable with any acquisition function, is easy to use, and consistently reduces the number of evaluations required to optimize the objective, irrespective of the acquisition used. We illustrate the benefits of our approach in an extensive experimental comparison.
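The boundary-slope idea from the abstract can be sketched in a few lines of numpy. This is an illustrative simplification, not the paper's implementation: the paper uses probit-likelihood derivative *sign* observations handled with approximate inference, while this 1-D sketch uses Gaussian virtual derivative *value* observations, which admit exact GP conditioning. All hyperparameters, data points, and function names here are assumptions for illustration.

```python
import numpy as np

# Squared-exponential kernel with illustrative hyperparameters.
ell, sf2 = 0.2, 1.0

def k(a, b):
    """cov(f(a), f(b))."""
    d = a[:, None] - b[None, :]
    return sf2 * np.exp(-0.5 * d**2 / ell**2)

def k_fd(a, b):
    """cov(f(a), f'(b)): the kernel differentiated w.r.t. its second argument."""
    d = a[:, None] - b[None, :]
    return k(a, b) * d / ell**2

def k_dd(a, b):
    """cov(f'(a), f'(b)): the kernel differentiated w.r.t. both arguments."""
    d = a[:, None] - b[None, :]
    return k(a, b) * (1.0 / ell**2 - d**2 / ell**4)

def posterior(Xf, yf, Xd, yd, Xs, sn2=1e-4, sd2=1e-2):
    """GP posterior mean/variance at Xs, given value observations (Xf, yf)
    and virtual derivative observations (Xd, yd) placed on the boundary."""
    K = np.block([[k(Xf, Xf) + sn2 * np.eye(len(Xf)), k_fd(Xf, Xd)],
                  [k_fd(Xf, Xd).T, k_dd(Xd, Xd) + sd2 * np.eye(len(Xd))]])
    Ks = np.hstack([k(Xs, Xf), k_fd(Xs, Xd)])
    y = np.concatenate([yf, yd])
    mu = Ks @ np.linalg.solve(K, y)
    var = np.diag(k(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T))
    return mu, var

# Three real evaluations inside [0, 1] ...
Xf = np.array([0.3, 0.5, 0.7]); yf = np.array([0.4, -0.2, 0.5])
# ... plus virtual slopes pointing into the domain at both edges,
# encoding the prior belief "the minimum is not on the boundary".
Xd = np.array([0.0, 1.0]);      yd = np.array([-1.0, 1.0])

Xs = np.linspace(0.0, 1.0, 11)
mu_v, var_v = posterior(Xf, yf, Xd, yd, Xs)                    # with virtual obs
mu_0, var_0 = posterior(Xf, yf, np.empty(0), np.empty(0), Xs)  # plain GP
```

The virtual observations tilt the posterior mean upward toward the edges and shrink the predictive variance there, so any acquisition function built on this posterior is discouraged from over-exploring the boundary; the acquisition itself is left unchanged, which is what makes the approach acquisition-agnostic.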
Original language: English
Title: 2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP)
Editors: N Pustelnik, Z Ma, ZH Tan, J Larsen
Number of pages: 6
Volume: 28
Publisher: IEEE
Publication date: November 2018
ISBN (electronic): 978-1-5386-5477-4
Status: Published - November 2018
MoE publication type: A4 Article in a conference publication
Event: IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING - Aalborg, Denmark
Duration: 17 September 2018 - 20 September 2018
Conference number: 28
http://mlsp2018.conwiz.dk/home.htm

Publication series

Name: IEEE International Workshop on Machine Learning for Signal Processing
Publisher: IEEE
ISSN (print): 2161-0363

Fields of science

  • 112 Statistics and probability
  • 113 Computer and information sciences

Cite this

Siivola, E., Vehtari, A., Vanhatalo, J., Gonzalez, J., & Andersen, M. (2018). Correcting Boundary Over-Exploration Deficiencies In Bayesian Optimization With Virtual Derivative Sign Observations. In N. Pustelnik, Z. Ma, ZH. Tan, & J. Larsen (Eds.), 2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP) (Vol. 28). (IEEE International Workshop on Machine Learning for Signal Processing). IEEE.
Siivola, Eero ; Vehtari, Aki ; Vanhatalo, Jarno ; Gonzalez, Javier ; Andersen, Michael. / Correcting Boundary Over-Exploration Deficiencies In Bayesian Optimization With Virtual Derivative Sign Observations. 2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP). Editor / N Pustelnik ; Z Ma ; ZH Tan ; J Larsen. Vol. 28. IEEE, 2018. (IEEE International Workshop on Machine Learning for Signal Processing).
@inproceedings{504dea88e3694472bbe73fe1731c5591,
title = "Correcting Boundary Over-Exploration Deficiencies In Bayesian Optimization With Virtual Derivative Sign Observations",
abstract = "Bayesian optimization (BO) is a global optimization strategy designed to find the minimum of an expensive black-box function, typically defined on a compact subset of ℝ^d, by using a Gaussian process (GP) as a surrogate model for the objective. Although currently available acquisition functions address this goal with different degrees of success, an over-exploration effect of the contour of the search space is typically observed. However, in problems like the configuration of machine learning algorithms, the function domain is conservatively large, and with high probability the global minimum does not sit on the boundary of the domain. We propose a method to incorporate this knowledge into the search process by adding virtual derivative observations to the GP at the boundary of the search space. We use the properties of GPs to impose conditions on the partial derivatives of the objective. The method is applicable with any acquisition function, is easy to use, and consistently reduces the number of evaluations required to optimize the objective, irrespective of the acquisition used. We illustrate the benefits of our approach in an extensive experimental comparison.",
keywords = "112 Statistics and probability, 113 Computer and information sciences, Bayesian optimization, Gaussian process, virtual derivative sign observation",
author = "Eero Siivola and Aki Vehtari and Jarno Vanhatalo and Javier Gonzalez and Michael Andersen",
year = "2018",
month = "11",
language = "English",
volume = "28",
series = "IEEE International Workshop on Machine Learning for Signal Processing",
publisher = "IEEE",
editor = "N Pustelnik and Z Ma and ZH Tan and J Larsen",
booktitle = "2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP)",
address = "International",

}

Siivola, E, Vehtari, A, Vanhatalo, J, Gonzalez, J & Andersen, M 2018, Correcting Boundary Over-Exploration Deficiencies In Bayesian Optimization With Virtual Derivative Sign Observations. in N Pustelnik, Z Ma, ZH Tan & J Larsen (eds), 2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP). Vol. 28, IEEE International Workshop on Machine Learning for Signal Processing, IEEE, IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING, Aalborg, Denmark, 17/09/2018.

Correcting Boundary Over-Exploration Deficiencies In Bayesian Optimization With Virtual Derivative Sign Observations. / Siivola, Eero; Vehtari, Aki; Vanhatalo, Jarno; Gonzalez, Javier; Andersen, Michael.

2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP). Ed. / N Pustelnik; Z Ma; ZH Tan; J Larsen. Vol. 28. IEEE, 2018. (IEEE International Workshop on Machine Learning for Signal Processing).


TY - GEN

T1 - Correcting Boundary Over-Exploration Deficiencies In Bayesian Optimization With Virtual Derivative Sign Observations

AU - Siivola, Eero

AU - Vehtari, Aki

AU - Vanhatalo, Jarno

AU - Gonzalez, Javier

AU - Andersen, Michael

PY - 2018/11

Y1 - 2018/11

N2 - Bayesian optimization (BO) is a global optimization strategy designed to find the minimum of an expensive black-box function, typically defined on a compact subset of ℝ^d, by using a Gaussian process (GP) as a surrogate model for the objective. Although currently available acquisition functions address this goal with different degrees of success, an over-exploration effect of the contour of the search space is typically observed. However, in problems like the configuration of machine learning algorithms, the function domain is conservatively large, and with high probability the global minimum does not sit on the boundary of the domain. We propose a method to incorporate this knowledge into the search process by adding virtual derivative observations to the GP at the boundary of the search space. We use the properties of GPs to impose conditions on the partial derivatives of the objective. The method is applicable with any acquisition function, is easy to use, and consistently reduces the number of evaluations required to optimize the objective, irrespective of the acquisition used. We illustrate the benefits of our approach in an extensive experimental comparison.

AB - Bayesian optimization (BO) is a global optimization strategy designed to find the minimum of an expensive black-box function, typically defined on a compact subset of ℝ^d, by using a Gaussian process (GP) as a surrogate model for the objective. Although currently available acquisition functions address this goal with different degrees of success, an over-exploration effect of the contour of the search space is typically observed. However, in problems like the configuration of machine learning algorithms, the function domain is conservatively large, and with high probability the global minimum does not sit on the boundary of the domain. We propose a method to incorporate this knowledge into the search process by adding virtual derivative observations to the GP at the boundary of the search space. We use the properties of GPs to impose conditions on the partial derivatives of the objective. The method is applicable with any acquisition function, is easy to use, and consistently reduces the number of evaluations required to optimize the objective, irrespective of the acquisition used. We illustrate the benefits of our approach in an extensive experimental comparison.

KW - 112 Statistics and probability

KW - 113 Computer and information sciences

KW - Bayesian optimization

KW - Gaussian process

KW - virtual derivative sign observation

M3 - Conference contribution

VL - 28

T3 - IEEE International Workshop on Machine Learning for Signal Processing

BT - 2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP)

A2 - Pustelnik, N

A2 - Ma, Z

A2 - Tan, ZH

A2 - Larsen, J

PB - IEEE

ER -

Siivola E, Vehtari A, Vanhatalo J, Gonzalez J, Andersen M. Correcting Boundary Over-Exploration Deficiencies In Bayesian Optimization With Virtual Derivative Sign Observations. In Pustelnik N, Ma Z, Tan ZH, Larsen J, editors, 2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP). Vol. 28. IEEE. 2018. (IEEE International Workshop on Machine Learning for Signal Processing).