Bayesian optimization (BayesOpt) is a derivative-free approach for sequentially optimizing stochastic black-box functions. Standard BayesOpt, which has shown many successes in machine learning applications, assumes a finite-dimensional domain, which is often a parametric space. The parameter space is defined by the features used in the function approximations, which are often selected manually. Therefore, the performance of BayesOpt inevitably depends on the quality of the chosen features. This paper proposes a new Bayesian optimization framework that is able to optimize directly on the domain of function spaces. The resulting framework, Bayesian Functional Optimization (BFO), not only extends the application domains of BayesOpt to functional optimization problems but also relaxes the performance dependency on the chosen parameter space. We model the domain of functions as a reproducing kernel Hilbert space (RKHS), and use the notion of Gaussian processes on a real separable Hilbert space. As a result, we are able to define traditional improvement-based (PI and EI) and optimistic acquisition functions (UCB) as functionals. We propose to optimize the acquisition functionals using analytic functional gradients that are also proved to be functions in a RKHS. We evaluate BFO in three typical functional optimization tasks: i) a synthetic functional optimization problem, ii) optimizing activation functions for a multi-layer perceptron neural network, and iii) a reinforcement learning task whose policies are modeled in RKHS.
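As background for the standard (parametric, finite-dimensional) setting the abstract says BFO extends, the following is a minimal sketch of Bayesian optimization with a Gaussian-process surrogate and the Expected Improvement (EI) acquisition. It is not the paper's method: the RBF kernel, the 1-D grid search over the acquisition, and the toy objective are all illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

# Minimal sketch of standard parametric BayesOpt: a GP surrogate with an
# RBF kernel, and Expected Improvement (EI) maximized over a fixed grid.
# Kernel length scale, grid size, and the toy objective are assumptions.

def rbf_kernel(A, B, length=0.2):
    d = A[:, None] - B[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """GP posterior mean and std at test points Xs, given data (X, y)."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - np.sum(v * v, axis=0), 1e-12, None)  # prior var = 1
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best):
    """EI under a minimization convention: improvement below current best."""
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def bayes_opt(f, n_iter=15, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(0.0, 1.0, 3)            # small random initial design
    y = f(X)
    grid = np.linspace(0.0, 1.0, 200)       # candidate points on [0, 1]
    for _ in range(n_iter):
        mu, sigma = gp_posterior(X, y, grid)
        ei = expected_improvement(mu, sigma, y.min())
        x_next = grid[np.argmax(ei)]        # query point maximizing EI
        X = np.append(X, x_next)
        y = np.append(y, f(np.array([x_next])))
    return X[np.argmin(y)], y.min()

f = lambda x: (x - 0.3) ** 2                # toy objective, minimum at x = 0.3
x_best, y_best = bayes_opt(f)
```

BFO's departure from this loop, per the abstract, is that the search domain is an RKHS of functions rather than a finite-dimensional parameter vector, and the acquisition (here EI over a grid) becomes a functional optimized via analytic functional gradients.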
|Title of host publication||The Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-18)|
|Number of pages||8|
|Publication status||Published - 29 Apr 2018|
|Event||AAAI 2018 - New Orleans, United States|
Duration: 02 Feb 2018 → 07 Feb 2018
|Name||Proceedings of the AAAI Conference on Artificial Intelligence and the Innovative Applications of Artificial Intelligence Conference|