Online Optimization With Costly and Noisy Measurements Using Random Fourier Expansions
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 29, No. 1, pp. 167-182
Format: Journal Article
Language: English
Published: IEEE, United States, 01.01.2018
Summary: This paper analyzes the data-based online nonlinear extremum-seeker (DONE), an online optimization algorithm that iteratively minimizes an unknown function based on costly and noisy measurements. The algorithm maintains a surrogate of the unknown function in the form of a random Fourier expansion. The surrogate is updated whenever a new measurement becomes available and is then used to determine the next measurement point. The algorithm is comparable to Bayesian optimization algorithms, but its computational complexity per iteration does not depend on the number of measurements. We derive several theoretical results that provide insight into how the hyperparameters of the algorithm should be chosen. The algorithm is compared with a Bayesian optimization algorithm on an analytic benchmark problem and three applications, namely, optical coherence tomography, optical beam-forming network tuning, and robot arm control. The DONE algorithm is found to be significantly faster than Bayesian optimization on these problems while achieving similar or better performance.
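The sketch below is a minimal, illustrative reading of the idea in the summary, not the authors' implementation: a surrogate built from random Fourier features is updated one measurement at a time with a recursive least-squares step (an assumption here; the summary only states that the per-iteration cost does not depend on the number of measurements) and is then minimized to pick the next measurement point. All names (`RFESurrogate`, `noisy_objective`) and numerical settings are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize


class RFESurrogate:
    """Random Fourier expansion surrogate fit with recursive least squares."""

    def __init__(self, dim, n_features=100, sigma=1.0, reg=1e-3, seed=None):
        rng = np.random.default_rng(seed)
        # Fixed random frequencies and phases define the Fourier basis.
        self.W = rng.normal(scale=sigma, size=(n_features, dim))
        self.b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
        self.c = np.zeros(n_features)      # expansion coefficients
        self.P = np.eye(n_features) / reg  # RLS inverse-covariance matrix

    def _features(self, x):
        return np.cos(self.W @ np.asarray(x, dtype=float) + self.b)

    def update(self, x, y):
        # One recursive least-squares step: O(n_features^2) work per new
        # measurement, independent of how many measurements came before.
        a = self._features(x)
        Pa = self.P @ a
        gain = Pa / (1.0 + a @ Pa)
        self.c += gain * (y - a @ self.c)
        self.P -= np.outer(gain, Pa)

    def predict(self, x):
        return self._features(x) @ self.c


def noisy_objective(x):
    # Stand-in for a costly, noisy measurement of the unknown function.
    return float(np.sum((x - 0.5) ** 2) + 0.01 * np.random.randn())


surrogate = RFESurrogate(dim=2, n_features=100, sigma=2.0, seed=0)
x = np.zeros(2)
for _ in range(50):
    surrogate.update(x, noisy_objective(x))      # incorporate new measurement
    result = minimize(surrogate.predict, x, method="L-BFGS-B",
                      bounds=[(-1.0, 1.5)] * 2)  # minimize the surrogate
    x = result.x + 0.05 * np.random.randn(2)     # small perturbation for exploration
print("estimated minimizer:", x)
```

Because the fitted basis has a fixed size, each update costs the same regardless of how many measurements have been collected; a Gaussian-process surrogate, as used in typical Bayesian optimization, instead has a per-iteration cost that grows with the number of measurements, which is the comparison the summary refers to.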
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2016.2615134