Improving on Stochastic Bayesian Adaptive Direct Search

SINGH, GURJEET
2021/2022

Abstract

Black Box Optimization (BBO) has attracted growing interest in the optimization community, driven by applications in computational neuroscience, machine learning, materials design, and computational biology. The goal of BBO is to find a global minimum of an expensive, possibly noisy, non-smooth function without gradient information, under resource constraints such as a maximum number of function evaluations. Among recent contributions to this field, the Bayesian Adaptive Direct Search (BADS) algorithm proposed a hybrid Bayesian Optimization (BO) method that combines the Mesh Adaptive Direct Search (MADS) method with a Gaussian Process surrogate, finding solutions comparable to or better than those of existing state-of-the-art non-convex derivative-free optimizers. In this thesis, we provide an extended study of BADS by presenting PyBADS, a Python port of the existing MATLAB implementation of BADS. We also implement a new approach based on the Stochastic Mesh Adaptive Direct Search (Sto-MADS) and integrate it into the existing BADS framework. This integration provides a new way of proving convergence of the algorithm towards stationary points for non-smooth stochastic functions, and the method is included in the PyBADS library. Moreover, we developed a generic black-box optimization benchmark for testing and evaluating different optimizers. In this work, we use it to assess and compare PyBADS and BADS on a range of black-box optimization problems, designing an evaluation method with high statistical power for assessing optimizer performance. The benchmarking results show that PyBADS achieves the same, competitive outcomes as BADS, which confirms the correctness of the port.
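To give a concrete picture of the interface such a port exposes, the sketch below shows how PyBADS can be run on a small noisy toy objective. This is a minimal illustration, assuming the interface of the publicly released PyBADS package (the BADS class, its bound arguments, and the optimize() method); the exact names and return format may differ from the version described in the thesis, and the noisy_sphere target is a made-up example.

import numpy as np
from pybads import BADS  # assumed import path of the public PyBADS package

def noisy_sphere(x):
    # Toy black-box objective: noisy, non-smooth to the optimizer, no gradients available.
    x = np.atleast_1d(x)
    return float(np.sum(x**2) + 0.1 * np.random.randn())

x0 = np.array([2.0, 2.0])                        # starting point
lb, ub = np.full(2, -5.0), np.full(2, 5.0)       # hard bounds
plb, pub = np.full(2, -2.0), np.full(2, 2.0)     # plausible region for the optimum

bads = BADS(noisy_sphere, x0, lb, ub, plb, pub)  # assumed constructor signature
result = bads.optimize()                         # runs the BADS search/poll loop
print(result['x'], result['fval'])               # best point found and its estimated value
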
Keywords

Direct Search
Optimization
Bayesian
Gaussian Process
Bayesian Optimization
Files in this record:

File: Singh_Gurjeet.pdf (open access)
Description: Improving on Stochastic Bayesian Adaptive Direct Search
Size: 22.15 MB
Format: Adobe PDF


Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12608/42162