Bayesian optimization; Evolutionary algorithms; Parallel computing; Surrogate-based optimization; Bayesian; Bayesian optimization algorithms; Benchmark functions; Black boxes; Computational budget; Evolutionary approach; Objective functions; Control and Systems Engineering; Artificial Intelligence; Electrical and Electronic Engineering
Abstract :
[en] Parallel Surrogate-Based Optimization (PSBO) is an efficient approach for dealing with time-consuming black-box objective functions. Depending on the computational budget available to solve a given problem, three classes of algorithms are investigated and compared in this paper: Bayesian Optimization Algorithms (BOAs), Surrogate-Assisted Evolutionary Algorithms (SAEAs) and Surrogate-free Evolutionary Algorithms (EAs). A large set of benchmark functions and engineering applications is considered under various computational budgets. We propose guidelines for choosing between the three categories: based on the computational cost of the objective function and the number of processing cores, we identify a threshold beyond which SAEAs should be preferred to BOAs. Building on this threshold, we derive a new hybrid Bayesian/Evolutionary algorithm that can tackle a wide range of problems without prior knowledge of their characteristics.
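To make the idea of a budget-driven switch concrete, the following minimal Python sketch illustrates selecting an optimizer class from the per-evaluation cost and the number of cores. The function name, threshold value and decision rule are purely hypothetical illustrations of the concept described in the abstract; they do not reproduce the authors' hybrid algorithm or the threshold identified in the paper.

```python
# Illustrative sketch only: a budget-aware selector between a Bayesian
# Optimization step and a Surrogate-Assisted EA step. The rule and the
# constant below are hypothetical and do NOT reproduce the paper's
# hybrid algorithm; they only illustrate switching strategies according
# to per-evaluation cost and available parallelism.

def choose_strategy(eval_time_s: float, n_cores: int,
                    cost_threshold_s: float = 60.0) -> str:
    """Pick an optimizer class from per-evaluation cost and core count.

    Very expensive objectives favour a Bayesian Optimization Algorithm
    (few, carefully chosen evaluations); cheaper objectives evaluated on
    many cores favour a Surrogate-Assisted EA (large parallel batches).
    """
    if eval_time_s >= cost_threshold_s or n_cores <= 2:
        return "BOA"   # spend acquisition effort on each candidate point
    return "SAEA"      # rely on cheap, massively parallel batches


if __name__ == "__main__":
    # A 5-minute simulation on 16 cores vs. a 2-second benchmark on 64 cores.
    print(choose_strategy(eval_time_s=300.0, n_cores=16))  # -> BOA
    print(choose_strategy(eval_time_s=2.0, n_cores=64))    # -> SAEA
```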
Disciplines :
Computer science
Author, co-author :
Gobert, Maxime ; Université de Mons - UMONS > Faculté Polytechnique > Service de Mathématique et Recherche opérationnelle
Briffoteaux, Guillaume ; Université de Mons - UMONS > Faculté Polytechnique > Service de Mathématique et Recherche opérationnelle
Gmys, Jan ; University of Lille - CNRS CRIStAL, Villeneuve d'Ascq, France
Melab, Nouredine ; University of Lille - CNRS CRIStAL, Villeneuve d'Ascq, France ; Inria Lille - Nord Europe, BONUS Team, Lille, France
Tuyttens, Daniel ; Université de Mons - UMONS > Faculté Polytechnique > Service de Mathématique et Recherche opérationnelle
Language :
English
Title :
Observations in applying Bayesian versus evolutionary approaches and their hybrids in parallel time-constrained optimization
Publication date :
November 2024
Journal title :
Engineering Applications of Artificial Intelligence
Experiments presented in this paper were carried out using the Grid'5000 testbed, supported by a scientific interest group hosted by Inria and including CNRS, RENATER and several Universities as well as other organizations (see https://www.grid5000.fr).