DOI: https://doi.org/10.14483/23448393.19815
Published: 2022-08-12
Issue: Vol. 27 No. 3 (2022): September-December
Section: Editorial

A Possible Classification for Metaheuristic Optimization Algorithms in Engineering and Science
References
[1] L. Abualigah, M. A. Elaziz, A. M. Khasawneh, M. Alshinwan, R. A. Ibrahim, M. A. A. Al-qaness, S. Mirjalili, P. Sumari, and A. H. Gandomi, "Meta-heuristic optimization algorithms for solving real-world mechanical engineering design problems: a comprehensive survey, applications, comparative analysis, and results," Neur. Comp. App., vol. 34, no. 6, pp. 4081–4110, Jan. 2022. DOI: https://doi.org/10.1007/s00521-021-06747-4
[2] R. Sioshansi and A. J. Conejo, Optimization in Engineering. New York, NY, USA: Springer, 2017. DOI: https://doi.org/10.1007/978-3-319-56769-3_1
[3] A. Kumar, G. Wu, M. Z. Ali, R. Mallipeddi, P. N. Suganthan, and S. Das, "A test-suite of non-convex constrained optimization problems from the real-world and some baseline results," Swarm Evol. Comp., vol. 56, p. 100693, Aug. 2020.
[4] M. Abdel-Basset, L. Abdel-Fatah, and A. K. Sangaiah, "Metaheuristic algorithms: A comprehensive review," in Computational Intelligence for Multimedia Big Data on the Cloud with Engineering Applications. Amsterdam, Netherlands: Elsevier, 2018, pp. 185–231.
[5] C. Venkateswarlu, "A metaheuristic tabu search optimization algorithm: Applications to chemical and environmental processes," in Optimization Problems in Engineering [Working Title]. IntechOpen, Jun. 2021.
[6] J. O. Agushaka and A. E. Ezugwu, "Initialisation approaches for population-based metaheuristic algorithms: A comprehensive review," App. Sci., vol. 12, no. 2, p. 896, Jan. 2022. DOI: https://doi.org/10.3390/app12020896
[7] K. Dahal, S. Remde, P. Cowling, and N. Colledge, "Improving metaheuristic performance by evolving a variable fitness function," in Evolutionary Computation in Combinatorial Optimization. Berlin, Heidelberg, Germany: Springer, pp. 170–181.
[8] S. Katoch, S. S. Chauhan, and V. Kumar, "A review on genetic algorithm: past, present, and future," Multimedia Tools and Applications, vol. 80, no. 5, pp. 8091–8126, Oct. 2020. DOI: https://doi.org/10.1007/s11042-020-10139-6
[9] A. B. Gabis, Y. Meraihi, S. Mirjalili, and A. Ramdane-Cherif, "A comprehensive survey of sine cosine algorithm: variants and applications," Artificial Intelligence Review, vol. 54, no. 7, pp. 5469–5540, Jun. 2021. DOI: https://doi.org/10.1007/s10462-021-10026-y
[10] Y. Zhang, Z. Jin, and S. Mirjalili, "Generalized normal distribution optimization and its applications in parameter extraction of photovoltaic models," Energy Conversion and Management, vol. 224, p. 113301, Nov. 2020. DOI: https://doi.org/10.1016/j.enconman.2020.113301
[11] A. Biswas, K. K. Mishra, S. Tiwari, and A. K. Misra, "Physics-inspired optimization algorithms: A survey," Journal of Optimization, vol. 2013, pp. 1–16, 2013. DOI: https://doi.org/10.1155/2013/438152
[12] S. Q. Salih and A. A. Alsewari, "A new algorithm for normal and large-scale optimization problems: Nomadic people optimizer," Neural Computing and Applications, vol. 32, no. 14, pp. 10359–10386, Oct. 2019.
The area of optimization in engineering and science evolves because of the human need to solve real-life problems efficiently and in a structured way [1]. The optimization models that represent these real-life problems can take different forms, from linear programming to mixed-integer nonlinear programming models [2]. In addition, mathematical models in optimization can have a single objective function or multiple objectives in conflict. Due to the complexity of some optimization problems in engineering and science, exact optimization methods are inefficient for several reasons [3]: (i) the large dimensions of the solution spaces, (ii) non-linearities and non-convexities in the objective functions and/or constraints, and (iii) the failure to guarantee that the global optimum is reached. Moreover, for most of these optimization problems, processing times grow non-polynomially, which is why most of them belong to the NP-hard family. These complications make it necessary to employ alternative solution methods for addressing multiple programming models, with the aim of finding suitable solutions (local optima) that require low computational effort and are implementable in any programming environment [4]. These optimization algorithms are known as metaheuristic or combinatorial optimization methods. Their main characteristic is that they begin exploring the solution space from an initial solution (a single solution or a group of solutions), which advances during the iteration process according to different evolution rules. Algorithms that use a single solution are known as trajectory-based optimization algorithms [5], while methods that work with a group of solutions are known as population-based optimizers [6].
Metaheuristic optimization methodologies are well accepted in science and engineering because they work directly with the exact problem formulation, using penalty factors to explore and exploit the solution space. Their main advantage is that infeasible solutions can be used during the evolution process in order to find promising solution regions with excellent objective function values [7].
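As a minimal illustration of this penalty-factor idea (the constrained problem, penalty weight, and code below are illustrative assumptions, not taken from the cited works), a penalized fitness function allows infeasible candidates to be ranked and explored instead of discarded:

# Minimal sketch: penalty-based fitness for a hypothetical constrained problem.
# Minimize f(x) = x1^2 + x2^2 subject to g(x) = x1 + x2 - 1 >= 0.

def objective(x):
    return x[0] ** 2 + x[1] ** 2

def constraint_violation(x):
    # Amount by which g(x) >= 0 is violated (zero when the point is feasible).
    return max(0.0, 1.0 - (x[0] + x[1]))

def penalized_fitness(x, penalty=1e3):
    # Infeasible candidates are not rejected; they are penalized, which lets the
    # search traverse infeasible regions on its way to promising feasible areas.
    return objective(x) + penalty * constraint_violation(x)

print(penalized_fitness([0.6, 0.6]))  # feasible point: 0.72
print(penalized_fitness([0.1, 0.1]))  # infeasible point: 0.02 + 1000 * 0.8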
Depending on the philosophy that inspires them, metaheuristic optimizers can be grouped into many different families. Here, we present a possible classification of the most common combinatorial optimization methods applied in science and engineering.
Bio-inspired algorithms
These algorithms are inspired by the biological processes and behaviors of living beings in nature. Genetic algorithms, inspired by Darwin's theory of evolution, are the most classical examples of their kind. They work with an initial population of parents that evolves during the iteration process by applying three main operators: (i) selection, (ii) recombination, and (iii) mutation. These operators yield offspring that join the population if they improve upon the worst objective function values of some parents and are different from them. In the current literature, the most widely implemented version of genetic algorithms is the Chu and Beasley algorithm, given its computational efficiency in terms of processing times, as it replaces only one individual in the population at each iteration [8].
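A minimal sketch of this selection-recombination-mutation cycle with a Chu-Beasley-style replacement (only one individual replaced per iteration) is shown below; the bit-counting objective and the parameter values are illustrative assumptions rather than part of the editorial:

import random

# Illustrative objective: maximize the number of ones in a binary string.
def fitness(ind):
    return sum(ind)

def tournament(pop, k=2):
    # Selection: the best of k randomly chosen individuals becomes a parent.
    return max(random.sample(pop, k), key=fitness)

def crossover(p1, p2):
    # Recombination: single-point crossover.
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:]

def mutate(ind, rate=0.05):
    # Mutation: flip each bit with a small probability.
    return [1 - g if random.random() < rate else g for g in ind]

def chu_beasley_ga(n_bits=20, pop_size=30, iters=500):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(iters):
        child = mutate(crossover(tournament(pop), tournament(pop)))
        worst = min(range(pop_size), key=lambda i: fitness(pop[i]))
        # Chu-Beasley-style replacement: the offspring enters the population only
        # if it improves on the worst individual and is not already present.
        if fitness(child) > fitness(pop[worst]) and child not in pop:
            pop[worst] = child
    return max(pop, key=fitness)

print(fitness(chu_beasley_ga()))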
Bio-inspired algorithms may also be based on the behavior of groups of living beings searching for food. Some of the most recognized methodologies are (i) particle swarm optimization (flocks of birds and schools of fish), (ii) the crow search algorithm (flocks of crows), (iii) the salp swarm algorithm (chains of salps), (iv) the whale optimization algorithm (pods of whales), and (v) the ant lion and ant colony optimizers (colonies of ants).
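As an illustration of how these population-based methods move a group of candidate solutions, the canonical particle swarm update (inertia plus attraction towards personal and global bests) can be sketched as follows; the sphere objective and coefficient values are illustrative assumptions:

import random

def sphere(x):
    # Illustrative objective to be minimized.
    return sum(v * v for v in x)

def pso(dim=2, swarm=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    X = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    V = [[0.0] * dim for _ in range(swarm)]
    P = [x[:] for x in X]              # personal best positions
    g = min(P, key=sphere)[:]          # global best position
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia + cognitive + social components.
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (g[d] - X[i][d]))
                X[i][d] += V[i][d]
            if sphere(X[i]) < sphere(P[i]):
                P[i] = X[i][:]
                if sphere(P[i]) < sphere(g):
                    g = P[i][:]
    return g

print(sphere(pso()))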
Mathematics-inspired algorithms
These optimization algorithms correspond to approaches that are based on nonlinear functions, numerical methods, statistical behaviors, or distributed permutation flow [9], [10]. They exploit the properties of some well-known functions and statistical distributions in order to model different behaviors present in nature. The most common methods are (i) the sine-cosine algorithm, (ii) the gradient-based metaheuristic optimizer, (iii) the Newton-metaheuristic algorithm, (iv) the generalized normal distribution algorithm, and (v) tabu search, among others.
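As a hint of how a well-known function can drive the search, the position update of the sine-cosine algorithm oscillates the candidates around the best solution found so far; the following sketch uses an illustrative sphere objective and standard parameter ranges, which are assumptions rather than content of the editorial:

import math
import random

def sphere(x):
    return sum(v * v for v in x)

def sine_cosine(dim=2, agents=20, iters=300, a=2.0):
    X = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(agents)]
    best = min(X, key=sphere)[:]
    for t in range(iters):
        r1 = a - t * (a / iters)   # amplitude decreases linearly over the iterations
        for i in range(agents):
            for d in range(dim):
                r2 = random.uniform(0, 2 * math.pi)
                r3 = random.uniform(0, 2)
                # Oscillate around the best-known solution using sine or cosine.
                if random.random() < 0.5:
                    X[i][d] += r1 * math.sin(r2) * abs(r3 * best[d] - X[i][d])
                else:
                    X[i][d] += r1 * math.cos(r2) * abs(r3 * best[d] - X[i][d])
            if sphere(X[i]) < sphere(best):
                best = X[i][:]
    return best

print(sphere(sine_cosine()))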
Physics-inspired algorithms
This family of algorithms is based on behaviors observed in nature that are not related to biological processes. In general, they are based on physical observation and experimentation [11]. Some of the most recognized and widely used physics-inspired optimization algorithms are (i) the gravitational search algorithm, (ii) the black hole optimizer, (iii) the supernova optimizer, (iv) the vortex search algorithm, (v) the charged system search algorithm, (vi) the galaxy-based search algorithm, (vii) the multiverse optimization algorithm, and (viii) the simulated annealing algorithm.
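Among these, simulated annealing is probably the most familiar; a minimal sketch of its temperature-controlled acceptance rule is given below, where the one-dimensional objective, neighborhood, and cooling schedule are illustrative assumptions:

import math
import random

def objective(x):
    # Illustrative one-dimensional function to be minimized.
    return x * x + 10 * math.sin(x)

def simulated_annealing(x=5.0, temp=10.0, cooling=0.99, iters=2000):
    best = x
    for _ in range(iters):
        candidate = x + random.uniform(-0.5, 0.5)   # random neighbor
        delta = objective(candidate) - objective(x)
        # Always accept improvements; accept worse moves with a probability
        # that shrinks as the temperature cools down.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = candidate
        if objective(x) < objective(best):
            best = x
        temp *= cooling
    return best

print(objective(simulated_annealing()))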
Socially inspired algorithms
Socially inspired algorithms are search optimization methodologies that emulate human interactions in different environments [12]. These algorithms exploit the behavior of humans in groups in order to solve real-life problems and learn new skills. Some of the most common socially inspired algorithms are (i) the nomadic people optimizer, (ii) teaching-learning-based optimization, (iii) the socio-evolution and learning optimization algorithm, (iv) artificial memory optimization, (v) human mental search, and (vi) the cultural evolution algorithm.
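Teaching-learning-based optimization is a representative member of this family; a minimal sketch of its teacher and learner phases is shown below, with an illustrative sphere objective and search bounds assumed for the example:

import random

def sphere(x):
    return sum(v * v for v in x)

def tlbo(dim=2, learners=20, iters=200):
    X = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(learners)]
    for _ in range(iters):
        teacher = min(X, key=sphere)
        mean = [sum(x[d] for x in X) / learners for d in range(dim)]
        for i in range(learners):
            # Teacher phase: move towards the best learner relative to the class mean.
            tf = random.choice([1, 2])
            new = [X[i][d] + random.random() * (teacher[d] - tf * mean[d])
                   for d in range(dim)]
            if sphere(new) < sphere(X[i]):
                X[i] = new
            # Learner phase: learn from (or move away from) a random classmate.
            j = random.choice([k for k in range(learners) if k != i])
            sign = 1 if sphere(X[j]) < sphere(X[i]) else -1
            new = [X[i][d] + random.random() * sign * (X[j][d] - X[i][d])
                   for d in range(dim)]
            if sphere(new) < sphere(X[i]):
                X[i] = new
    return min(X, key=sphere)

print(sphere(tlbo()))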
Remark 1. In general, the use of metaheuristic optimization algorithms in science and engineering offers hundreds of possibilities across these different families. Furthermore, this research area is under continuous development: every year, the literature reports multiple new optimizers that try to emulate situations and processes observed in nature in order to reach optimal solutions for engineering and real-life problems.
License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
From issue V23N3 of 2018 onwards, the Creative Commons license changes from "Attribution - NonCommercial - NoDerivatives" to the following:
Attribution - NonCommercial - ShareAlike: this license allows others to distribute, remix, adapt, and build upon this work non-commercially, as long as they credit the author and license their new creations under identical terms.