Article

Migration-Based Moth-Flame Optimization Algorithm

by Mohammad H. Nadimi-Shahraki 1,2,*, Ali Fatahi 1,2, Hoda Zamani 1,2, Seyedali Mirjalili 3,4,*, Laith Abualigah 5,6 and Mohamed Abd Elaziz 7,8,9,10

1 Faculty of Computer Engineering, Najafabad Branch, Islamic Azad University, Najafabad 8514143131, Iran
2 Big Data Research Center, Najafabad Branch, Islamic Azad University, Najafabad 8514143131, Iran
3 Centre for Artificial Intelligence Research and Optimisation, Torrens University Australia, Brisbane 4006, Australia
4 Yonsei Frontier Lab, Yonsei University, Seoul 03722, Korea
5 Faculty of Computer Sciences and Informatics, Amman Arab University, Amman 11953, Jordan
6 School of Computer Sciences, University Sains Malaysia, Pulau Pinang 11800, Malaysia
7 Department of Mathematics, Faculty of Science, Zagazig University, Zagazig 44519, Egypt
8 Artificial Intelligence Research Center (AIRC), Ajman University, Ajman 346, United Arab Emirates
9 Department of Artificial Intelligence Science & Engineering, Galala University, Suze 435611, Egypt
10 School of Computer Science and Robotics, Tomsk Polytechnic University, 634050 Tomsk, Russia
* Authors to whom correspondence should be addressed.
Submission received: 18 November 2021 / Revised: 6 December 2021 / Accepted: 14 December 2021 / Published: 18 December 2021
(This article belongs to the Special Issue Evolutionary Process for Engineering Optimization)

Abstract
Moth–flame optimization (MFO) is a prominent swarm intelligence algorithm that has shown sufficient efficiency in tackling various optimization tasks. However, MFO cannot provide competitive results for complex optimization problems; the algorithm sinks into local optima due to the rapid loss of population diversity and poor exploration. Hence, in this article, a migration-based moth–flame optimization (M-MFO) algorithm is proposed to address these issues. In M-MFO, the main focus is on improving the positions of unlucky moths by migrating them stochastically in the early iterations using a random migration (RM) operator, maintaining solution diversity by storing newly qualified solutions separately in a guiding archive, and, finally, exploiting around the positions saved in the guiding archive using a guided migration (GM) operator. The dimensionally aware switch between these two operators guarantees the convergence of the population toward promising zones. The proposed M-MFO was evaluated on the CEC 2018 benchmark suite in dimension 30 and compared against seven well-known variants of MFO, including LMFO, WCMFO, CMFO, CLSGMFO, LGCMFO, SMFO, and ODSFMFO. Then, the top four latest high-performing variants were considered for the main experiments in dimensions 30, 50, and 100. The experimental evaluations proved that the M-MFO provides sufficient exploration ability and maintains population diversity by employing the migration strategy and the guiding archive. In addition, the statistical results analyzed by the Friedman test proved that the M-MFO demonstrates competitive performance compared to the contender algorithms used in the experiments.

1. Introduction

During past decades, optimization techniques have been developed widely to solve complex problems that have emerged in different fields of science, such as engineering [1,2,3,4,5,6,7,8,9], clustering [10,11,12,13,14,15,16,17,18], feature selection [19,20,21,22,23,24,25,26,27,28], and task scheduling [29,30,31,32]. Such optimization problems mainly involve characteristics such as linear/non-linear constraints, non-differentiable functions, and a substantial number of decision variables. These characteristics make it almost impossible to solve such problems with exact methods in a reasonable time, so an effective approach is needed to tackle these complexities. Approximate algorithms are recognized as an effective approach for solving such problems because of their stochastic techniques and combined global and local search strategies. Although metaheuristic algorithms cannot guarantee the optimality of their solutions, they can offer near-optimal solutions in a reasonable amount of time, which helps solve real-world problems [33,34,35,36,37].
Metaheuristic algorithms mostly employ stochastic techniques to solve optimization problems by exploring the search space to promote population diversity in the early iterations. In the exploitation phase, the algorithm locally searches the promising areas to enhance the quality of the solutions discovered in the exploration phase. Striking a proper balance between these two tendencies leads the algorithm toward the global optimum after a limited number of iterations. Bio-inspired algorithms are a primary approach to solving optimization problems by employing biological concepts. In the literature, bio-inspired algorithms such as the genetic algorithm (GA) [38], differential evolution (DE) [39], particle swarm optimization (PSO) [40], and artificial bee colony (ABC) [41] have been used to find the optimum of optimization problems in polynomial time. Although these algorithms demonstrate satisfactory results for many problems, no single metaheuristic can solve all optimization problems, according to the no free lunch (NFL) theorem [42]. The NFL theorem is the main reason for continuous developments in the field of optimization. As a result, numerous bio-inspired algorithms have been developed by introducing novel methods.
To investigate bio-inspired algorithms comprehensively, they can be classified by their source of inspiration into evolutionary and swarm intelligence (SI) algorithms [43]. Natural biological evolution, reproduction, mutation, and Darwin's theory of evolution are the most used foundations for developing evolutionary optimization algorithms. The genetic algorithm (GA) [44], genetic programming (GP) [45], differential evolution (DE) [39], evolution strategy (ES) [46], and, from recent studies, the quantum-based avian navigation optimizer algorithm (QANA) [47] are some evolutionary algorithms. Over the past years, many variants have been developed to improve the performance of evolutionary algorithms, such as the enhanced genetic algorithm (EGA) [48], an ensemble of mutation strategies and control parameters with DE (EPSDE) [49], the real-coded genetic algorithm using a directional crossover operator (RGA-DX) [50], and an effective multi-trial vector-based differential evolution (MTDE) [51].
Swarm intelligence (SI) algorithms are grounded in the collective behavior of a group of biological organisms. SI algorithms can be divided into four categories: aquatic animals, terrestrial animals, birds, and insects [52]. The natural behavior of aquatic animals, such as prey besieging and foraging, has been mimicked in many SI algorithms, including the krill herd (KH) algorithm [53], whale optimization algorithm (WOA) [54], and salp swarm algorithm (SSA) [55]. Many researchers have simulated the biological behavior of terrestrial animals to propose functional metaheuristic algorithms, such as grey wolf optimizer (GWO) [41], red fox optimization algorithm (RFO) [56], chimp optimization algorithm (ChOA) [57], and horse herd optimization algorithm (HOA) [58]. In the third category, bat algorithm (BA) [59], cuckoo search algorithm (CS) [60], crow search algorithm (CSA) [61], and Aquila optimizer (AO) [62] are among the well-known algorithms inspired by birds’ behaviors. Social behaviors of insects, such as self-organization and cooperation, are the main sources of inspiration behind the fourth group of SI algorithms, including ant colony optimization (ACO) [63], artificial bee colony (ABC) [64], ant lion optimization (ALO) [65], dragonfly algorithm (DA) [66], and moth–flame optimization (MFO) [67].
The SI algorithms intrinsically benefit from autonomy, adaptability, and acceptable time complexity. However, loss of population diversity and sinking into local optima are common issues among most SI algorithms. Therefore, many variants have been proposed to address these shortcomings and enhance the performance of the algorithms. Karaboga et al. [68] introduced a quick artificial bee colony (qABC) algorithm to improve the exploitation ability of the traditional algorithm. The conscious neighborhood-based crow search algorithm (CCSA) [52] addresses the imbalance between exploration and exploitation. An improved grey wolf optimizer (I-GWO) [69] was proposed to maintain population diversity. An enhanced chimp optimization algorithm (EChOA) [70] was introduced to avoid local optima.
Moth–flame optimization (MFO) is a prominent bio-inspired metaheuristic algorithm inspired by the spiral movement of moths around a light source at night. The MFO algorithm stands out among many metaheuristic algorithms for its simplicity and acceptable time complexity. Therefore, the MFO has been used to solve a broad range of real-world problems, such as clustering [71,72,73,74,75,76,77], feature selection [78,79,80,81,82,83,84,85], and image processing [86,87,88,89,90,91]. Although the MFO is applicable to real-world problems and many improvements have been developed, it has been observed that the MFO and its variants inherently suffer from poor exploration and loss of population diversity before a near-optimal solution is met, which leads the algorithm toward local optima trapping and premature convergence.
In this study, an enhanced MFO algorithm, named the migration-based moth–flame optimization (M-MFO) algorithm, is proposed to cope with these weaknesses. The M-MFO introduces a guiding archive to maintain population diversity and a simple hybrid strategy, named the migration strategy, which consists of two operators, random migration (RM) and guided migration (GM), that take advantage of an adapted crossover introduced in the GA [44]. The RM operator is introduced to enhance the exploration ability and population diversity by crossing unlucky moths with a randomly generated moth so that they migrate to new areas. If the migrated moths obtain better positions, they are updated and added to the guiding archive to guide other unlucky moths. When the guiding archive size reaches the number of problem variables, the archive is mature enough to guide other unlucky moths using the GM operator. This dimensionally aware switch between operators can guarantee the convergence of the algorithm toward promising areas.
To evaluate the efficiency of the M-MFO, experiments were conducted on the CEC 2018 benchmark functions to investigate the characteristics and performance of the proposed algorithm and its competitors in dimensions 30, 50, and 100. The convergence curves and population diversity plots provided in Section 5 show that the M-MFO can maintain population diversity until the optimal solution emerges and effectively facilitates the convergence behavior. Moreover, the Friedman test was conducted to evaluate the obtained results statistically. The experimental and statistical results were first compared with seven well-known variants of MFO, including LMFO [92], WCMFO [93], CMFO [94], CLSGMFO [95], LGCMFO [96], SMFO [97], and ODSFMFO [98], in dimension 30. Then, the top four algorithms and eight other state-of-the-art swarm intelligence algorithms were considered for the main experiments. Hence, the competitors for the main experiments were KH [53], GWO [41], MFO [67], WOA [54], WCMFO [93], CMFO [94], HGSO [99], RGA-DX [50], ChOA [57], AOA [100], and ODSFMFO [98]. The experimental evaluations and statistical tests revealed that the M-MFO algorithm outperforms the other competitor algorithms with an overall effectiveness of 91%. The experimental results also revealed that the migration strategy enhances the exploration ability and maintains the population diversity to avoid local optima by stochastically migrating the worst individuals across the search space in the first iterations and exploiting the promising areas discovered by the RM operator in later iterations. The main contributions of this study are summarized as follows.
  • Introducing a guiding archive for storing improved moths to guide other unlucky moths.
  • Introducing a migration strategy using RM and GM operators to improve unlucky moths.
  • The RM operator enhances the exploration ability, while the GM operator converges the population toward the promising areas by exploiting around improved moths.
  • The experiments prove that the M-MFO effectively maintains the population diversity by taking advantage of the guiding archive.
  • The Friedman test demonstrated that the M-MFO provides the best results compared to competitors and stands out among MFO variants for solving global optimization problems.
The remainder of the paper is organized as follows. A literature overview of the MFO variants is included in Section 2. Section 3 briefly presents the MFO algorithm. Section 4 comprehensively presents the proposed M-MFO algorithm. A rigorous examination of the effectiveness of the proposed algorithm is provided experimentally in Section 5 and statistically in Section 6. Finally, Section 7 summarizes the conclusions.

2. Related Work

The MFO algorithm is known as a prominent problem solver due to its simple framework, few control parameters, and ease of implementation. However, the MFO suffers from some issues when solving complex optimization problems. Therefore, since the release of the MFO, many variants have been developed to address its shortcomings and offer improved performance. These variants can be categorized into hybrid and non-hybrid improvements, as illustrated in Figure 1.
Since the introduction of MFO, many researchers have proposed hybrid variants that address shortcomings of the canonical MFO by employing operators of other algorithms. Bhesdadiya et al. [101] introduced a hybrid PSO-MFO algorithm by combining particle swarm optimization (PSO) with MFO to boost the exploitation ability of the MFO algorithm. MFO-LSSVM [102] is a hybridization of MFO with least squares support vector machines (LSSVM) to enhance the generalization of predictions made with the MFO algorithm. To boost the exploitation ability of the MFO, Sarma et al. [103] introduced the gravitational search algorithm (GSA) into the canonical MFO and proposed MFOGSA. In WCMFO, Khalilpourazari et al. [93] combined MFO, the water cycle algorithm (WCA), and a random walk to avoid local optima and enhance solution quality. Rezk et al. [104] designed a hybrid MPPT method by combining an incremental conductance (INC) approach and MFO, called INC-MFO, to reach maximum power in a solar PV/thermoelectric system under different environmental conditions.
Ullah et al. [105] introduced a time-constrained genetic moth–flame optimization (TG-MFO) algorithm for an energy management system (EMS) in smart homes and buildings. The FCHMD [106] algorithm combines the Harris hawks optimizer (HHO) and MFO to cope with the insufficient exploitation and exploration rates of the HHO and MFO, respectively. Moreover, the method of evolutionary population dynamics (EPD) is employed to address premature convergence and local optima stagnation. ODSFMFO, proposed by Li et al. [98], is a hybridization of MFO with differential evolution (DE) and the shuffled frog leaping algorithm (SFLA); the algorithm is further enhanced by a flame generation strategy and a death mechanism. Dang et al. [107] proposed a hybridization of MFO with three different methods, including the Taguchi method, fuzzy logic, and the response surface method, to solve the flexure hinge design problem. SMFO was recently proposed [97] to enhance the solution quality and convergence speed of the MFO by introducing the sine cosine strategy into the MFO algorithm.
The non-hybrid algorithms are mostly developed to cope with issues such as local optima trapping, premature convergence, the imbalance between search strategies, and poor local and global search abilities. The LMFO algorithm proposed by Li et al. [92] is an enhanced version of MFO that uses Lévy flight to address premature convergence and local optimum trapping by improving the population diversity. Apinantanakon et al. [108] established an opposition-based moth–flame optimization (OMFO) algorithm to evade local optima by boosting the exploration ability of the MFO. Xu et al. [109] addressed the MFO's low population diversity and introduced EMFO by taking advantage of the Gaussian mutation (GM). Li et al. [110] presented a multi-objective moth–flame optimization algorithm (MOMFA) to enhance water resource efficiency by maintaining population diversity and accelerating convergence, taking advantage of opposition-based learning and indicator-based learning selection mechanisms.
The CLSGMFO [95] presents an efficient chaotic mutative moth–flame-inspired algorithm that employs Gaussian mutation and chaotic local search to enhance the population diversity and exploitation rate, respectively. Chaos-enhanced moth–flame optimization (CMFO), proposed by Hongwei et al. [94], is an improved MFO algorithm that employs ten chaotic maps. Xu et al. [96] developed LGCMFO to enhance the global and local search abilities of the MFO and avoid local optima by employing new operators, such as Gaussian mutation (GM), Lévy mutation (LM), and Cauchy mutation (CM). In BFGSOLMFO, Zhang et al. [111] introduced orthogonal learning (OL) and Broyden–Fletcher–Goldfarb–Shanno (BFGS) updates into the MFO to enhance its solution quality. Nadimi-Shahraki et al. [112] proposed an improved moth–flame optimization (I-MFO) algorithm that evades local optima trapping and premature convergence by adding a memory mechanism and taking advantage of an adapted wandering around search (AWAS) strategy; this algorithm is designed to solve numerical and mechanical engineering problems.

3. Moth–Flame Optimization (MFO) Algorithm

Moth–flame optimization (MFO) is a prominent SI algorithm inspired by the spiral locomotion of moths around a light source at night. This behavior derives from the moths' navigation mechanism, which lets them fly long distances in a straight line by maintaining a fixed inclination to the moon. However, this principled navigation mechanism turns into a deadly spiral path toward the light source when the light source is relatively close to the moths. Accordingly, the MFO algorithm consists of moths and flames. As shown in Equation (1), moths are considered search agents, organized in the matrix M(t), that explore the D-dimensional search space, where N is the number of moths.
M(t) = \begin{bmatrix} m_{1,1} & m_{1,2} & \cdots & m_{1,D} \\ m_{2,1} & m_{2,2} & \cdots & m_{2,D} \\ \vdots & \vdots & \ddots & \vdots \\ m_{N,1} & m_{N,2} & \cdots & m_{N,D} \end{bmatrix}  (1)
Additionally, the fitness value of each moth is stored in the array OM(t), as shown in Equation (2).
OM(t) = \begin{bmatrix} OM_1(t) \\ OM_2(t) \\ \vdots \\ OM_N(t) \end{bmatrix}  (2)
On the other hand, flames are the best positions discovered by the moths and are stored in a similar matrix F(t), along with their fitness values in an array OF(t). The moths move spirally around their corresponding flames, as shown in Equation (3), where Mi(t) is the position of the ith moth in the current iteration, Disi(t) is the distance between Mi and its corresponding jth flame (Fj), formulated in Equation (4), b defines the shape of the logarithmic spiral, and k is a random number in the interval [−1, 1].
M_i(t) = Dis_i(t) \times e^{bk} \times \cos(2\pi k) + F_j(t)  (3)
Dis_i(t) = \left| F_j(t) - M_i(t) \right|  (4)
To converge the algorithm and provide more exploitation, the number of flames decreases over the course of the iterations according to Equation (5), where t is the current iteration, and N and MaxIt denote the initial number of flames and the maximum number of iterations, respectively.
FlameNum(t) = \mathrm{round}\left( N - t \times \frac{N - 1}{MaxIt} \right)  (5)
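For readers who prefer code to notation, the following minimal Python sketch applies Equations (3)-(5) for a single iteration. It assumes NumPy; the function and variable names (mfo_step, moths, flames) are illustrative rather than taken from the authors' implementation, and the pairing of surplus moths with the last remaining flame follows the usual MFO convention and is an assumption here.

import numpy as np

def mfo_step(moths, flames, t, max_it, b=1.0):
    """One MFO iteration: each moth spirals around its paired flame (Equations (3)-(5))."""
    n, dim = moths.shape
    # Equation (5): the number of flames shrinks linearly with the iteration counter t
    flame_num = max(1, round(n - t * (n - 1) / max_it))
    new_moths = np.empty_like(moths)
    for i in range(n):
        j = min(i, flame_num - 1)              # surplus moths share the last flame
        dis = np.abs(flames[j] - moths[i])     # Equation (4): distance to the paired flame
        k = np.random.uniform(-1, 1, dim)      # random coefficient in [-1, 1]
        # Equation (3): logarithmic spiral movement around the flame
        new_moths[i] = dis * np.exp(b * k) * np.cos(2 * np.pi * k) + flames[j]
    return new_moths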

4. Proposed Algorithm

The MFO is a prominent population-based algorithm that has been successfully applied in different fields. However, based on the analysis reported in Section 5.2 and related studies [113,114,115], the MFO algorithm suffers from poor exploration and rapid loss of population diversity. As the number of flames decreases, the algorithm performs more local search over the course of the iterations. Hence, the algorithm is prone to sink into local optima, because the simple spiral movement of moths around their corresponding flames cannot offer further exploration to escape them. Therefore, this study proposes a migration-based moth–flame optimization (M-MFO) algorithm, which is a hybridization of the MFO algorithm and the crossover operator introduced in the GA. The M-MFO utilizes a guiding archive to maintain population diversity and a migration strategy that uses the crossover operator to boost exploration ability. The migration strategy introduces two operators, RM and GM, that take advantage of an adapted GA crossover. The RM operator is introduced to provide sufficient exploration during the early iterations, while the GM operator converges the population toward promising areas. Moreover, to maintain population diversity, a guiding archive is introduced, as outlined in Definition 1, to store lucky moths that have been improved by the migration strategy.
Definition 1 (guiding archive).
The guiding archive keeps the positions of lucky moths, i.e., moths improved by the migration strategy, in order to maintain population diversity and suppress premature convergence. Both the RM and GM operators add improved moths to the guiding archive, although only the GM operator exploits the archive. The guiding archive capacity (MaxArc) is formulated in Equation (6), where D and N are the problem dimension and the population size, respectively.
MaxArc = D \times [\ln N]  (6)
To ensure that the guiding archive is mature enough to guide other unlucky moths, the current size of the archive (δ) needs to reach the number of problem variables (D). This condition provides a dimensionally aware switch that chooses the right operator in the migration strategy. In addition, if adding a new moth would cause δ to exceed MaxArc, the new moth replaces a randomly selected member of the guiding archive.
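As a small illustration of Definition 1, the snippet below computes MaxArc from Equation (6) and the maturity test used by the dimensionally aware switch. It assumes that the square brackets in Equation (6) denote rounding ln N to the nearest integer; all names are illustrative.

import math

def max_archive_size(dim, pop_size):
    # Equation (6): MaxArc = D x [ln N], here interpreted as rounding ln N
    return dim * round(math.log(pop_size))

def archive_is_mature(archive, dim):
    # The GM operator is used only once the archive holds at least D lucky moths (Equation (7))
    return len(archive) >= dim

print(max_archive_size(dim=30, pop_size=100))   # 30 x 5 = 150 for D = 30, N = 100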
The migration strategy includes the RM and GM operators to ensure sufficient exploration capability and convergence toward promising zones. The RM operator provides additional exploration by changing the position of Mi stochastically, while the GM operator converges the population toward promising zones by exploiting the improved moths kept in the guiding archive. Moreover, the migration strategy benefits from a dimensionally aware switch between these two operators, as represented in Equation (7), where δ indicates the current size of the guiding archive. The pseudocode and the flowchart of the M-MFO are presented in Algorithm 1 and Figure 2, respectively.
M_i(t+1) = \begin{cases} \text{RM operator}, & \delta < D \\ \text{GM operator}, & \delta \geq D \end{cases}  (7)
Random migration (RM) operator: Let UnluckyMoths(t) = {M1, M2, …, Mi, …} be the finite set of unlucky moths, i.e., moths for which OMi(t) > OMi(t − 1). In this operator, the position of Mi is changed by crossing Mi with a randomly generated moth (Mr); the two act as parents in the crossover formulated in Equations (8) and (9), where α is a random number in [0, 1]. The crossover produces two offspring, the fitter of which is selected; if it dominates OMi(t), the position Mi(t + 1) is updated and added to the guiding archive. The RM operator satisfies the need for exploration by stochastically moving the unlucky moths to discover promising areas in the early iterations.
Offspring_1 = \alpha \times M_i + (1 - \alpha) \times M_r  (8)
Offspring_2 = \alpha \times M_r + (1 - \alpha) \times M_i  (9)
Guided migration (GM) operator: The GM operator is employed to change the position of an unlucky moth, Mi, when the size of the guiding archive reaches the number of problem variables. The GM operator changes the position of Mi by employing the crossover formulated in Equations (10) and (11), where LMr is a random lucky moth taken from the guiding archive. Similarly to the RM operator, if the fitter offspring obtains a better position than Mi(t), the position Mi(t + 1) is updated and appended to the guiding archive.
Offspring_1 = \alpha \times M_i + (1 - \alpha) \times LM_r  (10)
Offspring_2 = \alpha \times LM_r + (1 - \alpha) \times M_i  (11)
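The two operators share the same crossover form and differ only in the choice of partner, so they can be sketched together. The hedged Python sketch below illustrates Equations (7)-(11) for one unlucky moth; lucky_moths stands for the guiding archive, fitness is any objective to be minimized, and all names are illustrative rather than taken from the authors' code.

import numpy as np

def migrate(m_i, lucky_moths, dim, lower, upper, fitness):
    """Migration strategy for one unlucky moth (Equations (7)-(11))."""
    alpha = np.random.rand()                      # random number in [0, 1]
    if len(lucky_moths) < dim:
        # Immature archive: RM operator, the partner M_r is a randomly generated moth
        partner = np.random.uniform(lower, upper, m_i.shape)
    else:
        # Mature archive: GM operator, the partner LM_r is a random lucky moth
        partner = lucky_moths[np.random.randint(len(lucky_moths))]
    off1 = alpha * m_i + (1 - alpha) * partner    # Equations (8) / (10)
    off2 = alpha * partner + (1 - alpha) * m_i    # Equations (9) / (11)
    # The fitter offspring is the candidate new position M_i(t + 1)
    return off1 if fitness(off1) < fitness(off2) else off2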
Algorithm 1. The pseudocode of the proposed M-MFO algorithm.
Input: Max iterations (MaxIt), number of moths (N), max size of guiding archive (MaxArc), and dimension (D).
Output: The best flame position and its fitness value.
Begin
 Randomly distributing the N moths (matrix M) in the D-dimensional search space.
 Calculating moths’ fitness (OM).
Set t and δ = 1 //δ is the number of guiding archive members.
OF (t) ← sort OM (t).
F (t) ← sort M (t).
While t ≤ MaxIt
   Updating F and OF by the best N moths from F and current M.
   Updating FlameNum (t) using Equation (5).
   For i = 1: N
     Updating the position of Mi (t) using Equation (3) and computing the OMi (t).
     If OMi (t) > OMi (t − 1)
        τ ← Generating a random number in the interval [1, D].
       For j = 1: τ
          If δ < D (The guiding archive is still immature)
         Generating the next position Mi (t + 1) using RM operator and Equations (8) and (9).
         Else (The guiding archive is mature)
           Generating next position Mi (t + 1) using GM operator and Equations (10) and (11).
         End If
       End For
       If OMi (t + 1) < OMi (t)
          Updating position Mi (t) and adding Mi (t + 1) to guiding archive using Definition 1 and MaxArc.
       End If
     End If
   End For
   Updating the position and fitness value of the global best flame.
    t = t + 1.
 End while
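Putting the pieces together, the following compact Python sketch mirrors the structure of Algorithm 1. It is a minimal reading of the pseudocode, not the authors' code: boundary clipping, the flame update rule, and the collapse of the inner τ-loop into a single whole-vector crossover are simplifying assumptions.

import math
import numpy as np

def m_mfo(fitness, dim, lower, upper, n=100, max_it=500, b=1.0, seed=0):
    rng = np.random.default_rng(seed)
    moths = rng.uniform(lower, upper, (n, dim))
    om = np.array([fitness(m) for m in moths])
    order = np.argsort(om)
    flames, of = moths[order].copy(), om[order].copy()        # sorted best-so-far positions
    archive, max_arc = [], dim * round(math.log(n))           # guiding archive, Equation (6)

    for t in range(1, max_it + 1):
        flame_num = max(1, round(n - t * (n - 1) / max_it))   # Equation (5)
        prev_om = om.copy()
        for i in range(n):
            j = min(i, flame_num - 1)
            dis = np.abs(flames[j] - moths[i])                # Equation (4)
            k = rng.uniform(-1, 1, dim)
            moths[i] = np.clip(dis * np.exp(b * k) * np.cos(2 * np.pi * k)
                               + flames[j], lower, upper)     # Equation (3)
            om[i] = fitness(moths[i])
            if om[i] > prev_om[i]:                            # unlucky moth: apply the migration strategy
                alpha = rng.random()
                partner = (rng.uniform(lower, upper, dim) if len(archive) < dim
                           else archive[rng.integers(len(archive))])   # RM vs. GM, Equation (7)
                off1 = alpha * moths[i] + (1 - alpha) * partner        # Equations (8)/(10)
                off2 = alpha * partner + (1 - alpha) * moths[i]        # Equations (9)/(11)
                f1, f2 = fitness(off1), fitness(off2)
                cand, cand_fit = (off1, f1) if f1 < f2 else (off2, f2)
                if cand_fit < om[i]:                          # lucky migrant: keep it and archive it
                    moths[i], om[i] = cand, cand_fit
                    if len(archive) < max_arc:
                        archive.append(cand.copy())
                    else:
                        archive[rng.integers(max_arc)] = cand.copy()
        merged = np.vstack([flames, moths])                   # keep the n best flames found so far
        merged_fit = np.concatenate([of, om])
        order = np.argsort(merged_fit)[:n]
        flames, of = merged[order].copy(), merged_fit[order].copy()
    return flames[0], of[0]

# Illustrative usage on a simple sphere function
best_x, best_f = m_mfo(lambda x: float(np.sum(x ** 2)), dim=10,
                       lower=-100, upper=100, n=30, max_it=200)
print(best_f)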

5. Numerical Experiment and Analysis

In this section, the performance of the proposed M-MFO is evaluated on several benchmark functions. First, the population diversity and convergence behavior of the canonical MFO are analyzed on selected functions to provide useful information about the shortcomings of the MFO algorithm. Then, the MFO variants and the M-MFO are compared in dimension 30 to determine the top four MFO variants to participate in the next experiments. Following that, the performance of the M-MFO is compared with the following state-of-the-art swarm intelligence algorithms: KH [53], GWO [41], MFO [67], WOA [54], WCMFO [93], CMFO [94], HGSO [99], RGA-DX [50], ChOA [57], AOA [100], and ODSFMFO [98]. The parameters of the competitor algorithms are reported in Table 1.

5.1. Experimental Environment and Benchmark Functions

The M-MFO and competitor algorithms were implemented in Matlab 2020a. All experiments were performed 20 times independently on a laptop with an Intel Core i7-10750H CPU (2.60 GHz) and 24 GB of memory to ensure fair comparisons. In each run, the maximum number of iterations (MaxIt) was set to (D × 10^4)/N, where D is the problem dimension and N = 100 is the population size (e.g., MaxIt = 3000 for D = 30). In this study, the CEC 2018 benchmark functions [116] were used to evaluate the effectiveness of the proposed M-MFO. There are 29 test functions in the CEC 2018 benchmark suite, each with its own set of characteristics, and each was tested in dimensions 30, 50, and 100. These test functions can be classified into unimodal functions F1 and F3, multimodal functions F4–F10, hybrid functions F11–F20, and composition functions F21–F30. The experimental results tabulated in Table 2, Table 3, Table 4 and Table 5 and Table A1, Table A2, Table A3 and Table A4 in Appendix A are based on each algorithm's average and minimum fitness values, where bold values indicate the winning algorithm. Moreover, the symbols "W|T|L" in the last row of each table report the number of wins, ties, and losses for each algorithm.

5.2. Population Diversity Analysis

Maintaining population diversity plays a crucial role in metaheuristic algorithms during the optimization process, as low diversity among search agents can lead the algorithm to get stuck at a local optimum. In this experiment, the population diversity and convergence behavior of the MFO algorithm were comprehensively examined to discover the shortcomings of the MFO algorithm. The population diversity curves shown in Figure 3 were measured by the moment of inertia (Ic) [117], where Ic represents the spread of the individuals around their centroid, as given by Equation (12), and the centroid cj for j = 1, 2, …, D is calculated using Equation (13).
I_c = \sum_{j=1}^{D} \sum_{i=1}^{N} \left( M_j^i - c_j \right)^2  (12)
c_j = \frac{1}{N} \sum_{i=1}^{N} M_j^i  (13)
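The diversity measure is straightforward to compute; the short sketch below evaluates Equations (12) and (13) for a population stored as an N × D NumPy array (names are illustrative).

import numpy as np

def moment_of_inertia(moths):
    """Population diversity I_c: squared spread of the moths around their centroid."""
    centroid = moths.mean(axis=0)                    # c_j for j = 1..D, Equation (13)
    return float(np.sum((moths - centroid) ** 2))    # I_c, Equation (12)

# Illustrative usage: N = 100 moths in a D = 30 dimensional search space
population = np.random.uniform(-100, 100, size=(100, 30))
print(moment_of_inertia(population))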
In order to perform a fair analysis and develop a better understanding of how population diversity affects the optimization process, Figure 3 illustrates the population diversity and convergence curves of the MFO side by side. For unimodal function F1, the diversity fell while the optimal solution had not yet been met; hence, the algorithm remained stuck in a local optimum until the last iterations. For F3, the slope of the diversity loss was not as sharp as for F1, and the convergence trend continued until the last iterations; however, the population diversity was still low, and the convergence speed was too slow to reach a near-optimal solution within the available iterations. F4 and F7 represent the multimodal functions, in which the MFO could not maintain the population diversity, and the algorithm experienced premature convergence beginning in the early iterations. F13, F18, F22, and F27 were plotted as representatives of the hybrid and composition functions with similar diversity and convergence behaviors. The typical behavior was an effort by the algorithm to avoid the local optimum by increasing the population diversity; however, the simple movements of the search agents in MFO could not satisfy the need for exploration to escape the local optimum. To sum up, it can be concluded from the plots that the MFO loses its population diversity before reaching an optimal solution. This behavior was repeated for most of the functions, demonstrating the deficiency of the algorithm in maintaining population diversity.

5.3. Comparison of M-MFO with MFO Variants

To compare the M-MFO with more variants, the results of the M-MFO and seven other well-known variants of MFO, including LMFO [92], WCMFO [93], CMFO [94], CLSGMFO [95], LGCMFO [96], SMFO [97], and ODSFMFO [98], were assessed and tabulated in Table 2 and Table A1 in Appendix A, where the M-MFO outperformed its competitors in dimension 30. Then, the top four algorithms, including the proposed M-MFO, ODSFMFO, WCMFO, and CMFO, were selected for the main experiments, which include comparison results in dimensions 30, 50, and 100; convergence analysis; population diversity analysis; and the Friedman test.

5.4. Evaluation of Exploitation and Exploration

The exploitation and exploration abilities of the proposed M-MFO were evaluated on the unimodal and multimodal test functions, respectively. As unimodal functions, F1 and F3 have a single global optimum and are therefore suitable for assessing the exploitation ability of optimization algorithms. Based on the results for the unimodal functions reported in Table 3 and Table A2 in Appendix A, the M-MFO outperformed the competitors for dimensions 30, 50, and 100, particularly on test function F3, where the M-MFO found the global best solution. The main reason for this exploitation ability is the GM operator, which effectively exploits the improved moths kept in the guiding archive. To assess the exploration ability of the M-MFO, the multimodal test functions F4–F10 were considered, as multimodal functions have many local optima. The results on the multimodal test functions demonstrate that the M-MFO provides very competitive results compared to the other competitors, mainly because of the RM operator and the stochastic movement it employs to explore the landscape effectively in the early iterations.

5.5. Local Optima Avoidance Analysis

In this experiment, the local optima avoidance ability and the balance between exploration and exploitation of the M-MFO were investigated using the hybrid (F11–F20) and composition (F21–F30) functions with dimensions 30, 50, and 100. The related results, tabulated in Table 4, Table 5, Table A3 and Table A4 in Appendix A, prove that the M-MFO is very competitive with the other algorithms in approximating the global optima. The main reason is that the algorithm trades off exploration and exploitation by defining two operators: the RM operator, which stochastically moves the unlucky search agents across the search space, and the GM operator, which exploits the promising areas located by successful migrants. Additionally, a guiding archive is introduced to maintain the population diversity by storing the new solutions obtained by the migration strategy. Moreover, the dimensionally aware switch between the operators of the migration strategy guarantees a proper trade-off between exploration and exploitation.

5.6. The Overall Effectiveness of M-MFO

This study evaluated the overall effectiveness (OE) of the M-MFO and contender algorithms based on the results reported in Table 3, Table 4 and Table 5 and Table A2, Table A3 and Table A4 in Appendix A. The OE results reported in Table 6 were calculated using Equation (14), where N indicates the total number of test functions and L is the number of losing tests for each algorithm. The results prove that the M-MFO, with an overall effectiveness of 91%, is the most effective algorithm across dimensions 30, 50, and 100.
OE\,(\%) = \frac{N - L}{N} \times 100  (14)
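Equation (14) is a simple ratio; the snippet below shows the calculation with illustrative numbers that are not taken from the tables.

def overall_effectiveness(total_cases, losses):
    # Equation (14): share of test cases in which the algorithm did not lose
    return (total_cases - losses) / total_cases * 100

# Illustrative example: 29 test functions, 3 losses
print(f"{overall_effectiveness(29, 3):.1f}%")   # 89.7%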

5.7. Convergence Rate Analysis

In this set of experiments, the convergence properties of the M-MFO were examined and the results were compared with the contender algorithms for dimensions 30, 50, and 100. Figure 4 illustrates the convergence curves of the average fitness values obtained by each algorithm on unimodal, multimodal, hybrid, and composition test functions. The first row shows the convergence behavior of the algorithms on F3. The M-MFO hit the global optimum in the early iterations for all dimensions, which proves the sufficient exploitation ability of the M-MFO. In contrast, the convergence trends of the other algorithms were hampered by local minima or exhibited a slow convergence rate. For multimodal function F10, the M-MFO provided the best solution among the competitors in the early iterations due to the exploration ability derived from the migration strategy. The third and fourth rows present the convergence on the hybrid functions. The M-MFO bypassed the local optima and continued its gradual trend toward near-optimal solutions until the final iterations by striking a balance between exploration and exploitation. The convergence curves of the composition function illustrated in the last row demonstrate that the M-MFO obtained the best solution among the competitors in the early iterations. To sum up, the plots show that the M-MFO is superior to the other algorithms and provides sufficient exploitation, exploration, and balance between these two tendencies. In addition, it can be noticed that the M-MFO offered more consistent results as the number of problem variables increased.

5.8. Population Diversity Analysis

As mentioned in Section 5.2, adequate population diversity can keep the algorithm from local optima trapping. Therefore, in this section, the population diversity of the M-MFO and the other competitors is presented in Figure 5. Comparing the population diversity with the convergence curves provided in the previous section shows that the M-MFO effectively maintained its population diversity until a promising area was met for the different test functions with dimensions 30, 50, and 100. Furthermore, the plots suggest that the M-MFO remains robust in maintaining population diversity as the number of problem variables increases, mainly because of its dimensionally aware switch between operators.

5.9. Sensitivity Analysis on the Guiding Archive Maturity Size

As discussed in Definition 1, the M-MFO algorithm switches from the RM to the GM operator when the guiding archive has matured. Hence, in this experiment, the impact of different maturity sizes of the guiding archive was evaluated and is discussed in relation to four different scenarios. Table 7 shows the fitness values gained in each scenario on functions F1, F9, F17, and F30 for dimensions 30, 50, and 100.
The reported results indicate that the fourth scenario (δ = D) provided better results overall on the tested functions compared to the other scenarios. Nevertheless, it can be noticed that the other scenarios provided competitive results, especially for the unimodal functions. The main reason is that a higher maturity size provides more population diversity and exploration ability, while the need for exploitation is more pronounced for unimodal functions. Moreover, previous studies [47,52] have shown that increasing the dimension has a negative impact on the effectiveness and scalability of metaheuristic algorithms. Hence, using the number of problem variables as the maturity condition (δ = D) provides dimensional robustness for the algorithm. Furthermore, considering D as the maturity size does not add any additional parameters to the algorithm and acts as a self-tuning parameter across dimensions.

6. Statistical Analysis

In this experiment, the Friedman test [118] was conducted to statistically prove the superiority of M-MFO by ranking the algorithms based on their performance on CEC 2018 benchmark functions. Table 8 illustrates the obtained results for unimodal and multimodal test functions. In addition, hybrid and composition functions have been tabulated in Table 9. Inspecting the overall rank of the Friedman test, it is evident that the M-MFO demonstrated superior performance in comparison to contender algorithms for dimensions of 30, 50, and 100.
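For readers who want to reproduce this kind of analysis, SciPy provides the Friedman test directly. The sketch below uses hypothetical error values for three algorithms over six functions; the per-algorithm mean ranks reported in Tables 8 and 9 would be derived from the same rank matrix.

from scipy.stats import friedmanchisquare

# Hypothetical mean errors of three algorithms on six benchmark functions
alg_a = [1.2e3, 3.4e2, 5.1e3, 2.2e3, 9.0e2, 4.4e3]
alg_b = [1.5e3, 3.9e2, 6.3e3, 2.9e3, 9.8e2, 5.0e3]
alg_c = [1.1e3, 3.1e2, 4.8e3, 2.0e3, 8.7e2, 4.1e3]

statistic, p_value = friedmanchisquare(alg_a, alg_b, alg_c)
print(statistic, p_value)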
In Figure 6 and Figure 7, the M-MFO and its competitors are visually ranked based on their performance on the CEC 2018 benchmark suite for various dimensions. Figure 6 depicts the ranking results of the algorithms in solving the CEC 2018 benchmark functions, expressed through a radar graph, while the clustered bar chart of the Friedman test average results is shown in Figure 7. The radar graph demonstrates that the M-MFO surpassed the other algorithms on the majority of test functions for the various dimensions. The clustered bar chart shows that the M-MFO achieved the best rank among the competitors, since it has the shortest bar in each of the dimensions 30, 50, and 100.

7. Conclusions

The MFO is a successful metaheuristic algorithm inspired by the behavior of moths converging on a light source at night. The MFO has been used in various real-world optimization problems in recent years, mainly due to its simple structure. However, as the experiments revealed, the canonical MFO algorithm experiences local optima trapping and premature convergence due to the rapid loss of population diversity and poor exploration. Hence, the M-MFO algorithm is proposed to overcome MFO's shortcomings by introducing a migration strategy that includes two new operators to boost exploration ability and maintain population diversity.
The performance of the M-MFO was experimentally evaluated on the CEC 2018 benchmark functions in dimension 30 and compared with seven recent variants of MFO, including LMFO, WCMFO, CMFO, CLSGMFO, LGCMFO, SMFO, and ODSFMFO. The top four latest high-performing variants and eight other state-of-the-art swarm intelligence algorithms were then considered for experiments in dimensions 30, 50, and 100. The M-MFO stood out among the competitors by providing highly competitive results and maintaining robustness as the number of problem variables increased. In addition, to rank the algorithms, the M-MFO and its competitors were analyzed statistically by the Friedman test, in which the M-MFO obtained the first rank. For future work, the migration strategy and guiding archive could serve as a reference for handling low population diversity and inefficient exploration in other metaheuristic algorithms. Moreover, the M-MFO can be used to solve engineering design problems, and it can be adapted to discrete optimization problems, such as feature selection, data mining, and image segmentation.

Author Contributions

Conceptualization, M.H.N.-S.; methodology, M.H.N.-S. and A.F.; software, M.H.N.-S., A.F. and H.Z.; validation, M.H.N.-S. and H.Z.; formal analysis, M.H.N.-S., A.F. and H.Z.; investigation, M.H.N.-S., A.F. and H.Z.; resources, M.H.N.-S., S.M., L.A. and M.A.E.; data curation, M.H.N.-S., A.F. and H.Z.; writing, M.H.N.-S., A.F. and H.Z.; original draft preparation, M.H.N.-S., A.F. and H.Z.; writing—review and editing, M.H.N.-S., A.F., H.Z., S.M., L.A. and M.A.E.; visualization, M.H.N.-S., A.F. and H.Z.; supervision, M.H.N.-S. and S.M.; project administration, M.H.N.-S. and S.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data and code used in the research may be obtained from the corresponding author upon request.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1 provides the detailed results of the proposed M-MFO algorithm and other variants of the MFO for solving the CEC 2018 benchmark functions in dimension 30. Furthermore, the detailed results of the proposed M-MFO and contender algorithms for the unimodal, multimodal, hybrid, and composition functions of the CEC 2018 benchmark suite in dimensions 30, 50, and 100 are reported in Table A2, Table A3 and Table A4.
Table A1. Comparison results of MFO variants.
F | D | Metrics | LMFO (2016) | WCMFO (2019) | CMFO (2019) | CLSGMFO (2019) | LGCMFO (2019) | SMFO (2021) | ODSFMFO (2021) | M-MFO
F130Avg2.402 × 1071.328 × 1041.878 × 1089.430 × 1084.532 × 1083.091 × 10107.519 × 1061.660 × 103
Min1.731 × 1071.214 × 1022.464 × 1061.923 × 1089.606 × 1072.010 × 10101.570 × 1061.002 × 102
F330Avg2.786 × 1031.887 × 1034.825 × 1045.191 × 1044.491 × 1048.189 × 1042.875 × 1043.006 × 102
Min1.424 × 1033.092 × 1022.643 × 1043.418 × 1043.166 × 1047.186 × 1041.359 × 1043.000 × 102
F430Avg5.335 × 1024.886 × 1026.536 × 1026.009 × 1025.829 × 1025.977 × 1035.335 × 1024.247 × 102
Min4.755 × 1024.239 × 1025.311 × 1025.259 × 1024.890 × 1023.030 × 1034.722 × 1024.001 × 102
F530Avg6.470 × 1026.744 × 1026.150 × 1026.587 × 1026.468 × 1028.721 × 1025.526 × 1025.136 × 102
Min5.707 × 1026.104 × 1025.709 × 1026.140 × 1026.040 × 1028.041 × 1025.270 × 1025.070 × 102
F630Avg6.038 × 1026.225 × 1026.179 × 1026.285 × 1026.208 × 1026.830 × 1026.037 × 1026.000 × 102
Min6.017 × 1026.096 × 1026.116 × 1026.117 × 1026.117 × 1026.615 × 1026.010 × 1026.000 × 102
F730Avg8.986 × 1028.986 × 1029.041 × 1029.291 × 1029.167 × 1021.349 × 1038.151 × 1027.446 × 102
Min8.438 × 1028.402 × 1028.306 × 1028.555 × 1028.532 × 1021.175 × 1037.824 × 1027.363 × 102
F830Avg9.379 × 1029.841 × 1029.127 × 1029.350 × 1029.273 × 1021.096 × 1038.460 × 1028.141 × 102
Min8.797 × 1029.344 × 1028.679 × 1028.859 × 1028.940 × 1021.058 × 1038.262 × 1028.070 × 102
F930Avg1.064 × 1038.747 × 1032.277 × 1033.984 × 1033.275 × 1039.591 × 1031.062 × 1039.005 × 102
Min9.456 × 1025.118 × 1031.626 × 1032.051 × 1032.089 × 1037.754 × 1039.647 × 1029.000 × 102
F1030Avg4.422 × 1034.808 × 1034.967 × 1035.252 × 1034.970 × 1038.363 × 1034.221 × 1031.958 × 103
Min3.149 × 1033.333 × 1033.912 × 1034.057 × 1033.939 × 1037.473 × 1033.066 × 1031.348 × 103
F1130Avg1.292 × 1031.336 × 1032.126 × 1031.784 × 1031.512 × 1035.265 × 1031.285 × 1031.122 × 103
Min1.177 × 1031.255 × 1031.360 × 1031.334 × 1031.275 × 1032.547 × 1031.210 × 1031.105 × 103
F1230Avg5.460 × 1061.254 × 1064.296 × 1071.932 × 1071.814 × 1074.462 × 1092.297 × 1067.118 × 104
Min1.046 × 1063.718 × 1048.503 × 1053.033 × 1064.012 × 1061.749 × 1092.010 × 1052.650 × 104
F1330Avg4.494 × 1051.047 × 1056.841 × 1032.837 × 1051.553 × 1058.738 × 1089.819 × 1031.116 × 104
Min2.705 × 1051.436 × 1042.860 × 1034.152 × 1042.770 × 1042.189 × 1081.596 × 1035.053 × 103
F1430Avg2.614 × 1042.074 × 1047.665 × 1043.506 × 1052.108 × 1051.548 × 1065.267 × 1046.136 × 103
Min2.724 × 1036.252 × 1032.082 × 1038.610 × 1031.008 × 1047.879 × 1044.686 × 1031.533 × 103
F1530Avg9.006 × 1043.448 × 1045.245 × 1031.596 × 1041.131 × 1044.469 × 1076.098 × 1032.252 × 103
Min5.003 × 1042.547 × 1031.693 × 1033.298 × 1032.834 × 1032.375 × 1061.703 × 1031.508 × 103
F1630Avg2.640 × 1032.807 × 1032.581 × 1032.763 × 1032.736 × 1034.402 × 1032.334 × 1031.774 × 103
Min2.110 × 1032.095 × 1032.009 × 1032.046 × 1032.025 × 1033.607 × 1031.949 × 1031.602 × 103
F1730Avg2.203 × 1032.315 × 1032.050 × 1032.125 × 1032.076 × 1032.752 × 1031.937 × 1031.738 × 103
Min1.801 × 1031.942 × 1031.805 × 1031.864 × 1031.762 × 1032.359 × 1031.771 × 1031.703 × 103
F1830Avg3.682 × 1051.734 × 1054.761 × 1051.747 × 1061.130 × 1062.844 × 1079.249 × 1059.790 × 104
Min8.629 × 1043.793 × 1043.856 × 1041.058 × 1058.154 × 1042.007 × 1069.952 × 1045.073 × 104
F1930Avg6.193 × 1043.223 × 1041.816 × 1041.949 × 1041.084 × 1041.072 × 1089.241 × 1036.433 × 103
Min1.764 × 1042.168 × 1032.460 × 1032.390 × 1032.973 × 1035.192 × 1061.968 × 1031.946 × 103
F2030Avg2.498 × 1032.468 × 1032.494 × 1032.496 × 1032.333 × 1032.847 × 1032.306 × 1032.128 × 103
Min2.180 × 1032.073 × 1032.162 × 1032.069 × 1032.132 × 1032.454 × 1032.053 × 1032.028 × 103
F2130Avg2.439 × 1032.493 × 1032.396 × 1032.421 × 1032.420 × 1032.653 × 1032.352 × 1032.312 × 103
Min2.378 × 1032.398 × 1032.347 × 1032.386 × 1032.375 × 1032.552 × 1032.331 × 1032.303 × 103
F2230Avg5.006 × 1036.637 × 1032.373 × 1032.507 × 1032.492 × 1038.654 × 1032.318 × 1032.300 × 103
Min2.325 × 1035.330 × 1032.315 × 1032.371 × 1032.358 × 1035.677 × 1032.307 × 1032.300 × 103
F2330Avg2.786 × 1032.785 × 1032.828 × 1032.795 × 1032.787 × 1033.283 × 1032.718 × 1032.662 × 103
Min2.710 × 1032.721 × 1032.764 × 1032.745 × 1032.707 × 1033.027 × 1032.699 × 1032.647 × 103
F2430Avg2.928 × 1032.978 × 1032.928 × 1032.952 × 1032.959 × 1033.433 × 1032.871 × 1032.827 × 103
Min2.898 × 1032.928 × 1032.877 × 1032.878 × 1032.920 × 1033.217 × 1032.848 × 1032.820 × 103
F2530Avg2.889 × 1032.894 × 1033.004 × 1032.980 × 1033.005 × 1033.940 × 1032.928 × 1032.888 × 103
Min2.888 × 1032.884 × 1032.933 × 1032.939 × 1032.920 × 1033.463 × 1032.890 × 1032.887 × 103
F2630Avg5.012 × 1035.447 × 1034.227 × 1034.838 × 1034.163 × 1038.871 × 1034.425 × 1033.408 × 103
Min4.607 × 1034.955 × 1032.936 × 1033.514 × 1033.241 × 1035.057 × 1032.876 × 1032.800 × 103
F2730Avg3.241 × 1033.228 × 1033.257 × 1033.286 × 1033.275 × 1033.688 × 1033.230 × 1033.221 × 103
Min3.200 × 1033.201 × 1033.232 × 1033.224 × 1033.218 × 1033.397 × 1033.222 × 1033.210 × 103
F2830Avg3.255 × 1033.194 × 1033.444 × 1033.451 × 1033.290 × 1035.524 × 1033.295 × 1033.110 × 103
Min3.210 × 1033.100 × 1033.247 × 1033.285 × 1033.268 × 1034.419 × 1033.251 × 1033.100 × 103
F2930Avg3.785 × 1033.965 × 1034.050 × 1033.976 × 1033.872 × 1035.698 × 1033.669 × 1033.319 × 103
Min3.596 × 1033.650 × 1033.631 × 1033.601 × 1033.556 × 1034.728 × 1033.475 × 1033.312 × 103
F3030Avg1.579 × 1052.812 × 1048.574 × 1053.989 × 1053.406 × 1053.278 × 1081.629 × 1046.645 × 103
Min4.934 × 1041.582 × 1047.507 × 1043.747 × 1044.618 × 1043.212 × 1077.769 × 1036.062 × 103
Summary (W|T|L): LMFO 0|0|29, WCMFO 0|0|29, CMFO 1|0|28, CLSGMFO 0|0|29, LGCMFO 0|0|29, SMFO 0|0|29, ODSFMFO 0|0|29, M-MFO 28|0|1
Table A2. Results of the comparative algorithms on unimodal and multimodal test functions.
F | D | Metrics | KH (2012) | GWO (2014) | MFO (2015) | WOA (2016) | WCMFO (2019) | CMFO (2019) | HGSO (2019) | RGA-DX (2019) | ChOA (2020) | AOA (2021) | ODSFMFO (2021) | M-MFO
F130Avg1.371 × 1048.223 × 1086.952 × 1091.906 × 1061.328 × 1045.824 × 1071.455 × 10102.575 × 1032.395 × 10104.015 × 10107.519 × 1061.660 × 103
Min3.462 × 1034.405 × 1071.027 × 1095.654 × 1051.214 × 1022.553 × 1067.442 × 1091.272 × 1021.123 × 10103.092 × 10101.570 × 1061.002 × 102
50Avg1.954 × 1054.523 × 1093.099 × 10107.172 × 1062.826 × 1041.046 × 1093.844 × 10103.059 × 1034.417 × 10101.003 × 10113.066 × 1081.466 × 103
Min4.342 × 1041.231 × 1097.095 × 1091.980 × 1066.883 × 1022.822 × 1082.159 × 10101.327 × 1023.506 × 10108.424 × 10109.629 × 1071.001 × 102
100Avg5.646 × 1073.207 × 10101.173 × 10113.677 × 1072.017 × 1053.803 × 1091.643 × 10114.575 × 1031.463 × 10112.629 × 10115.316 × 1094.465 × 103
Min2.550 × 1061.634 × 10106.748 × 10101.409 × 1071.093 × 1041.832 × 1091.299 × 10111.587 × 1021.282 × 10112.350 × 10111.894 × 1091.032 × 103
F330Avg4.863 × 1042.993 × 1041.009 × 1051.715 × 1051.887 × 1034.248 × 1043.687 × 1047.905 × 1035.178 × 1047.318 × 1042.875 × 1043.006 × 102
Min2.979 × 1041.576 × 1041.920 × 1038.481 × 1043.092 × 1023.385 × 1042.335 × 1049.933 × 1023.954 × 1045.445 × 1041.359 × 1043.000 × 102
50Avg1.216 × 1057.147 × 1041.650 × 1056.180 × 1041.150 × 1049.610 × 1041.363 × 1052.712 × 1041.309 × 1051.625 × 1059.729 × 1043.000 × 102
Min6.121 × 1043.628 × 1041.176 × 1043.098 × 1047.428 × 1026.899 × 1041.050 × 1051.139 × 1041.006 × 1051.249 × 1056.538 × 1043.000 × 102
100Avg3.477 × 1052.023 × 1054.556 × 1055.928 × 1057.361 × 1042.317 × 1052.945 × 1051.387 × 1053.065 × 1053.325 × 1053.252 × 1053.000 × 102
Min2.569 × 1051.595 × 1051.191 × 1053.355 × 1053.430 × 1042.058 × 1052.601 × 1057.593 × 1042.849 × 1053.027 × 1052.456 × 1053.000 × 102
F430Avg4.963 × 1025.441 × 1029.082 × 1025.476 × 1024.886 × 1025.631 × 1022.171 × 1034.896 × 1022.545 × 1038.649 × 1035.335 × 1024.247 × 102
Min4.043 × 1024.963 × 1025.424 × 1024.995 × 1024.239 × 1024.759 × 1021.194 × 1034.040 × 1021.134 × 1033.825 × 1034.722 × 1024.001 × 102
50Avg5.683 × 1028.767 × 1024.098 × 1036.676 × 1025.493 × 1021.069 × 1038.889 × 1035.095 × 1029.023 × 1032.568 × 1047.414 × 1024.872 × 102
Min4.996 × 1026.745 × 1021.216 × 1035.138 × 1024.849 × 1026.394 × 1025.286 × 1034.285 × 1025.017 × 1031.686 × 1046.237 × 1024.092 × 102
100Avg7.431 × 1022.813 × 1032.348 × 1049.992 × 1026.423 × 1022.354 × 1032.840 × 1046.436 × 1022.822 × 1047.733 × 1041.400 × 1035.378 × 102
Min6.443 × 1021.870 × 1036.743 × 1038.615 × 1025.980 × 1021.123 × 1031.677 × 1045.671 × 1022.145 × 1046.186 × 1041.103 × 1034.753 × 102
F530Avg6.363 × 1025.855 × 1026.894 × 1028.044 × 1026.744 × 1025.957 × 1028.061 × 1025.430 × 1027.905 × 1027.873 × 1025.526 × 1025.136 × 102
Min5.936 × 1025.508 × 1026.280 × 1027.242 × 1026.104 × 1025.678 × 1027.824 × 1025.259 × 1027.471 × 1027.217 × 1025.270 × 1025.070 × 102
50Avg7.659 × 1026.892 × 1028.934 × 1029.209 × 1028.940 × 1028.095 × 1021.049 × 1036.004 × 1021.043 × 1031.074 × 1036.212 × 1025.296 × 102
Min7.050 × 1026.379 × 1027.731 × 1028.081 × 1027.743 × 1027.138 × 1029.990 × 1025.677 × 1029.853 × 1029.951 × 1025.630 × 1025.179 × 102
100Avg1.216 × 1031.058 × 1031.666 × 1031.413 × 1031.726 × 1031.297 × 1031.824 × 1037.916 × 1021.787 × 1031.960 × 1038.658 × 1025.564 × 102
Min1.054 × 1039.864 × 1021.455 × 1031.329 × 1031.328 × 1031.154 × 1031.701 × 1037.259 × 1021.743 × 1031.842 × 1037.800 × 1025.368 × 102
F630Avg6.428 × 1026.043 × 1026.267 × 1026.671 × 1026.225 × 1026.166 × 1026.655 × 1026.000 × 1026.603 × 1026.654 × 1026.037 × 1026.000 × 102
Min6.175 × 1026.011 × 1026.144 × 1026.410 × 1026.096 × 1026.078 × 1026.511 × 1026.000 × 1026.537 × 1026.566 × 1026.010 × 1026.000 × 102
50Avg6.515 × 1026.105 × 1026.437 × 1026.760 × 1026.400 × 1026.355 × 1026.813 × 1026.001 × 1026.710 × 1026.837 × 1026.081 × 1026.000 × 102
Min6.440 × 1026.052 × 1026.270 × 1026.638 × 1026.165 × 1026.257 × 1026.724 × 1026.000 × 1026.608 × 1026.747 × 1026.041 × 1026.000 × 102
100Avg6.587 × 1026.276 × 1026.648 × 1026.768 × 1026.664 × 1026.528 × 1026.936 × 1026.001 × 1026.860 × 1027.029 × 1026.184 × 1026.000 × 102
Min6.527 × 1026.229 × 1026.467 × 1026.676 × 1026.526 × 1026.418 × 1026.867 × 1026.000 × 1026.761 × 1026.970 × 1026.133 × 1026.000 × 102
F730Avg8.280 × 1028.418 × 1021.011 × 1031.238 × 1038.986 × 1028.989 × 1021.080 × 1037.801 × 1021.187 × 1031.302 × 1038.151 × 1027.446 × 102
Min7.853 × 1027.801 × 1028.671 × 1021.089 × 1038.402 × 1028.415 × 1021.032 × 1037.586 × 1021.063 × 1031.154 × 1037.824 × 1027.363 × 102
50Avg1.070 × 1031.016 × 1031.701 × 1031.684 × 1031.141 × 1031.224 × 1031.530 × 1038.481 × 1021.663 × 1031.862 × 1039.911 × 1027.684 × 102
Min9.625 × 1029.654 × 1021.113 × 1031.500 × 1031.020 × 1031.021 × 1031.333 × 1038.062 × 1021.464 × 1031.744 × 1039.354 × 1027.588 × 102
100Avg2.118 × 1031.710 × 1034.169 × 1033.250 × 1031.988 × 1032.421 × 1033.184 × 1031.129 × 1033.326 × 1033.694 × 1031.619 × 1038.363 × 102
Min1.819 × 1031.542 × 1032.576 × 1032.814 × 1031.531 × 1032.103 × 1032.813 × 1039.899 × 1023.182 × 1033.580 × 1031.416 × 1038.161 × 102
F830Avg9.196 × 1028.713 × 1029.790 × 1021.000 × 1039.841 × 1028.945 × 1021.051 × 1038.434 × 1021.031 × 1031.041 × 1038.460 × 1028.141 × 102
Min8.707 × 1028.435 × 1028.938 × 1029.488 × 1029.344 × 1028.574 × 1021.033 × 1038.249 × 1029.726 × 1021.002 × 1038.262 × 1028.070 × 102
50Avg1.065 × 1039.792 × 1021.229 × 1031.249 × 1031.213 × 1031.055 × 1031.369 × 1039.002 × 1021.305 × 1031.425 × 1039.168 × 1028.315 × 102
Min1.019 × 1039.384 × 1021.118 × 1031.132 × 1031.087 × 1039.831 × 1021.308 × 1038.567 × 1021.251 × 1031.339 × 1038.625 × 1028.199 × 102
100Avg1.576 × 1031.397 × 1031.968 × 1031.897 × 1032.026 × 1031.533 × 1032.240 × 1031.063 × 1032.151 × 1032.414 × 1031.193 × 1038.694 × 102
Min1.465 × 1031.225 × 1031.717 × 1031.716 × 1031.756 × 1031.410 × 1032.093 × 1039.900 × 1022.052 × 1032.248 × 1031.122 × 1038.398 × 102
F930Avg3.059 × 1031.384 × 1036.278 × 1037.233 × 1038.747 × 1031.893 × 1035.814 × 1039.064 × 1026.551 × 1035.578 × 1031.062 × 1039.005 × 102
Min1.768 × 1031.025 × 1034.471 × 1034.425 × 1035.118 × 1031.554 × 1033.388 × 1039.009 × 1025.576 × 1034.101 × 1039.647 × 1029.000 × 102
50Avg9.536 × 1034.571 × 1031.644 × 1041.783 × 1042.195 × 1047.504 × 1032.616 × 1049.773 × 1022.577 × 1042.294 × 1041.750 × 1039.045 × 102
Min6.223 × 1032.135 × 1038.748 × 1031.187 × 1041.190 × 1043.715 × 1032.123 × 1049.213 × 1021.969 × 1041.804 × 1041.299 × 1039.007 × 102
100Avg2.251 × 1042.638 × 1044.508 × 1043.820 × 1045.208 × 1042.315 × 1046.515 × 1042.428 × 1036.876 × 1045.410 × 1044.877 × 1039.454 × 102
Min1.965 × 1041.102 × 1043.679 × 1042.557 × 1043.986 × 1041.973 × 1045.587 × 1041.304 × 1035.806 × 1044.674 × 1043.620 × 1039.174 × 102
F1030Avg4.876 × 1033.909 × 1035.130 × 1036.156 × 1034.808 × 1034.728 × 1036.636 × 1033.700 × 1037.996 × 1036.444 × 1034.221 × 1031.958 × 103
Min3.664 × 1032.718 × 1033.575 × 1034.506 × 1033.333 × 1033.522 × 1035.706 × 1032.875 × 1037.199 × 1035.410 × 1033.066 × 1031.348 × 103
50Avg8.127 × 1036.428 × 1038.566 × 1039.478 × 1037.956 × 1037.769 × 1031.242 × 1045.949 × 1031.427 × 1041.216 × 1047.662 × 1032.391 × 103
Min6.288 × 1034.582 × 1036.288 × 1036.969 × 1036.204 × 1036.035 × 1031.100 × 1044.819 × 1031.301 × 1041.073 × 1045.766 × 1031.246 × 103
100Avg1.549 × 1041.498 × 1041.728 × 1042.012 × 1041.618 × 1041.578 × 1042.579 × 1041.361 × 1043.140 × 1042.787 × 1041.733 × 1044.814 × 103
Min1.267 × 1041.141 × 1041.417 × 1041.687 × 1041.147 × 1041.335 × 1042.440 × 1041.115 × 1043.051 × 1042.582 × 1041.468 × 1042.834 × 103
Summary (W|T|L), D = 30: KH 0|0|9, GWO 0|0|9, MFO 0|0|9, WOA 0|0|9, WCMFO 0|0|9, CMFO 0|0|9, HGSO 0|0|9, RGA-DX 0|1|8, ChOA 0|0|9, AOA 0|0|9, ODSFMFO 0|0|9, M-MFO 8|1|0
Summary (W|T|L), D = 50: KH 0|0|9, GWO 0|0|9, MFO 0|0|9, WOA 0|0|9, WCMFO 0|0|9, CMFO 0|0|9, HGSO 0|0|9, RGA-DX 0|0|9, ChOA 0|0|9, AOA 0|0|9, ODSFMFO 0|0|9, M-MFO 9|0|0
Summary (W|T|L), D = 100: KH 0|0|9, GWO 0|0|9, MFO 0|0|9, WOA 0|0|9, WCMFO 0|0|9, CMFO 0|0|9, HGSO 0|0|9, RGA-DX 0|0|9, ChOA 0|0|9, AOA 0|0|9, ODSFMFO 0|0|9, M-MFO 9|0|0
Table A3. Results of the comparative algorithms on hybrid test functions.
F | D | Metrics | KH (2012) | GWO (2014) | MFO (2015) | WOA (2016) | WCMFO (2019) | CMFO (2019) | HGSO (2019) | RGA-DX (2019) | ChOA (2020) | AOA (2021) | ODSFMFO (2021) | M-MFO
F1130Avg1.514 × 1031.406 × 1033.749 × 1031.462 × 1031.336 × 1032.126 × 1032.811 × 1031.138 × 1033.267 × 1033.249 × 1031.285 × 1031.122 × 103
Min1.262 × 1031.271 × 1031.363 × 1031.282 × 1031.255 × 1031.360 × 1031.707 × 1031.109 × 1031.731 × 1031.797 × 1031.210 × 1031.105 × 103
50Avg4.926 × 1033.078 × 1037.297 × 1031.591 × 1031.491 × 1031.984 × 1035.800 × 1031.251 × 1038.848 × 1031.587 × 1041.725 × 1031.128 × 103
Min2.518 × 1031.480 × 1031.574 × 1031.421 × 1031.344 × 1031.278 × 1033.881 × 1031.129 × 1036.441 × 1039.287 × 1031.394 × 1031.123 × 103
100Avg7.658 × 1043.531 × 1041.257 × 1057.762 × 1032.191 × 1034.024 × 1041.281 × 1051.033 × 1047.093 × 1041.631 × 1053.043 × 1041.195 × 103
Min3.912 × 1041.647 × 1042.137 × 1044.463 × 1031.841 × 1032.012 × 1041.128 × 1052.076 × 1036.100 × 1041.268 × 1051.322 × 1041.127 × 103
F1230Avg3.051 × 1063.900 × 1076.158 × 1073.770 × 1071.254 × 1064.296 × 1071.343 × 1097.608 × 1053.594 × 1097.204 × 1092.297 × 1067.118 × 104
Min1.788 × 1052.109 × 1067.306 × 1042.509 × 1063.718 × 1048.503 × 1056.717 × 1081.016 × 1056.620 × 1083.034 × 1092.010 × 1052.650 × 104
50Avg1.249 × 1074.764 × 1082.475 × 1091.861 × 1087.229 × 1061.684 × 1081.550 × 10101.983 × 1061.896 × 10105.311 × 10101.951 × 1073.292 × 105
Min1.930 × 1067.558 × 1071.646 × 1075.114 × 1071.549 × 1066.965 × 1069.677 × 1095.848 × 1051.045 × 10102.948 × 10107.129 × 1061.293 × 105
100Avg6.902 × 1074.919 × 1093.523 × 10106.875 × 1083.428 × 1071.982 × 1095.643 × 10103.019 × 1066.718 × 10101.822 × 10114.254 × 1082.834 × 103
Min2.478 × 1071.450 × 1091.435 × 10102.918 × 1083.806 × 1069.911 × 1073.888 × 10101.129 × 1064.928 × 10101.296 × 10111.525 × 1082.195 × 105
F1330Avg3.536 × 1048.368 × 1057.958 × 1061.463 × 1051.047 × 1056.841 × 1034.947 × 1081.173 × 1048.944 × 1084.457 × 1049.819 × 1031.116 × 104
Min1.619 × 1041.991 × 1041.122 × 1042.283 × 1041.436 × 1042.860 × 1031.781 × 1081.376 × 1035.646 × 1072.158 × 1041.596 × 1035.053 × 103
50Avg4.578 × 1041.532 × 1082.428 × 1081.657 × 1058.895 × 1042.085 × 1042.604 × 1094.464 × 1036.036 × 1094.764 × 1091.952 × 1042.083 × 103
Min2.381 × 1041.312 × 1051.136 × 1054.764 × 1042.174 × 1046.621 × 1031.204 × 1091.455 × 1038.730 × 1081.041 × 1079.648 × 1031.317 × 103
100Avg3.464 × 1044.163 × 1084.053 × 1098.423 × 1041.378 × 1051.269 × 1069.302 × 1095.906 × 1031.894 × 10103.573 × 10105.455 × 1043.236 × 103
Min2.377 × 1041.579 × 1062.629 × 1083.701 × 1043.658 × 1041.538 × 1045.247 × 1091.409 × 1031.203 × 10102.155 × 10101.136 × 1041.611 × 103
F1430Avg5.166 × 1051.438 × 1058.969 × 1049.075 × 1052.074 × 1047.665 × 1043.822 × 1051.064 × 1053.622 × 1054.148 × 1045.267 × 1046.136 × 103
Min1.184 × 1043.679 × 1032.197 × 1031.364 × 1056.252 × 1032.082 × 1038.651 × 1045.924 × 1035.125 × 1042.213 × 1034.686 × 1031.533 × 103
50Avg5.216 × 1054.016 × 1053.086 × 1056.358 × 1058.151 × 1042.215 × 1054.049 × 1062.251 × 1051.203 × 1063.163 × 1055.207 × 1052.475 × 104
Min1.101 × 1054.749 × 1041.072 × 1049.639 × 1041.194 × 1041.658 × 1041.690 × 1062.594 × 1045.706 × 1054.727 × 1046.023 × 1048.641 × 103
100Avg3.785 × 1063.480 × 1067.558 × 1061.876 × 1063.627 × 1051.108 × 1061.595 × 1076.317 × 1058.127 × 1061.993 × 1073.338 × 1061.466 × 105
Min2.148 × 1061.057 × 1063.097 × 1056.461 × 1051.387 × 1053.212 × 1051.187 × 1078.401 × 1045.390 × 1066.674 × 1061.282 × 1061.009 × 105
F1530Avg1.744 × 1043.637 × 1053.412 × 1048.683 × 1043.448 × 1045.245 × 1032.685 × 1066.528 × 1035.743 × 1062.428 × 1046.098 × 1032.252 × 103
Min8.598 × 1031.847 × 1043.640 × 1031.368 × 1042.547 × 1031.693 × 1033.185 × 1051.537 × 1031.019 × 1061.478 × 1041.703 × 1031.508 × 103
50Avg2.004 × 1049.315 × 1062.145 × 1077.839 × 1047.164 × 1049.137 × 1032.119 × 1087.314 × 1031.006 × 1083.197 × 1047.557 × 1035.907 × 103
Min1.128 × 1041.565 × 1044.235 × 1042.225 × 1041.422 × 1042.035 × 1031.213 × 1081.598 × 1036.005 × 1071.979 × 1042.315 × 1032.972 × 103
100Avg2.449 × 1049.478 × 1071.045 × 1092.527 × 1059.337 × 1042.651 × 1062.409 × 1092.975 × 1035.122 × 1094.998 × 1096.568 × 1031.821 × 103
Min1.274 × 1045.864 × 1051.058 × 1052.549 × 1041.223 × 1043.473 × 1031.423 × 1091.621 × 1031.096 × 1091.070 × 1093.039 × 1031.522 × 103
F1630Avg2.908 × 1032.287 × 1032.995 × 1033.519 × 1032.807 × 1032.581 × 1033.628 × 1032.435 × 1033.456 × 1033.700 × 1032.334 × 1031.774 × 103
Min2.538 × 1031.744 × 1032.487 × 1032.728 × 1032.095 × 1032.009 × 1033.221 × 1031.854 × 1032.949 × 1032.867 × 1031.949 × 1031.602 × 103
50Avg3.336 × 1032.791 × 1034.150 × 1034.689 × 1033.778 × 1033.302 × 1034.712 × 1033.313 × 1035.278 × 1036.365 × 1032.936 × 1032.003 × 103
Min2.736 × 1032.209 × 1033.133 × 1033.895 × 1033.014 × 1032.761 × 1033.890 × 1032.592 × 1034.488 × 1033.693 × 1032.394 × 1031.845 × 103
100Avg6.038 × 1035.610 × 1038.085 × 1039.811 × 1036.869 × 1036.627 × 1031.213 × 1045.397 × 1031.224 × 1041.873 × 1045.155 × 1032.566 × 103
Min5.126 × 1034.748 × 1036.389 × 1037.513 × 1034.978 × 1034.601 × 1039.757 × 1033.740 × 1031.047 × 1041.409 × 1044.009 × 1031.851 × 103
F1730Avg2.253 × 1031.956 × 1032.411 × 1032.520 × 1032.315 × 1032.050 × 1032.488 × 1031.941 × 1032.595 × 1032.601 × 1031.937 × 1031.738 × 103
Min1.884 × 1031.777 × 1031.975 × 1031.931 × 1031.942 × 1031.805 × 1032.223 × 1031.718 × 1032.277 × 1032.085 × 1031.771 × 1031.703 × 103
50Avg3.405 × 1032.676 × 1033.708 × 1033.892 × 1033.758 × 1033.115 × 1033.827 × 1032.846 × 1034.046 × 1034.165 × 1032.635 × 1031.931 × 103
Min2.871 × 1032.257 × 1032.866 × 1033.106 × 1032.932 × 1032.590 × 1033.518 × 1032.326 × 1033.304 × 1033.228 × 1032.084 × 1031.858 × 103
100Avg5.589 × 1034.439 × 1037.668 × 1037.212 × 1036.345 × 1035.366 × 1031.919 × 1044.515 × 1031.341 × 1043.461 × 1054.401 × 1032.292 × 103
Min4.266 × 1033.338 × 1035.623 × 1035.421 × 1034.935 × 1033.832 × 1039.150 × 1033.706 × 1039.608 × 1031.666 × 1043.410 × 1031.868 × 103
F1830Avg4.488 × 1056.631 × 1053.177 × 1062.408 × 1061.734 × 1054.761 × 1052.212 × 1066.722 × 1051.276 × 1066.751 × 1059.249 × 1059.790 × 104
Min5.229 × 1048.000 × 1043.737 × 1041.933 × 1053.793 × 1043.856 × 1043.218 × 1055.547 × 1044.340 × 1051.206 × 1059.952 × 1045.073 × 104
50Avg2.760 × 1063.300 × 1063.443 × 1064.272 × 1064.064 × 1055.021 × 1068.705 × 1062.036 × 1068.529 × 1062.406 × 1072.111 × 1061.126 × 105
Min3.941 × 1052.968 × 1051.807 × 1051.009 × 1061.509 × 1058.293 × 1053.908 × 1062.080 × 1053.639 × 1068.365 × 1056.113 × 1055.963 × 104
100Avg2.777 × 1064.158 × 1061.162 × 1072.020 × 1068.326 × 1052.307 × 1062.039 × 1071.049 × 1061.063 × 1073.147 × 1073.743 × 1061.629 × 105
Min1.197 × 1067.431 × 1054.881 × 1058.476 × 1053.782 × 1056.200 × 1051.197 × 1072.595 × 1055.745 × 1069.728 × 1061.203 × 1061.177 × 105
F1930Avg1.127 × 1052.913 × 1054.071 × 1062.647 × 1063.223 × 1041.816 × 1048.962 × 1064.640 × 1034.874 × 1071.068 × 1069.241 × 1036.433 × 103
Min5.515 × 1039.466 × 1032.093 × 1031.744 × 1052.168 × 1032.460 × 1034.803 × 1062.110 × 1032.507 × 1068.696 × 1051.968 × 1031.946 × 103
50Avg2.440 × 1052.362 × 1066.151 × 1062.457 × 1062.362 × 1048.702 × 1041.443 × 1081.438 × 1043.026 × 1084.636 × 1051.433 × 1041.566 × 104
Min2.445 × 1046.908 × 1045.031 × 1031.534 × 1052.700 × 1034.883 × 1037.728 × 1073.740 × 1033.919 × 1074.438 × 1052.057 × 1039.683 × 103
100Avg5.676 × 1051.003 × 1083.561 × 1081.529 × 1077.032 × 1045.510 × 1042.661 × 1092.871 × 1032.968 × 1094.646 × 1099.029 × 1032.852 × 103
Min8.460 × 1042.250 × 1062.761 × 1065.273 × 1061.223 × 1042.334 × 1031.245 × 1092.008 × 1037.255 × 1081.529 × 1092.774 × 1031.974 × 103
F2030Avg2.550 × 1032.288 × 1032.600 × 1032.702 × 1032.468 × 1032.494 × 1032.587 × 1032.319 × 1032.932 × 1032.647 × 1032.306 × 1032.128 × 103
Min2.254 × 1032.154 × 1032.215 × 1032.327 × 1032.073 × 1032.162 × 1032.485 × 1032.165 × 1032.560 × 1032.341 × 1032.053 × 1032.028 × 103
50Avg3.263 × 1032.736 × 1033.557 × 1033.628 × 1033.432 × 1033.030 × 1033.441 × 1032.889 × 1033.933 × 1033.363 × 1032.955 × 1032.082 × 103
Min2.765 × 1032.422 × 1032.897 × 1032.664 × 1032.655 × 1032.476 × 1033.173 × 1032.403 × 1033.576 × 1032.634 × 1032.549 × 1032.027 × 103
100Avg5.414 × 1034.469 × 1035.692 × 1035.875 × 1035.740 × 1035.074 × 1036.761 × 1034.910 × 1036.915 × 1035.748 × 1034.560 × 1032.504 × 103
Min4.508 × 1033.301 × 1034.194 × 1034.326 × 1034.438 × 1034.031 × 1036.164 × 1033.965 × 1036.030 × 1034.700 × 1033.218 × 1032.288 × 103
Summary, D = 30 (W|T|L): KH 0|0|10; GWO 0|0|10; MFO 0|0|10; WOA 0|0|10; WCMFO 0|0|10; CMFO 1|0|9; HGSO 0|0|10; RGA-DX 1|0|9; ChOA 0|0|10; AOA 0|0|10; ODSFMFO 0|0|10; M-MFO 8|0|2
Summary, D = 50 (W|T|L): KH 0|0|10; GWO 0|0|10; MFO 0|0|10; WOA 0|0|10; WCMFO 0|0|10; CMFO 0|0|10; HGSO 0|0|10; RGA-DX 0|0|10; ChOA 0|0|10; AOA 0|0|10; ODSFMFO 1|0|9; M-MFO 9|0|1
Summary, D = 100 (W|T|L): KH 0|0|10; GWO 0|0|10; MFO 0|0|10; WOA 0|0|10; WCMFO 0|0|10; CMFO 0|0|10; HGSO 0|0|10; RGA-DX 0|0|10; ChOA 0|0|10; AOA 0|0|10; ODSFMFO 0|0|10; M-MFO 10|0|0
Table A4. Results of the comparative algorithms on composition test functions.
F | D | Metrics | KH (2012) | GWO (2014) | MFO (2015) | WOA (2016) | WCMFO (2019) | CMFO (2019) | HGSO (2019) | RGA-DX (2019) | ChOA (2020) | AOA (2021) | ODSFMFO (2021) | M-MFO
F2130Avg2.416 × 1032.383 × 1032.476 × 1032.558 × 1032.493 × 1032.396 × 1032.564 × 1032.352 × 1032.569 × 1032.619 × 1032.352 × 1032.312 × 103
Min2.366 × 1032.352 × 1032.421 × 1032.463 × 1032.398 × 1032.347 × 1032.509 × 1032.326 × 1032.503 × 1032.531 × 1032.331 × 1032.303 × 103
50Avg2.541 × 1032.485 × 1032.694 × 1032.888 × 1032.694 × 1032.489 × 1032.892 × 1032.401 × 1032.885 × 1033.012 × 1032.409 × 1032.328 × 103
Min2.470 × 1032.440 × 1032.575 × 1032.744 × 1032.580 × 1032.403 × 1032.831 × 1032.342 × 1032.819 × 1032.890 × 1032.379 × 1032.315 × 103
100Avg3.338 × 1032.845 × 1033.594 × 1033.884 × 1033.539 × 1032.998 × 1034.083 × 1032.600 × 1034.037 × 1034.558 × 1032.707 × 1032.376 × 103
Min3.166 × 1032.751 × 1033.262 × 1033.502 × 1033.233 × 1032.781 × 1033.882 × 1032.525 × 1033.916 × 1034.276 × 1032.639 × 1032.356 × 103
F2230Avg2.785 × 1034.413 × 1035.843 × 1035.949 × 1036.637 × 1032.373 × 1034.016 × 1032.471 × 1039.177 × 1037.685 × 1032.318 × 1032.300 × 103
Min2.300 × 1032.420 × 1033.150 × 1032.315 × 1035.330 × 1032.315 × 1033.476 × 1032.300 × 1038.622 × 1035.492 × 1032.307 × 1032.300 × 103
50Avg1.029 × 1048.634 × 1031.029 × 1041.208 × 1041.002 × 1048.679 × 1031.031 × 1047.647 × 1031.654 × 1041.475 × 1045.348 × 1032.506 × 103
Min8.693 × 1037.065 × 1037.958 × 1038.721 × 1038.609 × 1032.525 × 1037.356 × 1032.300 × 1031.554 × 1041.304 × 1042.436 × 1032.300 × 103
100Avg2.000 × 1041.778 × 1042.032 × 1042.397 × 1041.943 × 1042.156 × 1043.032 × 1041.604 × 1043.373 × 1043.089 × 1041.910 × 1046.538 × 103
Min1.641 × 1041.413 × 1041.778 × 1042.087 × 1041.671 × 1041.965 × 1042.877 × 1041.392 × 1043.258 × 1042.791 × 1041.574 × 1045.211 × 103
F2330Avg2.910 × 1032.732 × 1032.801 × 1033.032 × 1032.785 × 1032.828 × 1033.095 × 1032.705 × 1033.015 × 1033.313 × 1032.718 × 1032.662 × 103
Min2.800 × 1032.695 × 1032.762 × 1032.886 × 1032.721 × 1032.764 × 1032.992 × 1032.680 × 1032.966 × 1033.093 × 1032.699 × 1032.647 × 103
50Avg3.407 × 1032.907 × 1033.135 × 1033.592 × 1033.104 × 1033.133 × 1033.617 × 1032.844 × 1033.525 × 1034.337 × 1032.885 × 1032.754 × 103
Min3.157 × 1032.835 × 1033.046 × 1033.377 × 1032.980 × 1032.983 × 1033.310 × 1032.808 × 1033.373 × 1033.850 × 1032.822 × 1032.737 × 103
100Avg4.708 × 1033.405 × 1033.716 × 1034.823 × 1033.545 × 1033.819 × 1036.555 × 1033.061 × 1034.657 × 1036.793 × 1033.255 × 1032.912 × 103
Min4.375 × 1033.289 × 1033.547 × 1034.263 × 1033.306 × 1033.603 × 1034.878 × 1032.974 × 1034.424 × 1036.011 × 1033.123 × 1032.872 × 103
F2430Avg3.105 × 1032.904 × 1032.974 × 1033.167 × 1032.978 × 1032.928 × 1033.304 × 1032.877 × 1033.198 × 1033.704 × 1032.871 × 1032.827 × 103
Min3.007 × 1032.855 × 1032.910 × 1033.021 × 1032.928 × 1032.877 × 1033.241 × 1032.851 × 1033.128 × 1033.490 × 1032.848 × 1032.820 × 103
50Avg3.663 × 1033.087 × 1033.227 × 1033.733 × 1033.231 × 1033.237 × 1033.899 × 1033.010 × 1033.721 × 1034.772 × 1033.026 × 1032.910 × 103
Min3.484 × 1033.000 × 1033.152 × 1033.545 × 1033.135 × 1033.070 × 1033.726 × 1032.955 × 1033.588 × 1034.385 × 1032.985 × 1032.894 × 103
100Avg5.770 × 1033.963 × 1034.272 × 1035.854 × 1034.293 × 1035.163 × 1036.815 × 1033.598 × 1035.906 × 1031.083 × 1043.769 × 1033.309 × 103
Min5.279 × 1033.819 × 1034.124 × 1035.238 × 1034.048 × 1034.336 × 1036.235 × 1033.463 × 1035.543 × 1039.106 × 1033.590 × 1033.275 × 103
F2530Avg2.912 × 1032.957 × 1033.107 × 1032.945 × 1032.894 × 1033.004 × 1033.300 × 1032.890 × 1033.934 × 1034.463 × 1032.928 × 1032.888 × 103
Min2.884 × 1032.913 × 1032.889 × 1032.898 × 1032.884 × 1032.933 × 1033.160 × 1032.887 × 1033.456 × 1033.760 × 1032.890 × 1032.887 × 103
50Avg3.091 × 1033.371 × 1034.930 × 1033.155 × 1033.041 × 1033.954 × 1036.400 × 1033.050 × 1038.767 × 1031.387 × 1043.240 × 1033.070 × 103
Min3.036 × 1033.055 × 1033.159 × 1033.039 × 1032.962 × 1033.482 × 1035.561 × 1032.965 × 1036.928 × 1031.199 × 1043.158 × 1033.017 × 103
100Avg3.376 × 1035.277 × 1031.123 × 1043.590 × 1033.321 × 1035.786 × 1031.404 × 1043.319 × 1031.347 × 1042.325 × 1044.234 × 1033.340 × 103
Min3.228 × 1034.686 × 1034.792 × 1033.464 × 1033.206 × 1034.182 × 1031.131 × 1043.201 × 1031.142 × 1042.080 × 1043.774 × 1033.261 × 103
F2630Avg6.150 × 1034.424 × 1035.689 × 1037.599 × 1035.447 × 1034.227 × 1036.845 × 1034.117 × 1036.353 × 1039.214 × 1034.425 × 1033.408 × 103
Min2.800 × 1033.954 × 1034.921 × 1035.975 × 1034.955 × 1032.936 × 1035.878 × 1032.900 × 1035.882 × 1037.702 × 1032.876 × 1032.800 × 103
50Avg9.583 × 1035.735 × 1038.121 × 1031.306 × 1048.059 × 1038.552 × 1031.102 × 1045.018 × 1031.034 × 1041.537 × 1045.531 × 1034.065 × 103
Min3.154 × 1035.192 × 1036.910 × 1039.977 × 1037.062 × 1035.725 × 1038.677 × 1034.540 × 1039.266 × 1031.326 × 1044.905 × 1033.899 × 103
100Avg2.471 × 1041.263 × 1041.741 × 1043.111 × 1041.752 × 1042.406 × 1043.573 × 1049.328 × 1032.508 × 1045.006 × 1041.123 × 1046.251 × 103
Min2.085 × 1041.124 × 1041.526 × 1042.326 × 1041.518 × 1041.981 × 1043.232 × 1048.106 × 1032.276 × 1044.357 × 1049.727 × 1035.989 × 103
F2730Avg3.402 × 1033.229 × 1033.236 × 1033.346 × 1033.228 × 1033.286 × 1033.200 × 1033.224 × 1033.492 × 1034.337 × 1033.241 × 1033.221 × 103
Min3.316 × 1033.212 × 1033.208 × 1033.282 × 1033.201 × 1033.232 × 1033.200 × 1033.202 × 1033.355 × 1033.959 × 1033.222 × 1033.210 × 103
50Avg4.359 × 1033.471 × 1033.550 × 1034.305 × 1033.504 × 1034.243 × 1033.200 × 1033.375 × 1034.257 × 1036.617 × 1033.509 × 1033.312 × 103
Min4.013 × 1033.342 × 1033.407 × 1033.678 × 1033.377 × 1033.769 × 1033.200 × 1033.293 × 1033.997 × 1035.870 × 1033.448 × 1033.281 × 103
100Avg5.732 × 1033.854 × 1033.867 × 1034.945 × 1033.607 × 1035.277 × 1033.200 × 1033.493 × 1035.696 × 1031.182 × 1043.809 × 1033.422 × 103
Min4.974 × 1033.594 × 1033.655 × 1033.909 × 1033.482 × 1034.288 × 1033.200 × 1033.437 × 1035.303 × 1039.541 × 1033.644 × 1033.369 × 103
F2830Avg3.235 × 1033.339 × 1033.721 × 1033.303 × 1033.194 × 1033.451 × 1033.694 × 1033.196 × 1034.272 × 1036.044 × 1033.295 × 1033.110 × 103
Min3.197 × 1033.269 × 1033.318 × 1033.269 × 1033.100 × 1033.247 × 1033.300 × 1033.101 × 1033.565 × 1034.603 × 1033.251 × 1033.100 × 103
50Avg3.338 × 1033.873 × 1038.080 × 1033.424 × 1033.298 × 1034.172 × 1036.043 × 1033.306 × 1036.053 × 1031.079 × 1043.739 × 1033.292 × 103
Min3.271 × 1033.653 × 1035.324 × 1033.344 × 1033.259 × 1033.761 × 1033.300 × 1033.259 × 1035.216 × 1039.575 × 1033.481 × 1033.259 × 103
100Avg3.496 × 1036.692 × 1031.749 × 1043.721 × 1037.644 × 1036.790 × 1031.916 × 1043.381 × 1031.189 × 1042.947 × 1045.306 × 1033.331 × 103
Min3.393 × 1034.771 × 1031.485 × 1043.598 × 1033.333 × 1034.646 × 1031.478 × 1043.346 × 1039.983 × 1032.587 × 1044.502 × 1033.295 × 103
F2930Avg4.170 × 1033.645 × 1034.003 × 1034.751 × 1033.965 × 1034.050 × 1034.246 × 1033.575 × 1034.362 × 1035.610 × 1033.669 × 1033.319 × 103
Min3.680 × 1033.460 × 1033.603 × 1034.062 × 1033.650 × 1033.631 × 1033.690 × 1033.346 × 1034.057 × 1034.626 × 1033.475 × 1033.312 × 103
50Avg5.252 × 1034.214 × 1035.076 × 1037.281 × 1034.671 × 1035.028 × 1036.835 × 1033.673 × 1036.978 × 1031.520 × 1044.272 × 1033.380 × 103
Min4.165 × 1033.750 × 1034.271 × 1036.025 × 1033.992 × 1033.985 × 1035.129 × 1033.267 × 1036.045 × 1038.818 × 1033.748 × 1033.219 × 103
100Avg8.699 × 1037.229 × 1031.370 × 1041.413 × 1047.986 × 1031.004 × 1041.509 × 1045.944 × 1031.936 × 1048.177 × 1046.761 × 1034.020 × 103
Min6.637 × 1036.385 × 1037.555 × 1031.053 × 1047.019 × 1038.335 × 1038.825 × 1034.620 × 1031.268 × 1043.350 × 1045.675 × 1033.753 × 103
F3030Avg1.679 × 1067.020 × 1063.271 × 1056.709 × 1062.812 × 1048.574 × 1056.234 × 1078.098 × 1033.332 × 1076.074 × 1071.629 × 1046.645 × 103
Min7.205 × 1048.830 × 1051.393 × 1044.463 × 1051.582 × 1047.507 × 1042.348 × 1075.539 × 1031.030 × 1075.150 × 1067.769 × 1036.062 × 103
50Avg5.532 × 1076.713 × 1078.852 × 1078.102 × 1072.475 × 1062.565 × 1075.469 × 1088.623 × 1054.466 × 1087.074 × 1081.839 × 1068.164 × 105
Min2.289 × 1073.536 × 1072.389 × 1064.041 × 1071.155 × 1066.414 × 1063.572 × 1087.148 × 1052.097 × 1081.864 × 1089.999 × 1057.640 × 105
100Avg1.233 × 1073.958 × 1081.283 × 1091.922 × 1081.932 × 1061.750 × 1087.461 × 1091.123 × 1041.216 × 10+103.109 × 10+108.500 × 1059.592 × 103
Min3.908 × 1065.455 × 1073.821 × 1077.264 × 1073.637 × 1052.473 × 1064.267 × 1096.600 × 1038.263 × 1091.450 × 10+101.517 × 1057.153 × 103
Summary, D = 30 (W|T|L): KH 0|0|10; GWO 0|0|10; MFO 0|0|10; WOA 0|0|10; WCMFO 0|0|10; CMFO 0|0|10; HGSO 1|0|9; RGA-DX 0|0|10; ChOA 0|0|10; AOA 0|0|10; ODSFMFO 0|0|10; M-MFO 9|0|1
Summary, D = 50 (W|T|L): KH 0|0|10; GWO 0|0|10; MFO 0|0|10; WOA 0|0|10; WCMFO 1|0|9; CMFO 0|0|10; HGSO 1|0|9; RGA-DX 0|0|10; ChOA 0|0|10; AOA 0|0|10; ODSFMFO 0|0|10; M-MFO 8|0|2
Summary, D = 100 (W|T|L): KH 0|0|10; GWO 0|0|10; MFO 0|0|10; WOA 0|0|10; WCMFO 0|0|10; CMFO 0|0|10; HGSO 1|0|9; RGA-DX 1|0|9; ChOA 0|0|10; AOA 0|0|10; ODSFMFO 0|0|10; M-MFO 8|0|2

Figure 1. The classification of SI algorithms and variants of MFO.
Figure 2. The flowchart of the proposed M-MFO algorithm.
Figure 3. The diversity and convergence curves of the MFO.
Figure 4. The convergence behavior of the proposed M-MFO and contender algorithms.
Figure 5. Population diversity of M-MFO and contender algorithms on different dimensions.
Figure 6. The radar graphs of M-MFO and competitors in different dimensions.
Figure 7. Friedman’s test average results in different dimensions.
Table 1. Parameter values for the optimization algorithms.
Alg. | Parameter Settings
KH | Vf = 0.02, Dmax = 0.005, Nmax = 0.01, Sr = 0.
GWO | The parameter a is linearly decreased from 2 to 0.
MFO | b = 1, a is decreased linearly from −1 to −2.
WOA | α variable decreases linearly from 2 to 0, b = 1.
WCMFO | The number of rivers and sea = 4.
CMFO | b = 1, a is decreased linearly from −1 to −2, chaotic map = Singer.
HGSO | l1 = 5 × 10^−3, l2 = 100, l3 = 1 × 10^−2, alpha = 1, beta = 1, M1 = 0.1, M2 = 0.2.
RGA-DX | pcv = 0.9, α = 0.95, and pd = 0.75.
ChOA | f decreases linearly from 2 to 0.
AOA | μ = 0.5, α = 5.
ODSFMFO | m = 6, pc = 0.5, γ = 5, α = 1, l = 10, b = 1, β = 1.5.
M-MFO | τ = random number between 1 and D, MaxArc = D × [ln N].
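For readers who want a quick sense of the M-MFO settings in Table 1, the following minimal sketch evaluates the archive-capacity rule MaxArc = D × [ln N] and draws τ. It is only an illustration: treating [·] as rounding to the nearest integer, taking τ as an integer, and using a population of N = 100 moths are assumptions made here, not statements from the table.

```python
# Hypothetical illustration of the M-MFO settings listed in Table 1.
# Assumptions (not stated in the table): [ln N] is rounded to the nearest integer,
# tau is drawn as an integer, and N = 100 moths is only an example population size.
import math
import random

def max_archive_size(dim: int, pop_size: int) -> int:
    # MaxArc = D x [ln N]
    return dim * round(math.log(pop_size))

def random_tau(dim: int) -> int:
    # tau = random number between 1 and D (assumed integer-valued here)
    return random.randint(1, dim)

print(max_archive_size(30, 100))  # 30 * round(ln 100) = 30 * 5 = 150
print(random_tau(30))             # e.g., 17
```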
Table 2. The overall results of MFO variants for dimension 30.
Overall results (W|T|L): LMFO (2016) 0|0|29; WCMFO (2019) 0|0|29; CMFO (2019) 1|0|28; CLSGMFO (2019) 0|0|29; LGCMFO (2019) 0|0|29; SMFO (2021) 0|0|29; ODSFMFO (2021) 0|0|29; M-MFO 28|0|1
Table 3. The overall results of the M-MFO and comparative algorithms on unimodal and multimodal test functions.
Algorithms: KH (2012); GWO (2014); MFO (2015); WOA (2016); WCMFO (2019); CMFO (2019); HGSO (2019); RGA-DX (2019); ChOA (2020); AOA (2021); ODSFMFO (2021); M-MFO.
D = 30 (W|T|L): KH 0|0|9; GWO 0|0|9; MFO 0|0|9; WOA 0|0|9; WCMFO 0|0|9; CMFO 0|0|9; HGSO 0|0|9; RGA-DX 0|1|8; ChOA 0|0|9; AOA 0|0|9; ODSFMFO 0|0|9; M-MFO 8|1|0
D = 50 (W|T|L): KH 0|0|9; GWO 0|0|9; MFO 0|0|9; WOA 0|0|9; WCMFO 0|0|9; CMFO 0|0|9; HGSO 0|0|9; RGA-DX 0|0|9; ChOA 0|0|9; AOA 0|0|9; ODSFMFO 0|0|9; M-MFO 9|0|0
D = 100 (W|T|L): KH 0|0|9; GWO 0|0|9; MFO 0|0|9; WOA 0|0|9; WCMFO 0|0|9; CMFO 0|0|9; HGSO 0|0|9; RGA-DX 0|0|9; ChOA 0|0|9; AOA 0|0|9; ODSFMFO 0|0|9; M-MFO 9|0|0
Table 4. The overall results of the M-MFO and comparative algorithms on hybrid test functions.
Algorithms: KH (2012); GWO (2014); MFO (2015); WOA (2016); WCMFO (2019); CMFO (2019); HGSO (2019); RGA-DX (2019); ChOA (2020); AOA (2021); ODSFMFO (2021); M-MFO.
D = 30 (W|T|L): KH 0|0|10; GWO 0|0|10; MFO 0|0|10; WOA 0|0|10; WCMFO 0|0|10; CMFO 1|0|9; HGSO 0|0|10; RGA-DX 1|0|9; ChOA 0|0|10; AOA 0|0|10; ODSFMFO 0|0|10; M-MFO 8|0|2
D = 50 (W|T|L): KH 0|0|10; GWO 0|0|10; MFO 0|0|10; WOA 0|0|10; WCMFO 0|0|10; CMFO 0|0|10; HGSO 0|0|10; RGA-DX 0|0|10; ChOA 0|0|10; AOA 0|0|10; ODSFMFO 1|0|9; M-MFO 9|0|1
D = 100 (W|T|L): KH 0|0|10; GWO 0|0|10; MFO 0|0|10; WOA 0|0|10; WCMFO 0|0|10; CMFO 0|0|10; HGSO 0|0|10; RGA-DX 0|0|10; ChOA 0|0|10; AOA 0|0|10; ODSFMFO 0|0|10; M-MFO 10|0|0
Table 5. The overall results of the M-MFO and comparative algorithms on composition test functions.
Algorithms: KH (2012); GWO (2014); MFO (2015); WOA (2016); WCMFO (2019); CMFO (2019); HGSO (2019); RGA-DX (2019); ChOA (2020); AOA (2021); ODSFMFO (2021); M-MFO.
D = 30 (W|T|L): KH 0|0|10; GWO 0|0|10; MFO 0|0|10; WOA 0|0|10; WCMFO 0|0|10; CMFO 0|0|10; HGSO 1|0|9; RGA-DX 0|0|10; ChOA 0|0|10; AOA 0|0|10; ODSFMFO 0|0|10; M-MFO 9|0|1
D = 50 (W|T|L): KH 0|0|10; GWO 0|0|10; MFO 0|0|10; WOA 0|0|10; WCMFO 1|0|9; CMFO 0|0|10; HGSO 1|0|9; RGA-DX 0|0|10; ChOA 0|0|10; AOA 0|0|10; ODSFMFO 0|0|10; M-MFO 8|0|2
D = 100 (W|T|L): KH 0|0|10; GWO 0|0|10; MFO 0|0|10; WOA 0|0|10; WCMFO 0|0|10; CMFO 0|0|10; HGSO 1|0|9; RGA-DX 1|0|9; ChOA 0|0|10; AOA 0|0|10; ODSFMFO 0|0|10; M-MFO 8|0|2
Table 6. The overall effectiveness of the M-MFO and contender algorithms.
D = 30 (W/T/L): KH 0/0/29; GWO 0/0/29; MFO 0/0/29; WOA 0/0/29; WCMFO 0/0/29; CMFO 1/0/28; HGSO 1/0/28; RGA-DX 1/1/27; ChOA 0/0/29; AOA 0/0/29; ODSFMFO 0/0/29; M-MFO 25/1/3
D = 50 (W/T/L): KH 0/0/29; GWO 0/0/29; MFO 0/0/29; WOA 0/0/29; WCMFO 1/0/28; CMFO 0/0/29; HGSO 1/0/28; RGA-DX 0/0/29; ChOA 0/0/29; AOA 0/0/29; ODSFMFO 1/0/28; M-MFO 26/0/3
D = 100 (W/T/L): KH 0/0/29; GWO 0/0/29; MFO 0/0/29; WOA 0/0/29; WCMFO 0/0/29; CMFO 0/0/29; HGSO 1/0/28; RGA-DX 1/0/28; ChOA 0/0/29; AOA 0/0/29; ODSFMFO 0/0/29; M-MFO 27/0/2
Total (W/T/L): KH 0/0/87; GWO 0/0/87; MFO 0/0/87; WOA 0/0/87; WCMFO 1/0/85; CMFO 1/0/86; HGSO 3/0/84; RGA-DX 2/1/85; ChOA 0/0/87; AOA 0/0/87; ODSFMFO 1/0/86; M-MFO 78/1/8
OE: KH 0%; GWO 0%; MFO 0%; WOA 0%; WCMFO 1%; CMFO 1%; HGSO 4%; RGA-DX 2%; ChOA 0%; AOA 0%; ODSFMFO 1%; M-MFO 91%
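The win/tie/loss tallies reported in Tables 2–6 and in the appendix summaries can be reproduced with a simple per-function comparison. The sketch below is a minimal illustration under an assumed scoring rule (the paper's exact tie-handling is not restated here): for each benchmark function, the algorithm with the lowest mean error scores a win, algorithms sharing the lowest mean error score a tie, and all remaining algorithms score a loss.

```python
# Minimal sketch of a W|T|L tally; the scoring rule and the tie tolerance are assumptions.
import numpy as np

def tally_wtl(avg_errors, names, tol=1e-8):
    """avg_errors: (n_functions, n_algorithms) matrix of mean errors; names: column labels."""
    wins = np.zeros(len(names), dtype=int)
    ties = np.zeros(len(names), dtype=int)
    losses = np.zeros(len(names), dtype=int)
    for row in np.asarray(avg_errors, dtype=float):
        best_mask = np.isclose(row, row.min(), atol=tol)
        if best_mask.sum() == 1:      # a unique best result counts as a win
            wins += best_mask
        else:                         # a shared best result counts as a tie for each sharer
            ties += best_mask
        losses += ~best_mask          # everyone else records a loss on this function
    return {n: (int(w), int(t), int(l)) for n, w, t, l in zip(names, wins, ties, losses)}
```

An overall-effectiveness figure such as the OE row could then be reported as, for instance, wins divided by the total number of functions; since Table 6 does not restate its exact definition, this is only a plausible reading.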
Table 7. Results of sensitivity analysis on the guiding archive maturity size (results reported for M-MFO).
Function | Scenario 1 (δ = 1): D = 30 | D = 50 | D = 100 | Scenario 2 (δ = 3): D = 30 | D = 50 | D = 100 | Scenario 3 (δ = 5): D = 30 | D = 50 | D = 100 | Scenario 4 (δ = D): D = 30 | D = 50 | D = 100
F1 (Unimodal) | 2.04 × 10^3 | 1.43 × 10^3 | 3.48 × 10^3 | 2.28 × 10^3 | 1.54 × 10^3 | 4.24 × 10^3 | 2.30 × 10^3 | 1.83 × 10^3 | 4.69 × 10^3 | 1.66 × 10^3 | 1.46 × 10^3 | 4.46 × 10^3
F9 (Multimodal) | 9.01 × 10^2 | 9.06 × 10^2 | 9.58 × 10^2 | 9.01 × 10^2 | 9.05 × 10^2 | 9.57 × 10^2 | 9.00 × 10^2 | 9.05 × 10^2 | 9.49 × 10^2 | 9.00 × 10^2 | 9.04 × 10^2 | 9.45 × 10^2
F17 (Hybrid) | 1.75 × 10^3 | 2.05 × 10^3 | 2.39 × 10^3 | 1.74 × 10^3 | 1.99 × 10^3 | 2.35 × 10^3 | 1.74 × 10^3 | 1.99 × 10^3 | 2.29 × 10^3 | 1.74 × 10^3 | 1.93 × 10^3 | 2.29 × 10^3
F30 (Composition) | 6.78 × 10^3 | 8.29 × 10^5 | 1.30 × 10^4 | 6.77 × 10^3 | 8.26 × 10^5 | 1.14 × 10^4 | 6.58 × 10^3 | 8.22 × 10^5 | 1.05 × 10^4 | 6.64 × 10^3 | 8.16 × 10^5 | 9.59 × 10^3
Table 8. Friedman test for unimodal and multimodal functions of the CEC 2018. Each cell reports the Friedman average rank with the overall rank in parentheses.
Algorithm | Unimodal, D = 30 | Unimodal, D = 50 | Unimodal, D = 100 | Multimodal, D = 30 | Multimodal, D = 50 | Multimodal, D = 100
KH | 4.9500 (3) | 5.9500 (4) | 6.6750 (5) | 4.4250 (2) | 4.1250 (2) | 4.3250 (3)
GWO | 6.4000 (5) | 6.6750 (5) | 6.1000 (4) | 4.4500 (3) | 4.7500 (3) | 4.9750 (4)
MFO | 9.0500 (9) | 8.8000 (8) | 8.9500 (9) | 7.8750 (8) | 7.8000 (8) | 7.9250 (8)
WOA | 8.4750 (7) | 4.6250 (3) | 8.3250 (7) | 7.5000 (7) | 6.8000 (6) | 6.8000 (6)
WCMFO | 2.8750 (2) | 2.5750 (2) | 2.6000 (2) | 4.8750 (4) | 4.8750 (4) | 4.2250 (2)
CMFO | 7.7000 (6) | 6.9250 (7) | 6.0750 (3) | 7.4250 (6) | 7.4000 (7) | 7.1250 (7)
HGSO | 8.5000 (8) | 10.3000 (10) | 9.0000 (10) | 10.5500 (9) | 10.6250 (9) | 10.3750 (9)
RGA-DX | 2.0250 (1) | 2.1500 (1) | 1.9250 (1) | 2.2250 (1) | 1.8000 (1) | 2.2500 (1)
ChOA | 9.6750 (10) | 9.9500 (9) | 8.7500 (8) | 11.2750 (11) | 11.2250 (11) | 11.1500 (10)
AOA | 11.2000 (11) | 11.8000 (11) | 10.5500 (11) | 11.0750 (10) | 11.1500 (10) | 11.4250 (11)
ODSFMFO | 5.6750 (4) | 6.9000 (6) | 7.5500 (6) | 5.3250 (5) | 6.1250 (5) | 6.4000 (5)
M-MFO | 1.4750 (1) | 1.3500 (1) | 1.5000 (1) | 1.0000 (1) | 1.3250 (1) | 1.0250 (1)
Table 9. Friedman test for hybrid and composition functions of the CEC 2018. Each cell reports the Friedman average rank with the overall rank in parentheses.
Algorithm | Hybrid, D = 30 | Hybrid, D = 50 | Hybrid, D = 100 | Composition, D = 30 | Composition, D = 50 | Composition, D = 100
KH | 6.0350 (6) | 6.0150 (6) | 6.0400 (5) | 6.4450 (6) | 6.9750 (6) | 6.4550 (5)
GWO | 5.8350 (4) | 5.5000 (4) | 6.0500 (6) | 5.1700 (3) | 4.7950 (3) | 5.0550 (4)
MFO | 6.2950 (7) | 7.2050 (7) | 8.1350 (8) | 6.6900 (7) | 6.5750 (5) | 6.9750 (6)
WOA | 9.2400 (9) | 8.0600 (8) | 6.3300 (7) | 8.5450 (8) | 8.6150 (8) | 8.0200 (8)
WCMFO | 5.9700 (5) | 5.7550 (5) | 5.0400 (3) | 5.7500 (4) | 5.3100 (4) | 5.0100 (3)
CMFO | 5.3550 (3) | 5.1900 (3) | 5.4400 (4) | 6.3850 (5) | 7.0250 (7) | 7.3350 (7)
HGSO | 10.9100 (11) | 10.7300 (10) | 10.8550 (10) | 8.9800 (9) | 9.0400 (9) | 9.7600 (9)
RGA-DX | 3.0000 (1) | 2.8400 (1) | 2.3950 (1) | 2.2500 (1) | 2.2350 (1) | 2.0700 (1)
ChOA | 10.8550 (10) | 10.8750 (11) | 10.3950 (9) | 10.3650 (10) | 10.0700 (10) | 9.9350 (10)
AOA | 8.4900 (8) | 9.7400 (9) | 11.2950 (11) | 11.6400 (11) | 11.6800 (11) | 11.8000 (11)
ODSFMFO | 4.3600 (2) | 4.6450 (2) | 4.8800 (2) | 4.3750 (2) | 4.2750 (2) | 4.2250 (2)
M-MFO | 1.6550 (1) | 1.4450 (1) | 1.1450 (1) | 1.4050 (1) | 1.4050 (1) | 1.3600 (1)
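Friedman average ranks such as those in Tables 8 and 9 are conventionally obtained by ranking the algorithms on every benchmark function and averaging the ranks. The sketch below shows one way to compute them with SciPy; it assumes that lower mean error is better and that the overall rank simply orders the average ranks, which may differ in detail from the procedure used for these tables.

```python
# Minimal sketch of Friedman average ranks; the ranking conventions are assumptions.
import numpy as np
from scipy.stats import friedmanchisquare, rankdata

def friedman_average_ranks(avg_errors, names):
    """avg_errors: (n_functions, n_algorithms) matrix of mean errors per test function."""
    errors = np.asarray(avg_errors, dtype=float)
    ranks = np.apply_along_axis(rankdata, 1, errors)   # rank 1 = lowest error on that function
    mean_ranks = ranks.mean(axis=0)                    # Friedman average rank per algorithm
    order = np.argsort(mean_ranks)
    overall = np.empty(len(names), dtype=int)
    overall[order] = np.arange(1, len(names) + 1)      # 1 = best average rank
    statistic, p_value = friedmanchisquare(*errors.T)  # significance of the rank differences
    return {n: (round(float(m), 4), int(o)) for n, m, o in zip(names, mean_ranks, overall)}, statistic, p_value
```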