Abstract

This paper proposes a hybrid model based on decomposition for constrained optimization problems. First, a constrained optimization problem is transformed into a biobjective optimization problem. Then, the biobjective optimization problem is divided into a set of subproblems, and each subproblem is assigned a different fitness function by means of direction vectors. Unlike decomposition-based multiobjective optimization algorithms, in which each subproblem is optimized by using the information of its neighboring subproblems, in the proposed method the neighbors of each subproblem are defined solely by the corresponding direction vector. By combining three main components, namely, the local search model, the global search model, and the direction vector adjusting strategy, the population can gradually move toward the global optimal solution. Experiments on two sets of test problems and five real-world engineering design problems show that the proposed method performs better than, or is competitive with, the compared methods.

1. Introduction

Constrained optimization has a wide application background in many important fields, such as economics, engineering, and science [1–3]. In general, the mathematical definition of a constrained optimization problem (COP) is as follows:

minimize f(x), x = (x_1, ..., x_d)
subject to g_i(x) ≤ 0, i = 1, ..., p
h_j(x) = 0, j = 1, ..., q
L_k ≤ x_k ≤ U_k, k = 1, ..., d

where x represents a d-dimensional solution vector, f(x) represents the objective function, h_j(x) and g_i(x) denote the q equality constraints and the p inequality constraints, and each component x_k is restricted by the lower and upper bounds L_k and U_k, respectively.

In COPs, an equality constraint is generally converted into the following inequality form:

|h_j(x)| − ε ≤ 0, j = 1, ..., q

where ε denotes a small positive value (e.g., 10^−4). To judge whether a solution of a COP is feasible, its overall constraint violation degree must be considered, which is computed as follows:

G(x) = Σ_{i=1}^{p} max(0, g_i(x)) + Σ_{j=1}^{q} max(0, |h_j(x)| − ε)

where G(x) ≥ 0, and x satisfies the constraints if and only if G(x) = 0.
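As a concrete illustration, the feasibility test can be sketched in Python; the function name and the representation of constraints as callables are illustrative assumptions, with the usual definition that sums the positive parts of the inequality constraints and of the relaxed equality constraints:

```python
EPS = 1e-4  # tolerance used to relax equality constraints

def violation(x, ineq_constraints, eq_constraints):
    """Overall constraint violation degree G(x): sum of the positive parts
    of g_i(x) <= 0 and of the relaxed equalities |h_j(x)| - EPS <= 0."""
    g = sum(max(0.0, gi(x)) for gi in ineq_constraints)
    h = sum(max(0.0, abs(hj(x)) - EPS) for hj in eq_constraints)
    return g + h
```

A solution is feasible exactly when this sum is zero.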

Evolutionary algorithms (EAs), which are metaheuristic algorithms, have been adopted to deal with COPs in the past two decades. When EAs are employed to solve COPs, constraint-handling techniques (CHTs) must be considered. The currently popular CHTs include the penalty function methods [4–7], the multiobjective optimization methods [8–13], the feasibility rule methods [14–19], the ε-constrained methods [20–22], and the hybrid methods [23–26]. In the penalty function methods, a penalty fitness function is defined by adding a penalty term to the objective function. In the feasibility rule methods, feasible individuals are considered superior to infeasible individuals. The ε-constrained method is a representative CHT, in which the ε level is utilized to relax the constraints. Finally, the hybrid methods solve COPs by combining multiple constraint-handling techniques.

The multiobjective optimization methods have been adopted to solve COPs in the last two decades. These methods usually transform a COP into a biobjective optimization problem (BOP), in which one objective is the overall constraint violation degree G(x) and the other is the original objective f(x). Then, multiobjective optimization techniques, such as Pareto dominance or the aggregation method, are utilized to compare individuals. For example, Wang et al. [27] employed a dynamic hybrid model for solving COPs, in which Pareto dominance is employed for the comparison. Gao et al. [28] proposed a dual-population method to solve COPs, where f(x) and G(x) are each optimized by a corresponding subpopulation. Moreover, Wang et al. [29] utilized the correlation between the objective function and the constraints to deal with COPs.

The decomposition method [30] is a representative multiobjective optimization method. To solve COPs through decomposition-based multiobjective optimization, the transformed BOP is converted into a set of subproblems; that is, a group of fitness functions is constructed by assigning different direction vectors between f(x) and G(x). Generally, in a decomposition-based multiobjective optimization method, each individual optimizes a subproblem by combining the information of its neighboring individuals. However, little effort has been devoted to optimizing each subproblem by using the information of its direction vector in the objective space.

Based on the above analysis, a hybrid search model for constrained optimization, called HyCO, is designed in this paper. First of all, a BOP is decomposed into K subproblems. Then, to balance diversity and convergence, the local and global search models are employed to optimize these subproblems. In the local search model, the whole population is decomposed into K subpopulations by the classification operator, and each subpopulation optimizes one subproblem. In the global search model, the whole population is guided by a defined search direction to improve convergence. In the process, differential evolution (DE) is utilized to generate the offspring, and the direction vectors are adjusted to fit the characteristics of COPs. Furthermore, the simple restart strategy proposed by Wang et al. [31] is adopted to handle complex constraints. The performance of HyCO is tested on IEEE CEC 2010, IEEE CEC 2017, and five engineering problems. The results show that HyCO is more competitive than the other selected methods.

2. Multiobjective Optimization and Vector Angle

2.1. Multiobjective Optimization

Some details of multiobjective optimization problems (MOPs) are introduced in this section. Generally speaking, a MOP can be expressed as

minimize F(x) = (f_1(x), ..., f_m(x)), x ∈ Ω

where Ω represents the n-dimensional decision space and f_i(x) denotes the ith objective function. For two solutions x_1 and x_2, some concepts related to MOPs are introduced as follows.

Definition 1. x_1 is said to dominate x_2 (i.e., x_1 ≺ x_2) if f_i(x_1) ≤ f_i(x_2) for every i ∈ {1, ..., m} and f_j(x_1) < f_j(x_2) for at least one j ∈ {1, ..., m}.

Definition 2. x* is called a Pareto optimal solution if there exists no x ∈ Ω such that x ≺ x*.

Definition 3. A set of all Pareto optimal solutions is called the Pareto set (PS).

Definition 4. The Pareto front (PF) is the set of all Pareto optimal objective vectors (i.e., PF = {F(x) | x ∈ PS}).
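Definitions 1–4 translate directly into a small dominance check (minimization assumed; the function name is illustrative):

```python
def dominates(f1, f2):
    """True if objective vector f1 Pareto-dominates f2:
    no worse in every objective and strictly better in at least one."""
    return (all(a <= b for a, b in zip(f1, f2))
            and any(a < b for a, b in zip(f1, f2)))
```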

2.2. Vector Angle

In MOPs, the vector angle represents the angle between two individuals in the objective space. Typically, for two individuals x_j and x_k, the vector angle between them can be computed as follows:

angle(x_j, x_k) = arccos( (F′(x_j) · F′(x_k)) / (‖F′(x_j)‖ · ‖F′(x_k)‖) )

where F′(x_j) is the jth individual's normalized objective vector, whose ith component is computed according to the following equation:

f′_i(x_j) = (f_i(x_j) − f_i^min) / (f_i^max − f_i^min)

where f_i^min and f_i^max represent the minimum and maximum values of the ith objective, f_i(x_j) represents the ith objective function value, and ‖·‖ represents the norm of a vector. Generally, the vector angle is used to maintain population diversity for MOPs [32, 33]. Specifically, if the vector angle between two individuals is small enough, their search directions are similar. On the contrary, a large vector angle between two individuals indicates different search directions, so the diversity between them can be maintained. Motivated by these considerations, a new clustering method is designed in HyCO, and the population P is clustered into K subpopulations according to the vector angle to maintain diversity.
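The normalization and vector-angle computation above can be sketched as follows (function names are illustrative):

```python
import math

def normalize(f, f_min, f_max):
    """Normalize an objective vector f component-wise to [0, 1]."""
    return [(fi - lo) / (hi - lo) for fi, lo, hi in zip(f, f_min, f_max)]

def vector_angle(u, v):
    """Angle (in radians) between two normalized objective vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    # clamp to avoid domain errors from floating-point round-off
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))
```

Note that the angle is scale-invariant: two individuals lying on the same ray from the origin have an angle of zero, which is exactly why it measures search direction rather than magnitude.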

3. Proposed Method

3.1. Motivation

When using EAs to solve COPs, two important issues need to be addressed: first, achieving a balance between diversity and convergence; second, achieving a balance between the constraints and the objective function. In HyCO, the local and global search models based on decomposition are proposed to balance diversity and convergence. Specifically, in the local search model, a clustering method is designed to divide the population into several subpopulations, and each subpopulation optimizes one subproblem to maintain population diversity. In the global search model, a direction vector is defined to guide the evolution and enhance population convergence. In addition, a direction vector adjustment strategy (DVA) is used in HyCO to balance the objective and the constraints, which can guide the population to converge to the feasible optimal solution.

Based on the above considerations, this paper utilizes the local and global search models based on decomposition to solve COPs.

3.2. HyCO

In HyCO, a transformed BOP is first decomposed into K subproblems. Next, the local and global search models are designed to optimize these subproblems, and DVA is proposed to adjust the direction vector. The detailed steps of HyCO are provided in Algorithm 1.

Input: The population size m, the number of subpopulations K, and the total number of function evaluations T_FEs.
Output: The best feasible solution in P.
(1) Initialize a population P randomly;
(2) Calculate f(x) and G(x) of each individual in P;
(3) while stopping conditions are not satisfied do
(4)  Execute the local search model;
(5)  Execute the global search model;
(6)  Execute DVA;
(7)  Execute the restart strategy;
(8) end while

DE [34] is employed to generate the offspring because of its powerful search performance. Then, the weighted sum approach is adopted to compare the fitness of two candidate solutions. For a solution x, its weighted sum is defined by aggregating the normalized objective function value and the normalized overall constraint violation according to the direction vector, where w is a parameter adjusted by DVA, f_min and f_max represent the minimum and maximum objective function values, t represents the current generation number, G_min and G_max represent the minimum and maximum overall constraint violations, and λ is the direction vector.
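Since the exact aggregation equations are not recoverable from the extracted text, the following is only a minimal sketch of a weighted-sum fitness, under the assumption that it weights the normalized objective and the normalized violation by the two components of a direction vector; the name `weighted_sum` and this exact form are illustrative, not the paper's equation:

```python
def weighted_sum(f_norm, g_norm, direction):
    """Aggregate a normalized objective value f_norm and a normalized
    constraint violation g_norm using a 2-D direction vector."""
    w_f, w_g = direction
    return w_f * f_norm + w_g * g_norm
```

Two candidate solutions are then compared by their aggregated values, with the smaller one preferred.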

In the following sections, the local search model, the global search model, and DVA are introduced, respectively.

3.3. Local Search Model

The purpose of the local search model is to optimize each subproblem from different search directions and to find promising search directions. In this model, the whole population is divided into K subpopulations by a classification operator, and each subpopulation is used to optimize its assigned subproblem.

To introduce the classification operator, the vector angle between the direction vector and the normalized objective vector is first defined, as shown in Figure 1, and calculated as follows:

θ(x_i, λ_k) = arccos( (F′(x_i) · λ_k) / (‖F′(x_i)‖ · ‖λ_k‖) )

where F′(x_i) is the ith individual's normalized objective vector, λ_k is the kth direction vector, and F′(x_i) · λ_k denotes the inner product between F′(x_i) and λ_k. The classification operator is given in Algorithm 2.

Input: The population .
Output: K subpopulations .
(1) Calculate the direction vectors according to DVA;
(2) for k = 1 to K − 1 do
(3)  Calculate the angle between each individual and the direction vector λ_k according to (9);
(4)  Find the individuals with the smallest angles to λ_k to form the kth subpopulation;
(5)  Eliminate these individuals from P;
(6) end for
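A sketch of the classification operator, assuming the angles to each direction vector are precomputed and each of the first K − 1 subpopulations receives a fixed number of closest individuals, with the remainder forming the Kth subpopulation; the function name and the `size` parameter are illustrative:

```python
def classify(num_individuals, angles, K, size):
    """angles[i][k] is the vector angle between individual i and
    direction vector k. Returns K lists of individual indices."""
    remaining = list(range(num_individuals))
    subpops = []
    for k in range(K - 1):
        # pick the `size` individuals closest to direction vector k
        remaining.sort(key=lambda i: angles[i][k])
        subpops.append(remaining[:size])
        remaining = remaining[size:]
    subpops.append(remaining)  # Kth subpopulation: all leftover individuals
    return subpops
```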

As shown in Figure 2, each subpopulation occupies a region centered on a direction vector. Note that the Kth subpopulation contains all the remaining individuals in P, so the sizes of the subpopulations may not be equal. When the direction vectors are adjusted, all individuals are reclassified by the classification operator. Therefore, individuals may be assigned to subpopulations different from their previous ones, which results in the coevolution of different subpopulations. The whole process of the local search model is given in Algorithm 3.

Input: Entire population .
Output: Updated subpopulation .
(1) Divide the population into K subpopulations by Algorithm 2
(2) Calculate the direction vectors by Algorithm 7;
(3) for do
(4)  Set ;
(5)  Generate offspring by Algorithm 4;
(6)  Select the direction vector corresponding to ;
(7)  for do
(8)   if then
(9)    ;
(10)   else
(11)    ;
(12)   end if
(13)  end for
(14) end for

In the process of local search, two modified DE operators are employed to generate the offspring. Their formulations are given as follows: (i) DE/Modified/1; (ii) DE/Modified/2. Here, x_i and u_i represent the ith target vector and trial vector, respectively, and x_{i,j} and u_{i,j} represent their jth dimensions. x_best and x_mean denote the best individual and the mean vector of the subpopulation, respectively. x_{r1}, x_{r2}, and x_{r3} represent three individuals in the subpopulation, with r1 ≠ r2 ≠ r3 ≠ i, and x_{r1,j}, x_{r2,j}, and x_{r3,j} represent their jth dimensions. In addition, a random value in [−1, 1] is used, l is an integer selected from {1, 2}, F denotes the scaling factor, λ_i represents the ith direction vector, and two further random values are generated from [0, 1].

As shown in (10), each subpopulation is guided by its best individual, which helps prevent the population from being trapped in a local optimal solution. As shown in (11) and (12), the information of the individual with the smaller weighted sum is employed to generate candidate solutions, which can enhance the convergence rate. The procedure of the local search algorithm is described in Algorithm 4.

Input: Initial subpopulations.
Output: Offspring subpopulations.
(1) for do
(2) Set ;
(3)  for do
(4)   Randomly generate an F value in [0, 1];
(5)   if then
(6)    Generate a candidate solution according to (10);
(7)   else
(8)    Generate a candidate solution according to (11) and (12);
(9)   end if
(10)    ;
(11)   end for
(12) end for
3.4. Global Search Model

In the local search model, each subproblem is optimized along its corresponding direction vector, which may lead to a slow convergence rate. Candidate solutions are generated by using individuals within a single subpopulation, resulting in weak information exchange among subpopulations. Therefore, diversity can be maintained, but convergence cannot be ensured in the local search process. To improve convergence, the global search model is proposed. In this model, a direction vector λ_g is defined from the direction vectors of the improved subproblems to guide the whole population, where λ_s is the direction vector of a subproblem that has been improved and S is the number of improved subproblems. The framework of the global search model is described in Algorithm 5.

Input: Initial population .
Output: Updated population .
(1) Set ;
(2) Calculate according to (13);
(3) Generate an offspring population by Algorithm 6;
(4) for do
(5)  if then
(6)   ;
(7)  else
(8)   ;
(9)  end if
(10) end for
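The guiding direction vector in the global search model is built from the direction vectors of the subproblems improved in the last generation; a natural reading of (13) is their component-wise mean. A sketch under that assumption (the function name is illustrative):

```python
def global_direction(improved_dirs):
    """Component-wise mean of the direction vectors whose subproblems
    were improved in the last generation."""
    S = len(improved_dirs)
    dim = len(improved_dirs[0])
    return [sum(v[j] for v in improved_dirs) / S for j in range(dim)]
```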

In the process of global search, two DE operators are combined to generate the offspring. Their formulations are introduced as follows [30, 35, 36]:

(i) DE/rand-to-best/1/bin:
v_{i,j} = x_{r1,j} + F · (x_{best,j} − x_{r1,j}) + F · (x_{r2,j} − x_{r3,j})
u_{i,j} = v_{i,j}, if rand_j ≤ CR or j = j_rand; otherwise u_{i,j} = x_{i,j}

(ii) DE/current-to-rand/1:
u_i = x_i + rand · (x_{r1} − x_i) + F · (x_{r2} − x_{r3})

where v_i represents the ith mutant vector and v_{i,j} denotes its jth dimension. r1, r2, and r3 are three distinct integers in {1, ..., m} that differ from i. x_best is the best individual according to the weighted sum, CR denotes the crossover probability, and rand is a random value in [0, 1].

With respect to (14), the best solution x_best is utilized to enhance convergence. In (15), a randomly chosen solution is employed to promote diversity. In this paper, these two operators are each executed with a probability of 0.5; the effectiveness of this scheme has been validated in [22, 37]. The whole process of the global search algorithm is introduced in Algorithm 6.

Input: Initial population , Direction vector .
Output: Offspring population .
(1) Set ;
(2) for do
(3)  Randomly generate an F value from [0, 1];
(4)  Randomly generate a CR value from [0, 1];
(5)  if then
(6)   Create a candidate solution according to (14);
(7)  else
(8)   Create a candidate solution according to (15);
(9)  end if
(10)   ;
(11) end for
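The two operators used in Algorithm 6 are standard DE variants; the sketch below follows their usual textbook formulations (variable names are illustrative):

```python
import random

def rand_to_best_1_bin(x_i, x_best, x_r1, x_r2, x_r3, F, CR):
    """DE/rand-to-best/1 mutation followed by binomial crossover."""
    d = len(x_i)
    v = [x_r1[j] + F * (x_best[j] - x_r1[j]) + F * (x_r2[j] - x_r3[j])
         for j in range(d)]
    j_rand = random.randrange(d)  # guarantees at least one mutated gene
    return [v[j] if (random.random() < CR or j == j_rand) else x_i[j]
            for j in range(d)]

def current_to_rand_1(x_i, x_r1, x_r2, x_r3, F):
    """DE/current-to-rand/1: rotation-invariant, no separate crossover."""
    k = random.random()
    return [x_i[j] + k * (x_r1[j] - x_i[j]) + F * (x_r2[j] - x_r3[j])
            for j in range(len(x_i))]
```

DE/current-to-rand/1 folds the random combination into the mutation itself, which is why Algorithm 6 applies no binomial crossover to it.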
3.5. DVA

DVA is the major component when solving the transformed BOP through the decomposition-based local and global search models. For MOPs, the images of all Pareto optimal solutions are distributed on the PF [38]. However, for a BOP, only one global optimal solution needs to be obtained. Therefore, the direction vectors need to be adjusted to fit the characteristics of COPs. Following the DVA proposed by Wang et al. [30], the direction vector is adjusted according to a sigmoid function, where t_max represents the maximum generation number and a and b are two positive values that control the change trend of the weight. Moreover, the ε-constrained method is used to determine whether the weight should be reduced to a small value for COPs. The details of DVA are given in Algorithm 7.

(1) Set ;
(2) if then
(3)  ;
(4) else
(5)  ;
(6) end if
(7) Calculate the proportion of feasible solutions () in ;
(8) if then
(9)  ;
(10) else
(11)  ;
(12) end if
(13) for do
(14)  ;
(15)  ;
(16) end for
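The exact sigmoid expression is not recoverable from the extracted text; the following is a hypothetical sketch of a sigmoid schedule consistent with the description (a weight that changes smoothly with the generation number t, with a and b controlling the change trend; the decay direction and parameter roles are assumptions):

```python
import math

def sigmoid_weight(t, t_max, a=30.0, b=0.75):
    """Sigmoid schedule over generations: stays near 1 early in the run
    and decays toward 0 as t/t_max passes the threshold b; a controls
    the steepness of the transition."""
    return 1.0 / (1.0 + math.exp(a * (t / t_max - b)))
```

With a = 30 and b = 0.75 (the values reported in Section 4.1), such a schedule keeps the weight close to 1 for roughly the first three quarters of the run and then drops it quickly, shifting the search emphasis in the later generations.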

4. Results and Discussion

4.1. Experiment Settings

To test the performance of HyCO, two sets of COPs are adopted. The first set contains 36 COPs from IEEE CEC 2010 [39], and the second set involves 56 COPs from IEEE CEC 2017. These problems have different characteristics, such as multimodality, extremely strong nonlinearity, and rotated landscapes. The population size (m), the number of subpopulations (K), and the total number of function evaluations (T_FEs) are reported in Table 1, where d represents the dimension of the COPs. In addition, each COP is run 25 times independently. The threshold in the restart strategy is set to 10^−4. The two parameters in the ε-constrained method are set to 0.85 and 6, respectively. In the sigmoid function, a and b are set to 30 and 0.75, respectively.

4.2. Experiments on the 36 COPs from IEEE CEC 2010

First of all, the 36 COPs from CEC 2010 are tested in this section. Five competitive methods are selected: FROFI [40], ITLBO [41], DeCODE [30], AIS-IRP [42], and ECHTDE [43]. The experimental results of these methods are obtained from the literature [30, 37]. Since the true optimal values of this test suite are unknown, “Mean O” and “S” are selected as the comparison criteria, which denote the mean and standard deviation of the results, respectively. Furthermore, the multiple-problem Wilcoxon test and the Friedman test are conducted via the KEEL software [44].

In the case of COPs with 10 d, the results of “Mean O” and “S,” the Friedman test, and the multiple-problem Wilcoxon test are given in Tables 2–4, respectively. In Table 2, “▽” indicates that the compared algorithm cannot find any feasible solution after the evolution, and “+,” “≈,” and “−” indicate that HyCO is worse than, competitive with, and better than the selected method, respectively. It can be seen from Table 2 that HyCO surpasses FROFI, ITLBO, DeCODE, AIS-IRP, and ECHTDE on 6, 7, 5, 9, and 9 test problems, respectively. In contrast, FROFI, ITLBO, DeCODE, AIS-IRP, and ECHTDE outperform HyCO on 3, 4, 3, 6, and 6 test problems, respectively. As shown in Table 4, the R− values do not exceed the R+ values in any comparison. Furthermore, according to the Friedman test, HyCO obtains the first rank. Based on these results, HyCO is superior to the other compared methods on the 18 COPs with 10 d from CEC 2010.

In terms of d = 30, all results are recorded in Tables 5–7, respectively. As described in Table 5, HyCO is superior to FROFI, ITLBO, DeCODE, AIS-IRP, and ECHTDE on 6, 14, 6, 15, and 15 test problems, respectively. In contrast, FROFI, ITLBO, DeCODE, AIS-IRP, and ECHTDE exhibit better performance than HyCO on 1, 0, 2, 2, and 1 test problems, respectively. From Table 6, HyCO obtains the first rank. In addition, according to the multiple-problem Wilcoxon test, the R− values do not exceed the R+ values in any comparison, and significant differences can be observed in three cases (i.e., HyCO vs. AIS-IRP, HyCO vs. ECHTDE, and HyCO vs. ITLBO). In summary, HyCO performs better than the other compared methods on the 18 COPs with 30 d from CEC 2010.

4.3. Experiments on the 56 COPs from IEEE CEC 2017

To evaluate the performance of HyCO on complicated COPs, 56 high-dimensional COPs from IEEE CEC 2017 are employed. Two methods derived from the IEEE CEC 2017 competition are selected as competitors: LSHADE44 [45] and UDE [46]. The results are reported in Table 8. None of the three algorithms can find feasible solutions on the test functions C17, C18, C19, C26, C27, and C28, so these functions are removed from the comparison.

As described in Table 8, with respect to 28 COPs with 50 d from CEC 2017, HyCO surpasses LSHADE44 and UDE on 12 and 16 test problems, respectively. However, LSHADE44 and UDE provide better results on 6 and 2 test functions, respectively. In terms of 28 COPs with 100 d from CEC 2017, HyCO outperforms LSHADE44 and UDE on 12 and 17 test functions, respectively, while LSHADE44 and UDE perform better than HyCO on 7 and 3 test problems, respectively. Therefore, HyCO exhibits better performance for high-dimensional COPs.

4.4. Visualization of the Evolution Process

The convergence graphs of HyCO on six representative COPs are plotted in Figure 3. As shown in Figure 3, at the early evolving stage, the convergence speed is slow, and the local search model plays an important role in guiding the population to explore more promising areas. Along with the evolution, the convergence rate becomes faster, and some individuals in the population are feasible. At this time, the global search model plays an important role in guiding the population toward the feasible region sufficiently. Therefore, the local and global search models proposed in this paper can achieve a balance between convergence and diversity.

4.5. Sensitivity of Parameter K

The effect of the number of subpopulations K on HyCO is investigated in this section; numerical experiments are conducted with five alternative K values: 8, 10, 12, 14, and 16. The results on the 18 COPs with 30 d from CEC 2010 are given in Table 9. As shown in Table 9, HyCO achieves the best results when K = 15. Specifically, HyCO with K = 15 provides better results than K = 8, K = 10, K = 12, K = 14, and K = 16 on 8, 6, 4, 4, and 3 test problems, respectively, while HyCO with K = 8, K = 10, K = 12, K = 14, and K = 16 performs better than K = 15 on 0, 1, 0, 0, and 1 test problems, respectively. Therefore, K = 15 is a suitable setting for the 18 COPs with 30 d from CEC 2010.

4.6. Real-World Application

To test the performance of HyCO on real-world COPs, five engineering design problems are adopted. The details of these five engineering problems are obtained from the literature [47]. CMODE [48], a representative constrained optimization method, is selected as a competitor. The maximum numbers of function evaluations for these five problems are set to 500, 70000, 10000, 10000, and 5000, respectively. The population size and the number of subpopulations are set to 100 and 15, respectively. The parameters of CMODE are consistent with the original literature. The results of the two methods are reported in Table 10.

As shown in Table 10, HyCO outperforms CMODE on 3 engineering design problems, while CMODE is not better than HyCO on any problem. In summary, HyCO is effective for solving real-world engineering optimization problems.

5. Conclusion

In this paper, HyCO is designed to solve COPs. In this method, the local and global search models are designed to balance diversity and convergence. To balance the constraints and the objective function, the direction vector is adjusted according to the direction vector adjustment strategy. Experimental results on three benchmark sets, namely, 36 COPs from IEEE CEC 2010, 56 COPs from IEEE CEC 2017, and 5 real-world engineering design problems, support the following conclusions: (1) HyCO is more competitive than the other selected methods. (2) The local and global search models can achieve a balance between diversity and convergence. (3) The direction vector adjustment strategy can guide the population to converge to the feasible optimal solution.

In our future research, it would be meaningful to design a self-adaptive direction vector adjustment strategy in HyCO to solve high-dimensional test functions. In addition, online learning [49–51] will be introduced into constrained optimization in the future.

Data Availability

Data and code are available upon request through sending an e-mail to [email protected].

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

This work was supported in part by the National Nature Science Foundation of China under Grants 61772391 and 62106186, in part by the Natural Science Basic Research Plan in Shaanxi Province of China under Grant 2022JQ-670, and in part by the Fundamental Research Funds for the Central Universities under Grant QTZX22047.