A Recursive Decomposition Method for Large Scale Continuous Optimization.
- Source :
- IEEE Transactions on Evolutionary Computation; Oct 2018, Vol. 22, Issue 5, p647-661, 15p
- Publication Year :
- 2018
Abstract
- Cooperative co-evolution (CC) is an evolutionary computation framework that can be used to solve high-dimensional optimization problems via a “divide-and-conquer” mechanism. However, the main challenge when using this framework lies in problem decomposition, that is, in deciding how to allocate decision variables, especially interacting ones, to particular subproblems. Existing decomposition methods are typically computationally expensive. In this paper, we propose a new decomposition method, which we call recursive differential grouping (RDG), that identifies interactions between decision variables based on nonlinearity detection. RDG recursively examines the interaction between a selected decision variable and the remaining variables, placing all interacting decision variables into the same subproblem. We use analytical methods to show that RDG can decompose a problem efficiently, without explicitly examining all pairwise variable interactions. We evaluated the efficacy of the RDG method using large-scale benchmark optimization problems. Numerical simulation experiments showed that RDG greatly improved the efficiency of problem decomposition in terms of time complexity. Significantly, when RDG was embedded in a CC framework, the optimization results were better than results from seven other decomposition methods. [ABSTRACT FROM AUTHOR]
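- The abstract describes RDG only at a high level. The sketch below illustrates the general idea of differential-grouping-style interaction detection and the recursive narrowing over the remaining variables; it is an illustrative reading under assumptions, not the paper's exact algorithm. The function names, the choice of perturbation points (lower bound, upper bound, midpoint), the threshold `eps`, and the objective `f` taking a NumPy vector are all assumptions made for the example.

```python
import numpy as np

def interact(f, lb, ub, idx1, idx2, eps):
    """Nonlinearity detection: do variable sets idx1 and idx2 interact?

    Compares the fitness change caused by perturbing idx1 (lower to upper
    bound) when idx2 sits at its lower bound versus when idx2 is moved to
    the middle of its range. For an additively separable pair the two
    changes are equal, so a noticeable difference signals interaction.
    """
    x_ll = lb.copy()                      # all variables at lower bound
    x_ul = lb.copy()
    x_ul[idx1] = ub[idx1]                 # idx1 perturbed to upper bound
    delta1 = f(x_ll) - f(x_ul)

    x_lm, x_um = x_ll.copy(), x_ul.copy()
    mid = (lb[idx2] + ub[idx2]) / 2.0
    x_lm[idx2] = mid                      # repeat the same perturbation
    x_um[idx2] = mid                      # with idx2 shifted to the midpoint
    delta2 = f(x_lm) - f(x_um)

    return abs(delta1 - delta2) > eps

def recursive_interaction(f, lb, ub, idx1, idx2, eps):
    """Return the variables in idx2 that interact with idx1.

    Rather than testing idx1 against each variable in idx2 one by one,
    the remaining variables are tested as a block and split in half only
    when an interaction is detected, which keeps the number of fitness
    evaluations low.
    """
    if not interact(f, lb, ub, idx1, idx2, eps):
        return []                         # idx1 is separable from all of idx2
    if len(idx2) == 1:
        return list(idx2)                 # one interacting variable pinned down
    half = len(idx2) // 2
    return (recursive_interaction(f, lb, ub, idx1, idx2[:half], eps)
            + recursive_interaction(f, lb, ub, idx1, idx2[half:], eps))
```

- A full decomposition in this style would repeatedly apply such a routine, merging the newly identified interacting variables into the current subproblem and moving on to the next unassigned variable until every variable is grouped.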
Details
- Language :
- English
- ISSN :
- 1089-778X
- Volume :
- 22
- Issue :
- 5
- Database :
- Complementary Index
- Journal :
- IEEE Transactions on Evolutionary Computation
- Publication Type :
- Academic Journal
- Accession number :
- 132127317
- Full Text :
- https://doi.org/10.1109/TEVC.2017.2778089