1. Incremental Recursive Ranking Grouping for Large Scale Global Optimization
- Authors
Komarnicki, Marcin Michal; Przewozniczek, Michal Witold; Kwasnicka, Halina; Walkowiak, Krzysztof
- Subjects
Computer Science - Neural and Evolutionary Computing
- Abstract
Real-world optimization problems may have different underlying structures. In black-box optimization, the dependencies between decision variables remain unknown, although some techniques can discover such interactions accurately. In Large Scale Global Optimization (LSGO), problems are high-dimensional, and decomposing them into subproblems that are optimized separately has been shown to be effective. The effectiveness of such approaches, however, may depend heavily on the accuracy of the problem decomposition. Many state-of-the-art decomposition strategies are derived from Differential Grouping (DG). However, if a given problem consists of non-additively separable subproblems, DG-based strategies may report many non-existing interactions. Monotonicity-checking strategies proposed so far, on the other hand, do not report non-existing interactions for any separable subproblems, but may miss many of the existing ones. Therefore, we propose Incremental Recursive Ranking Grouping (IRRG), which suffers from neither of these flaws. IRRG consumes more fitness function evaluations than recent DG-based propositions, e.g., Recursive DG 3 (RDG3). Nevertheless, the effectiveness of the considered Cooperative Co-evolution frameworks after embedding IRRG or RDG3 was similar for problems with additively separable subproblems, which suit RDG3. After replacing additive separability with non-additive separability, embedding IRRG leads to results of significantly higher quality.
- Published
2022
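The abstract contrasts two families of pairwise interaction tests: DG-style additivity checks, which can report spurious interactions for non-additively separable subproblems, and monotonicity checks, which avoid such false positives but can miss real interactions. The sketch below is a rough illustration of those two generic tests only, not of the IRRG algorithm from the paper; the function names, the perturbation step `delta`, the threshold `eps`, and the toy fitness functions are assumptions made for the example.

```python
import numpy as np

def dg_interaction(f, x, i, j, delta=10.0, eps=1e-6):
    """DG-style check: flag x_i and x_j as interacting when the change in f
    caused by perturbing x_i depends on the value of x_j. For additively
    separable pairs the two deltas are identical, so any difference above
    eps is treated as an interaction."""
    p1 = np.array(x, dtype=float)
    p2 = p1.copy(); p2[i] += delta          # perturb x_i only
    p3 = p1.copy(); p3[j] += delta          # perturb x_j only
    p4 = p2.copy(); p4[j] += delta          # perturb both
    d1 = f(p2) - f(p1)                      # effect of x_i at the original x_j
    d2 = f(p4) - f(p3)                      # effect of x_i at the shifted x_j
    return abs(d1 - d2) > eps

def monotonicity_interaction(f, x, i, j, delta=10.0):
    """Monotonicity-style check: flag an interaction only when changing x_j
    flips the sign of the fitness change caused by perturbing x_i. Separable
    pairs (additive or not) never trigger it, but interactions that preserve
    the ordering are missed."""
    p1 = np.array(x, dtype=float)
    p2 = p1.copy(); p2[i] += delta
    p3 = p1.copy(); p3[j] += delta
    p4 = p2.copy(); p4[j] += delta
    d1 = f(p2) - f(p1)
    d2 = f(p4) - f(p3)
    return np.sign(d1) != np.sign(d2)

if __name__ == "__main__":
    x0 = np.zeros(2)

    # Multiplicatively (non-additively) separable toy function:
    # each factor can be minimized independently.
    f_mult = lambda x: (1.0 + x[0] ** 2) * (1.0 + x[1] ** 2)
    print(dg_interaction(f_mult, x0, 0, 1))            # True  -> spurious interaction
    print(monotonicity_interaction(f_mult, x0, 0, 1))  # False -> correctly separable

    # Genuinely non-separable toy function with a sign flip.
    f_dep = lambda x: x[0] * x[1]
    print(dg_interaction(f_dep, x0 + 1.0, 0, 1))            # True  -> detected
    print(monotonicity_interaction(f_dep, x0 - 20.0, 0, 1)) # True  -> detected via sign flip
```

On the multiplicatively separable toy function, the DG-style test reports an interaction even though the variables can be optimized independently, while the monotonicity-style test does not; this mirrors the trade-off between false positives and missed interactions that the abstract describes IRRG as avoiding.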