
Multi-objective evolutionary algorithms are generally good: Maximizing monotone submodular functions over sequences.

Authors :
Qian, Chao
Liu, Dan-Xuan
Feng, Chao
Tang, Ke
Source :
Theoretical Computer Science. Jan 2023, Vol. 943, p241-266. 26p.
Publication Year :
2023

Abstract

Evolutionary algorithms (EAs) are general-purpose optimization algorithms inspired by natural evolution. Recent theoretical studies have shown that EAs can achieve good approximation guarantees for problem classes in submodular optimization, which has a wide range of applications such as maximum coverage, sparse regression, influence maximization, document summarization and sensor placement, to name a few. Though these results provide some theoretical explanation for the general-purpose nature of EAs, the submodular objective functions considered are defined only over sets or multisets. To complement this line of research, this paper studies the problem class of maximizing monotone submodular functions over sequences, where the objective function depends on the order of items. We prove that for each kind of previously studied monotone submodular objective function over sequences, i.e., prefix monotone submodular functions, weakly monotone and strongly submodular functions, and DAG monotone submodular functions, a simple multi-objective EA, the GSEMO, can always reach or improve the best known approximation guarantee within polynomial expected running time. Note that, previously, these best known approximation guarantees could only be obtained by different greedy-style algorithms. Empirical studies on various applications, e.g., accomplishing tasks, maximizing information gain, search-and-tracking and recommender systems, show the excellent performance of the GSEMO.
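The GSEMO mentioned in the abstract typically casts a constrained maximization problem as a bi-objective one, maximizing the objective value while minimizing solution size, and maintains a population of mutually non-dominated trade-offs. The sketch below is a minimal, illustrative Python implementation of such a GSEMO-style loop over sequences; the insertion/deletion mutation operator, acceptance rule and the user-supplied objective `f` (which must accept a tuple of items, including the empty tuple) are assumptions for illustration and may differ from the operators analyzed in the paper.

```python
import random


def gsemo_sequences(items, f, budget, iterations=10000, seed=0):
    """GSEMO-style bi-objective search over sequences (illustrative sketch).

    Maximizes f(sequence) while keeping sequences short, maintaining a
    population of mutually non-dominated (f-value, length) trade-offs.
    """
    rng = random.Random(seed)

    def weakly_dominates(a, b):
        # a = (f-value, length); a weakly dominates b if it is no worse
        # in value and no longer.
        return a[0] >= b[0] and a[1] <= b[1]

    # Start from the empty sequence, as GSEMO usually does.
    population = {(): (f(()), 0)}

    for _ in range(iterations):
        parent = rng.choice(list(population.keys()))
        child = list(parent)

        # Mutation (a simplification): delete a random position, or insert a
        # random unused item at a random position.
        if child and rng.random() < 0.5:
            del child[rng.randrange(len(child))]
        else:
            unused = [x for x in items if x not in child]
            if unused:
                child.insert(rng.randrange(len(child) + 1), rng.choice(unused))

        child = tuple(child)
        if len(child) > budget:
            continue  # respect the sequence-length constraint

        obj = (f(child), len(child))
        if any(weakly_dominates(o, obj) for o in population.values()):
            continue  # the child is weakly dominated by an existing solution
        # Otherwise keep the child and drop solutions it weakly dominates.
        population = {s: o for s, o in population.items()
                      if not weakly_dominates(obj, o)}
        population[child] = obj

    # Return the sequence with the best objective value in the final population.
    return max(population, key=lambda s: population[s][0])
```

The iteration budget is the key practical parameter here: the paper's guarantees are stated for an expected polynomial number of iterations, while this sketch simply runs for a fixed number of steps.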

Details

Language :
English
ISSN :
0304-3975
Volume :
943
Database :
Academic Search Index
Journal :
Theoretical Computer Science
Publication Type :
Academic Journal
Accession number :
161080068
Full Text :
https://doi.org/10.1016/j.tcs.2022.12.011