Abstract

Metaheuristics have been established as effective solvers for challenging optimization problems. However, their performance depends strongly on their parameter settings. For this reason, various parameter tuning techniques have been developed, spanning two major categories: online techniques, which exploit performance data gathered during the algorithm's run, and offline techniques, which rely on preprocessing or historical performance data. As an alternative, we propose a general online parameter adaptation method based on estimates of the algorithm's performance and gradient search in the parameter domain. The proposed method is demonstrated on Differential Evolution, a state-of-the-art metaheuristic for continuous optimization. Our experimental validation includes problems of low and high dimension, as well as comparisons with distinguished adaptive algorithms. The obtained results suggest that the proposed approach is beneficial, relieving the user of the burden of proper parameterization.

Highlights

• Metaheuristics can be effective solvers under proper parameterization.
• We propose GPALS, a general online parameter adaptation method for metaheuristics.
• GPALS is based on approximate gradient search and line search in the parameter domain.
• The method is demonstrated on the Differential Evolution algorithm for two test suites.
• Significant benefits for the user and the algorithm are gained.
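To make the core idea concrete, the following is a minimal sketch of one adaptation step of the kind the abstract describes: a finite-difference estimate of the performance gradient with respect to the algorithm's parameters, followed by a backtracking line search constrained to the feasible parameter box. The functions `estimate_gradient` and `adapt_parameters`, and the toy performance surface, are illustrative assumptions, not the authors' GPALS implementation; in practice the performance estimate would come from observed progress of the running metaheuristic (e.g., Differential Evolution's F and CR).

```python
import numpy as np

def estimate_gradient(perf, theta, eps=1e-2):
    """Forward finite-difference estimate of the gradient of the
    performance estimate `perf` at parameter vector `theta`.
    (Hypothetical stand-in for the paper's performance estimator.)"""
    base = perf(theta)
    grad = np.zeros_like(theta)
    for i in range(len(theta)):
        shifted = theta.copy()
        shifted[i] += eps
        grad[i] = (perf(shifted) - base) / eps
    return grad

def adapt_parameters(perf, theta, bounds, step=0.5, shrink=0.5, max_tries=10):
    """One online adaptation step: move `theta` along the estimated
    gradient (maximizing perf), backtracking the step size until
    performance does not deteriorate; clip to the feasible box."""
    grad = estimate_gradient(perf, theta)
    base = perf(theta)
    for _ in range(max_tries):
        candidate = np.clip(theta + step * grad, bounds[:, 0], bounds[:, 1])
        if perf(candidate) >= base:  # accept non-deteriorating step
            return candidate
        step *= shrink               # line search: shrink and retry
    return theta                     # keep current parameters

# Toy usage: a synthetic performance surface peaking at (F, CR) = (0.5, 0.9),
# standing in for estimated DE performance (illustrative only).
perf = lambda p: -((p[0] - 0.5) ** 2 + (p[1] - 0.9) ** 2)
bounds = np.array([[0.0, 1.0], [0.0, 1.0]])  # feasible box for F and CR
theta = np.array([0.9, 0.2])                 # deliberately poor start
for _ in range(50):
    theta = adapt_parameters(perf, theta, bounds)
```

Under these assumptions the parameters drift toward the performance peak while never leaving the feasible box; with a noisy, run-time performance estimate the same loop would be interleaved with the metaheuristic's iterations.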