Comparison of universal approximators incorporating partial monotonicity by structure
- Author
- Bernhard Lang, Marina Velikova, Hennie Daniels, Alexey Minin (Department of Technology and Operations Management; Research Group: Operations Research)
- Subjects
- Mathematical optimization; Artificial neural network; Cognitive Neuroscience; Stability (learning theory); Computational Biology; Monotonic function; Perceptron; Pattern Recognition, Automated; Monotone polygon; Function approximation; Approximation error; Artificial Intelligence; Convergence (routing); Software Science; Computer Simulation; Neural Networks, Computer; Algorithms; Mathematics
- Abstract
- Neural networks applied in control loops and safety-critical domains have to meet more requirements than just the overall best function approximation. On the one hand, a small approximation error is required; on the other hand, the smoothness and the monotonicity of selected input–output relations have to be guaranteed. Otherwise, the stability of most control laws is lost. In this article we compare two neural-network approaches that incorporate partial monotonicity by structure, namely the Monotonic Multi-Layer Perceptron (MONMLP) network and the Monotonic MIN–MAX (MONMM) network. We show the universal approximation capabilities of both types of network for partially monotone functions. On a number of datasets, we investigate the advantages and disadvantages of these approaches with respect to approximation performance, model training, and convergence.
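To make "monotonicity by structure" concrete, below is a minimal NumPy sketch of the two ideas the abstract names. It is not the authors' implementation: all shapes and names are illustrative, and the exp() reparameterization is just one common way to keep sign-constrained weights positive. In an MONMLP-style network, every weight on a path from a monotone input to the output is kept positive and the activations are monotone; in a min–max (MONMM-style) network, the output is a min over groups of maxes of hyperplanes whose slopes on the monotone inputs are kept positive.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: d inputs, the first k of which must be monotone.
d, k = 3, 1

# --- MONMLP-style sketch: one hidden layer of h tanh units ---------------
h = 8
V_mono = rng.normal(size=(h, k))      # first-layer weights from monotone inputs
V_free = rng.normal(size=(h, d - k))  # first-layer weights from free inputs
w_out, b_hid, b_out = rng.normal(size=h), rng.normal(size=h), 0.0

def monmlp(x):
    """Nondecreasing in x[:k] by construction: exp() keeps the constrained
    weights positive and tanh is monotone, so every path from a monotone
    input to the output preserves order, whatever values training finds."""
    pre = np.exp(V_mono) @ x[:k] + V_free @ x[k:] + b_hid
    return np.exp(w_out) @ np.tanh(pre) + b_out

# --- MONMM-style sketch: min over K groups of max over J hyperplanes -----
K, J = 4, 3
A_mono = rng.normal(size=(K, J, k))      # slopes on monotone inputs (constrained)
A_free = rng.normal(size=(K, J, d - k))  # slopes on free inputs (any sign)
B = rng.normal(size=(K, J))

def monmm(x):
    """Piecewise-linear and nondecreasing in x[:k]: each hyperplane has a
    positive slope in the monotone inputs, and max/min preserve that."""
    planes = (np.einsum('kji,i->kj', np.exp(A_mono), x[:k])
              + np.einsum('kji,i->kj', A_free, x[k:]) + B)
    return planes.max(axis=1).min()

# Monotonicity holds for arbitrary (here random, untrained) parameters:
xs = np.linspace(-2.0, 2.0, 200)
for f in (monmlp, monmm):
    ys = [f(np.array([t, 0.5, -0.3])) for t in xs]
    assert all(later >= earlier for earlier, later in zip(ys, ys[1:]))
```

Because the constraint is architectural rather than a training penalty, monotonicity in the selected inputs holds for every parameter setting, which is why the final check passes even before any fitting.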
- Published
- 2010