Comparison of universal approximators incorporating partial monotonicity by structure

A. Minin, M.V. Velikova, B. Lang, H.A.M. Daniëls

Research output: Contribution to journal › Article › Scientific › peer-review

19 Citations (Scopus)

Abstract

Neural networks applied in control loops and safety-critical domains have to meet more requirements than just the overall best function approximation. On the one hand, a small approximation error is required; on the other hand, the smoothness and the monotonicity of selected input–output relations have to be guaranteed, since otherwise the stability of most control laws is lost. In this article we compare two neural network-based approaches that incorporate partial monotonicity by structure, namely the Monotonic Multi-Layer Perceptron (MONMLP) network and the Monotonic MIN–MAX (MONMM) network. We show the universal approximation capabilities of both types of network for partially monotone functions. On a number of datasets, we investigate the advantages and disadvantages of these approaches with respect to approximation performance, model training, and convergence.
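To make "monotonicity by structure" concrete, the following is a minimal sketch (not the authors' implementation; all weights and shapes are illustrative) of both ideas. A MONMLP-style network guarantees monotonicity in a chosen input by forcing the weights on that input's paths to be positive while using increasing activations; a MONMM-style network takes a max over groups of minima of hyperplanes whose coefficient on the monotone input is constrained positive.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- MONMLP-style sketch: partial monotonicity in x[0] by construction ---
# Weights from the monotone input x[0] and the hidden-to-output weights
# are passed through exp(), so they are always positive; combined with an
# increasing activation (tanh), the output is non-decreasing in x[0].
# x[1] is unconstrained.
W_mono = rng.normal(size=(4, 1))   # raw weights for the monotone input
W_free = rng.normal(size=(4, 1))   # unconstrained weights for the free input
b = rng.normal(size=4)
v = rng.normal(size=4)             # raw output-layer weights
c = 0.0

def monmlp(x):
    """Partial-monotone MLP: non-decreasing in x[0], free in x[1]."""
    h = np.tanh(np.exp(W_mono) @ x[:1] + W_free @ x[1:] + b)
    return float(np.exp(v) @ h + c)

# --- MONMM-style sketch: max over 2 groups of min over 3 hyperplanes ---
# Each hyperplane's coefficient on x[0] is positive (via exp), so each
# plane is non-decreasing in x[0]; min and max preserve monotonicity.
A_mono = np.exp(rng.normal(size=(2, 3, 1)))  # positive coeffs on x[0]
A_free = rng.normal(size=(2, 3, 1))          # free coeffs on x[1]
B = rng.normal(size=(2, 3))

def monmm(x):
    """Monotone MIN-MAX network: non-decreasing in x[0], free in x[1]."""
    planes = A_mono @ x[:1] + A_free @ x[1:] + B   # shape (2, 3)
    return float(np.max(np.min(planes, axis=1)))

# Monotonicity check along x[0] with x[1] held fixed:
xs = np.linspace(-3, 3, 50)
ys_mlp = [monmlp(np.array([x0, 0.5])) for x0 in xs]
ys_mm = [monmm(np.array([x0, 0.5])) for x0 in xs]
assert all(y2 >= y1 for y1, y2 in zip(ys_mlp, ys_mlp[1:]))
assert all(y2 >= y1 for y1, y2 in zip(ys_mm, ys_mm[1:]))
```

The constraint is enforced by reparameterization rather than by penalizing violations, which is what "by structure" refers to: monotonicity holds for every weight setting, including during training.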

