Comparison of universal approximators incorporating partial monotonicity by structure

A. Minin, M.V. Velikova, B. Lang, H.A.M. Daniëls

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

Neural networks applied in control loops and safety-critical domains have to meet more requirements than just the overall best function approximation. On the one hand, a small approximation error is required; on the other hand, the smoothness and the monotonicity of selected input–output relations have to be guaranteed. Otherwise, the stability of most of the control laws is lost. In this article we compare two neural network-based approaches incorporating partial monotonicity by structure, namely the Monotonic Multi-Layer Perceptron (MONMLP) network and the Monotonic MIN–MAX (MONMM) network. We show the universal approximation capabilities of both types of network for partially monotone functions. On a number of datasets, we investigate the advantages and disadvantages of these approaches related to approximation performance, training of the model and convergence.
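
Background sketch (not part of the published record): both architectures enforce monotonicity "by structure" by constraining the signs of the weights attached to the monotone inputs, so the guarantee holds for every parameter setting rather than only after a successful penalty-based training run. The NumPy sketch below illustrates the two ideas under stated assumptions: an MLP whose weights on the monotone inputs are kept positive through an exponential reparameterization (the MONMLP idea), and a min-max network built from sign-constrained linear units (the MONMM idea). All function names, shapes and hyperparameters here are illustrative, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

def monmlp_forward(x, W1_raw, b1, w2_raw, b2, mono_idx):
    """One-hidden-layer MLP, increasing in the inputs listed in mono_idx.

    Weights feeding from the monotone inputs are forced positive via
    exp(), as are the hidden-to-output weights; since tanh is increasing,
    the composition stays increasing in those inputs by construction.
    """
    W1 = W1_raw.copy()
    W1[:, mono_idx] = np.exp(W1_raw[:, mono_idx])  # positive first-layer weights
    w2 = np.exp(w2_raw)                            # positive output weights
    h = np.tanh(x @ W1.T + b1)
    return h @ w2 + b2

def monmm_forward(x, W_raw, b, mono_idx):
    """Min-max network: minimum over groups of maxima of linear units.

    W_raw has shape (groups, units, dim). Coefficients on the monotone
    inputs are made positive, so every linear unit -- and hence the
    max/min composition -- is increasing in those inputs.
    """
    W = W_raw.copy()
    W[:, :, mono_idx] = np.exp(W_raw[:, :, mono_idx])
    z = np.einsum('gud,d->gu', W, x) + b           # all linear units at once
    return z.max(axis=1).min()                     # max inside groups, min across

# Toy evaluation: a 2-input model constrained to be increasing in input 0 only.
dim, hidden, groups, units = 2, 8, 3, 4
x = np.array([0.5, -1.0])
y_mlp = monmlp_forward(x, rng.normal(size=(hidden, dim)), rng.normal(size=hidden),
                       rng.normal(size=hidden), 0.0, mono_idx=[0])
y_mm = monmm_forward(x, rng.normal(size=(groups, units, dim)),
                     rng.normal(size=(groups, units)), mono_idx=[0])
print(float(y_mlp), float(y_mm))

Because positivity is obtained by reparameterizing the weights rather than by a soft penalty, the partial monotonicity is exact for any parameter values, which is the property the abstract highlights for control loops and safety-critical use.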

Cite this

@article{3ea5c8f28c544e5d85aa8a71d5b23488,
  title     = "Comparison of universal approximators incorporating partial monotonicity by structure",
  author    = "A. Minin and M.V. Velikova and B. Lang and H.A.M. Dani{\"e}ls",
  year      = "2010",
  language  = "English",
  journal   = "Neural Networks",
  volume    = "23",
  pages     = "471--475",
  issn      = "0893-6080",
  publisher = "Elsevier Limited",
}
