TY - JOUR
T1 - Fuzzy-Rough Cognitive Networks
T2 - Theoretical Analysis and Simpler Models
AU - Concepcion, Leonardo
AU - Napoles, Gonzalo
AU - Grau, Isel
AU - Pedrycz, Witold
N1 - Publisher Copyright:
© 2013 IEEE.
PY - 2022/5/1
Y1 - 2022/5/1
N2 - Fuzzy-rough cognitive networks (FRCNs) are recurrent neural networks (RNNs) intended for structured classification purposes in which the problem is described by an explicit set of features. The advantage of this granular neural system lies in its transparency and simplicity while remaining competitive with state-of-the-art classifiers. Despite their relative empirical success in terms of prediction rates, there are limited studies on FRCNs' dynamic properties and on how their building blocks contribute to the algorithm's performance. In this article, we theoretically study these issues and conclude that boundary and negative neurons always converge to a unique fixed-point attractor. Moreover, we demonstrate that negative neurons have no impact on the algorithm's performance and that the ranking of positive neurons is invariant. Motivated by our theoretical findings, we propose two simpler fuzzy-rough classifiers that overcome the detected issues and maintain the competitive prediction rates of this classifier. Finally, we present a case study concerned with image classification, in which a convolutional neural network is coupled with one of the simpler models derived from the theoretical analysis of the FRCN model. The numerical simulations suggest that once the features have been extracted, our granular neural system performs as well as other RNNs.
AB - Fuzzy-rough cognitive networks (FRCNs) are recurrent neural networks (RNNs) intended for structured classification purposes in which the problem is described by an explicit set of features. The advantage of this granular neural system lies in its transparency and simplicity while remaining competitive with state-of-the-art classifiers. Despite their relative empirical success in terms of prediction rates, there are limited studies on FRCNs' dynamic properties and on how their building blocks contribute to the algorithm's performance. In this article, we theoretically study these issues and conclude that boundary and negative neurons always converge to a unique fixed-point attractor. Moreover, we demonstrate that negative neurons have no impact on the algorithm's performance and that the ranking of positive neurons is invariant. Motivated by our theoretical findings, we propose two simpler fuzzy-rough classifiers that overcome the detected issues and maintain the competitive prediction rates of this classifier. Finally, we present a case study concerned with image classification, in which a convolutional neural network is coupled with one of the simpler models derived from the theoretical analysis of the FRCN model. The numerical simulations suggest that once the features have been extracted, our granular neural system performs as well as other RNNs.
KW - Convergence
KW - fuzzy-rough cognitive networks (FRCNs)
KW - Granular computing
KW - Rough cognitive mapping
UR - http://www.scopus.com/inward/record.url?scp=85130767670&partnerID=8YFLogxK
U2 - 10.1109/TCYB.2020.3022527
DO - 10.1109/TCYB.2020.3022527
M3 - Article
C2 - 33027021
AN - SCOPUS:85130767670
SN - 2168-2267
VL - 52
SP - 2994
EP - 3005
JO - IEEE Transactions on Cybernetics
JF - IEEE Transactions on Cybernetics
IS - 5
ER -