TY - GEN
T1 - On the performance of the nonsynaptic backpropagation for training long-term cognitive networks
AU - Nápoles, Gonzalo
AU - Grau, Isel
AU - Concepción, Leonardo
AU - Salgueiro, Yamisleydi
N1 - Funding Information:
This paper was partially supported by the CONICYT Program FONDECYT de Postdoctorado through project 3200284, and by the Artificial Intelligence Research Program of the Flemish Government, Belgium.
Publisher Copyright:
© 2021 Institution of Engineering and Technology. All rights reserved.
PY - 2021
Y1 - 2021
AB - Long-term Cognitive Networks (LTCNs) are recurrent neural networks for modeling and simulation. Such networks can be trained in either a synaptic or a nonsynaptic mode, depending on their goal. Nonsynaptic learning refers to adjusting the transfer function parameters while preserving the weights connecting the neurons. In that regard, the Nonsynaptic Backpropagation (NSBP) algorithm has proven successful in training LTCN-based models. Despite NSBP’s success, a question worthy of investigation is whether the backpropagation process is actually necessary when training these recurrent neural networks. This paper investigates this issue and presents three nonsynaptic learning methods that modify the original algorithm. In addition, we perform a sensitivity analysis of both NSBP’s hyperparameters and the LTCNs’ learnable parameters. The main conclusions of our study are that i) the backward process attached to the NSBP algorithm is not necessary to train these recurrent neural systems, and ii) there is a nonsynaptic learnable parameter that does not contribute significantly to the LTCNs’ performance.
KW - Long-term cognitive networks
KW - Neural cognitive mapping
KW - Nonsynaptic learning
UR - http://www.scopus.com/inward/record.url?scp=85118124293&partnerID=8YFLogxK
U2 - 10.1049/icp.2021.1434
DO - 10.1049/icp.2021.1434
M3 - Conference contribution
AN - SCOPUS:85118124293
T3 - IET Conference Publications
SP - 25
EP - 30
BT - 11th International Conference of Pattern Recognition Systems, ICPRS 2021
PB - Institution of Engineering and Technology
T2 - 11th International Conference of Pattern Recognition Systems, ICPRS 2021
Y2 - 17 March 2021 through 19 March 2021
ER -