TY - GEN
T1 - Long Short-Term Cognitive Networks
T2 - An Empirical Performance Study
AU - Nápoles, Gonzalo
AU - Grau, Isel
N1 - DBLP License: DBLP's bibliographic metadata records provided through http://dblp.org/ are distributed under a Creative Commons CC0 1.0 Universal Public Domain Dedication. Although the bibliographic metadata records are provided consistent with CC0 1.0 Dedication, the content described by the metadata records is not. Content may be subject to copyright, rights of privacy, rights of publicity and other restrictions.
Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Long Short-Term Cognitive Networks (LSTCNs) are recurrent neural networks for univariate and multivariate time series forecasting. This interpretable neural system is rooted in the cognitive mapping formalism in the sense that both neural concepts and weights have a precise meaning for the problem being modeled. However, its weights are not constrained to any specific interval, thus conferring improved approximation capabilities on the model. Although the model was originally designed to handle very long time series, its performance remains unexplored on the shorter time series that often describe real-world applications. In this paper, we conduct an empirical study to assess both the efficacy and efficiency of the LSTCN model using 25 time series datasets and different prediction horizons. The numerical simulations show that, after hyper-parameter tuning, LSTCNs are as powerful as state-of-the-art deep learning algorithms, such as the Long Short-Term Memory and the Gated Recurrent Unit, in terms of forecasting error. However, in terms of training time, the LSTCN model largely outperforms the remaining recurrent neural networks, thus emerging as the winner in our study.
AB - Long Short-Term Cognitive Networks (LSTCNs) are recurrent neural networks for univariate and multivariate time series forecasting. This interpretable neural system is rooted in the cognitive mapping formalism in the sense that both neural concepts and weights have a precise meaning for the problem being modeled. However, its weights are not constrained to any specific interval, thus conferring improved approximation capabilities on the model. Although the model was originally designed to handle very long time series, its performance remains unexplored on the shorter time series that often describe real-world applications. In this paper, we conduct an empirical study to assess both the efficacy and efficiency of the LSTCN model using 25 time series datasets and different prediction horizons. The numerical simulations show that, after hyper-parameter tuning, LSTCNs are as powerful as state-of-the-art deep learning algorithms, such as the Long Short-Term Memory and the Gated Recurrent Unit, in terms of forecasting error. However, in terms of training time, the LSTCN model largely outperforms the remaining recurrent neural networks, thus emerging as the winner in our study.
KW - fuzzy cognitive maps
KW - long short-term cognitive networks
KW - recurrent neural networks
KW - time series
U2 - 10.1109/EAIS58494.2024.10570005
DO - 10.1109/EAIS58494.2024.10570005
M3 - Conference contribution
T3 - IEEE Conference on Evolving and Adaptive Intelligent Systems
SP - 1
EP - 8
BT - IEEE International Conference on Evolving and Adaptive Intelligent Systems 2024, EAIS 2024 - Proceedings
A2 - Iglesias Martinez, Jose Antonio
A2 - Baruah, Rashmi Dutta
A2 - Kangin, Dmitry
A2 - De Campos Souza, Paulo Vitor
ER -