Reconstruct Recurrent Neural Networks via Flexible Sub-Models for Time Series Classification.
- Source :
- Applied Sciences (2076-3417); Apr 2018, Vol. 8 Issue 4, p630, 21p
- Publication Year :
- 2018
Abstract
- Recurrent neural networks (RNNs) remain challenging to train and still lack long-term memory and learning ability for sequential data classification and prediction. In this paper, we propose a flexible recurrent model, BIdirectional COnvolutional RaNdom RNNs (BICORN-RNNs), incorporating a series of sub-models: random projection, convolutional operation, and bidirectional transmission. These sub-models improve classification accuracy, which is otherwise limited by the vanishing and exploding gradient problems. Experiments on public time series datasets demonstrate that our proposed method substantially outperforms a variety of existing models. Furthermore, the trade-off between accuracy and efficiency under a variety of factors, including SNR, sequence length, missing data, and overlapping, is also discussed. [ABSTRACT FROM AUTHOR]
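- Illustrative sketch: the abstract names three sub-models (random projection, convolutional operation, bidirectional transmission) but gives no implementation detail. The short PyTorch sketch below shows one plausible way to compose such pieces for time series classification; the layer order, sizes, GRU cell, and all identifiers are assumptions made for illustration and are not taken from the paper.

    import torch
    import torch.nn as nn

    class BicornSketch(nn.Module):
        """Hypothetical composition: random projection -> 1-D convolution -> bidirectional RNN."""
        def __init__(self, in_dim=1, proj_dim=16, conv_ch=32, hidden=64, n_classes=5):
            super().__init__()
            # Fixed (untrained) random projection applied to each time step.
            self.register_buffer("proj", torch.randn(in_dim, proj_dim) / proj_dim ** 0.5)
            # 1-D convolution along the time axis to capture local patterns.
            self.conv = nn.Conv1d(proj_dim, conv_ch, kernel_size=3, padding=1)
            # Bidirectional recurrent layer over the convolved sequence.
            self.rnn = nn.GRU(conv_ch, hidden, batch_first=True, bidirectional=True)
            self.fc = nn.Linear(2 * hidden, n_classes)

        def forward(self, x):                    # x: (batch, time, in_dim)
            x = x @ self.proj                    # random projection: (batch, time, proj_dim)
            x = self.conv(x.transpose(1, 2))     # convolution over time: (batch, conv_ch, time)
            out, _ = self.rnn(x.transpose(1, 2)) # bidirectional pass: (batch, time, 2*hidden)
            return self.fc(out[:, -1])           # classify from the final time step

    # Example: classify a batch of 8 univariate series of length 128.
    logits = BicornSketch()(torch.randn(8, 128, 1))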
- Subjects :
- Artificial neural networks
Random projection method
Time series analysis
Details
- Language :
- English
- ISSN :
- 2076-3417
- Volume :
- 8
- Issue :
- 4
- Database :
- Complementary Index
- Journal :
- Applied Sciences (2076-3417)
- Publication Type :
- Academic Journal
- Accession number :
- 129830314
- Full Text :
- https://doi.org/10.3390/app8040630