Employing Sequence-to-Sequence Stacked LSTM Autoencoder Architecture to Forecast Indian Weather.

Authors :
Maharatha, Arpita
Das, Ratnakar
Mishra, Jibitesh
Nayak, Soumya Ranjan
Aluvala, Srinivas
Source :
Procedia Computer Science; 2024, Vol. 235, p2258-2268, 11p
Publication Year :
2024

Abstract

Atmospheric temperature is a central quantity in current weather-forecasting research, and its value can change at any time. Predicting air temperature accurately is necessary because it matters to many communities and activities. In today's world of massive data volumes and unforeseen data fluctuations, long short-term memory (LSTM), a type of neural network, is becoming increasingly popular. The present work employs a sequence-to-sequence LSTM autoencoder to forecast Indian weather patterns. The prediction model was trained on historical data obtained from the Khordha district administration. The proposed method proceeds in two phases. First, the study compares four distinct variants of LSTM techniques for predicting temperature patterns: vanilla LSTM, stacked LSTM, bidirectional LSTM, and ConvLSTM. Second, the paper uses an LSTM autoencoder model to forecast the temperature for the subsequent two-year period in the Khordha district of Odisha, India. The experimental investigation demonstrates that stacking encoder-decoders with bidirectional LSTM cells notably enhances the accuracy of the model, yielding the best results. A notably lower mean absolute error (MAE) indicates that the 9-layer stacked autoencoder model performs best at predicting both the minimum and maximum temperature. [ABSTRACT FROM AUTHOR]
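
To illustrate the architecture the abstract describes, the following is a minimal Keras sketch of a stacked sequence-to-sequence LSTM autoencoder with bidirectional encoder cells, not the authors' released code: the window lengths, layer widths, and feature count (daily min/max temperature) are illustrative assumptions, since the abstract does not specify them.

    # Minimal sketch, assuming daily min/max temperature inputs;
    # window sizes and layer widths are hypothetical, not from the paper.
    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers

    N_PAST, N_FUTURE, N_FEATURES = 30, 7, 2  # assumed lookback, horizon, features

    model = keras.Sequential([
        layers.Input(shape=(N_PAST, N_FEATURES)),
        # Encoder: stacked bidirectional LSTM layers compress the input
        # window into a fixed-length representation.
        layers.Bidirectional(layers.LSTM(64, return_sequences=True)),
        layers.Bidirectional(layers.LSTM(32)),
        # Bridge: repeat the encoding once per forecast timestep.
        layers.RepeatVector(N_FUTURE),
        # Decoder: stacked LSTM layers unroll the forecast sequence.
        layers.LSTM(32, return_sequences=True),
        layers.LSTM(64, return_sequences=True),
        # One min/max temperature pair per forecast day.
        layers.TimeDistributed(layers.Dense(N_FEATURES)),
    ])
    model.compile(optimizer="adam", loss="mae")  # MAE matches the paper's reported metric

Training would slide fixed-length windows over the historical series (e.g., model.fit(X_past, Y_future)); the deeper 9-layer variant the abstract reports as best would simply add further encoder/decoder layers to this stack.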

Details

Language :
English
ISSN :
18770509
Volume :
235
Database :
Supplemental Index
Journal :
Procedia Computer Science
Publication Type :
Academic Journal
Accession Number :
177603794
Full Text :
https://doi.org/10.1016/j.procs.2024.04.214