1. Performance Comparison of Attention and LSTM Using Stock Data
- Authors
Yunsang Yoo and Dong-ho Shin
- Subjects
Time series analysis; Forecasting
- Abstract
In recent years, there has been significant progress in Attention-based Transformer models [1, 2]. In this paper, we compare the performance of an Attention encoder and an LSTM [3] on the stock prediction task, evaluating performance as a function of model size and input length. We use historical stock data for Apple, Samsung, and Amazon, obtained from Kaggle. The Attention encoder outperforms the LSTM baseline on all datasets, and increasing its input sequence length further improves performance. However, the Attention encoder is prone to overfitting due to rapid changes in the data.
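The paper itself does not include code, but the core of the Attention encoder it evaluates is scaled dot-product self-attention over a window of price embeddings. The following minimal numpy sketch illustrates that mechanism on a synthetic price series; all sizes, weight names, and the random-walk input are illustrative assumptions, not the authors' setup.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Single-head self-attention: softmax(QK^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # (T, T) pairwise time-step scores
    # numerically stable softmax over the key axis
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ V, w                          # context vectors, attention weights

rng = np.random.default_rng(0)
T, d_model = 30, 16                          # window length and embedding size (assumed)
prices = np.cumsum(rng.normal(size=T))       # synthetic random-walk "closing prices"
W_in = rng.normal(scale=0.1, size=(1, d_model))
x = prices[:, None] @ W_in                   # embed each scalar price: (T, d_model)

# learned projections would come from training; here they are random placeholders
Wq, Wk, Wv = (rng.normal(scale=0.1, size=(d_model, d_model)) for _ in range(3))
out, attn = scaled_dot_product_attention(x @ Wq, x @ Wk, x @ Wv)
print(out.shape, attn.shape)                 # (30, 16) (30, 30)
```

Growing `T` (the input window) enlarges the `(T, T)` attention map, which is one way the paper's "longer input sequence" experiments change the model's capacity to relate distant time steps.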
- Published
2023