
Multi-Transformer: A New Neural Network-Based Architecture for Forecasting S&P Volatility

Authors:
José Javier Núñez-Velázquez
Eduardo Ramos-Pérez
Pablo J. Alonso-González
Source:
Mathematics, Vol 9, Art. 1794 (2021)
Publication Year:
2021
Publisher:
MDPI AG, 2021.

Abstract

Events such as the 2007–2008 Financial Crisis and the COVID-19 pandemic caused significant losses to banks and insurance entities. They also demonstrated the importance of accurate equity risk models and of a risk management function able to implement effective hedging strategies. Stock volatility forecasts play a key role in estimating equity risk and, thus, in the management actions carried out by financial institutions. This paper therefore aims to propose more accurate stock volatility models based on novel machine and deep learning techniques. It introduces a neural network-based architecture called Multi-Transformer, a variant of the Transformer models that have already been applied successfully in natural language processing. The paper also adapts traditional Transformer layers for use in volatility forecasting models. The empirical results suggest that hybrid models based on Multi-Transformer and Transformer layers are more accurate and, hence, lead to more appropriate risk measures than autoregressive algorithms or hybrid models based on feed-forward layers or long short-term memory (LSTM) cells.
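To illustrate the core mechanism the abstract refers to, the sketch below shows how a single Transformer-style self-attention layer could be applied to a window of lagged daily returns to produce a positive volatility forecast. This is a minimal, hypothetical NumPy illustration, not the authors' Multi-Transformer model: the embedding, the random weight matrices, and the softplus output head are all assumptions made for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(42)

def attention(q, k, v):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d)) V."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v, w

# Hypothetical inputs: 30 lagged daily returns embedded into d_model dims.
window, d_model = 30, 8
returns = 0.01 * rng.standard_normal(window)       # simulated daily returns

# Toy embedding: scalar return times a fixed vector, plus sinusoidal
# positional encodings so the layer can distinguish lag order.
embed = rng.standard_normal(d_model)
pos = np.arange(window)[:, None]
dims = np.arange(d_model)[None, :]
pe = np.where(dims % 2 == 0,
              np.sin(pos / 10000 ** (dims / d_model)),
              np.cos(pos / 10000 ** ((dims - 1) / d_model)))
x = returns[:, None] * embed[None, :] + pe         # shape (window, d_model)

# Random projections stand in for trained Q/K/V weights.
Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) for _ in range(3))
out, w = attention(x @ Wq, x @ Wk, x @ Wv)

# Pool over the window and map to a positive volatility forecast.
pooled = out.mean(axis=0)
w_head = rng.standard_normal(d_model)
sigma_hat = np.log1p(np.exp(pooled @ w_head))      # softplus keeps it > 0
print(sigma_hat)
```

A trained model would learn the embedding, projection, and output weights from historical return series; the point here is only that the attention weights let each forecast attend to the most informative past lags, which is what distinguishes this family of models from recurrent LSTM-based hybrids.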

Details

ISSN:
2227-7390
Volume:
9
Database:
OpenAIRE
Journal:
Mathematics
Accession number:
edsair.doi.dedup.....1fab80e598707d7277b07f05f6366038