
Iterative convolutional enhancing self-attention Hawkes process with time relative position encoding.

Authors :
Bian, Wei
Li, Chenlong
Hou, Hongwei
Liu, Xiufang
Source :
International Journal of Machine Learning & Cybernetics; Jul2023, Vol. 14 Issue 7, p2529-2544, 16p
Publication Year :
2023

Abstract

Modeling the Hawkes process with deep learning achieves better goodness of fit than traditional statistical methods. However, RNN-based methods struggle with long-range temporal dependence, while self-attention-based methods lack recursive induction. The Universal Transformer (UT) is an advanced framework that meets both requirements simultaneously, because it applies self-attention recurrently in depth at each position. However, adapting the UT framework raises the problem of matching it effectively to Hawkes process modeling. Thus, in this paper, an iterative convolutional enhancing self-attention Hawkes process with time relative position encoding (ICAHP-TR) is proposed, based on an improved UT. First, embedding maps from dense layers are applied to the sequences of arrival times and markers to enrich the event representation. Second, a deep network composed of UT layers extracts hidden historical information from the event representation, combining recursion with a global receptive field. Third, two designed mechanisms, relative positional encoding on the time step and convolution-enhanced perceptual attention, are adopted to avoid losing dependencies between relative and adjacent positions in the Hawkes process. Finally, the hidden historical information is mapped by dense layers to the parameters of the Hawkes process intensity function, yielding the likelihood function as the network loss. Experimental results on synthetic and real-world datasets show that the proposed method is effective in terms of both goodness of fit and predictive ability compared with other baseline methods. [ABSTRACT FROM AUTHOR]
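The two ingredients named in the abstract, a time-aware positional encoding over inter-event intervals and an intensity function whose parameters come from hidden states so that the negative log-likelihood serves as the training loss, can be sketched as follows. This is an illustrative numpy sketch, not the authors' exact architecture: the sinusoidal form of `relative_time_encoding`, the softplus intensity `lambda(t) = softplus(w . h_i + b + beta * (t - t_i))`, and the Monte Carlo estimate of the compensator integral are all common choices in neural point-process models and are assumptions here.

```python
import numpy as np

def softplus(x):
    """Numerically stable softplus, keeps the intensity strictly positive."""
    return np.log1p(np.exp(-np.abs(x))) + np.maximum(x, 0.0)

def relative_time_encoding(intervals, d_model):
    """Sinusoidal encoding of inter-event time intervals (a stand-in for the
    paper's time relative position encoding; the exact form is an assumption).
    d_model is assumed even."""
    pos = np.asarray(intervals, dtype=float)[:, None]            # (n, 1)
    div = np.exp(-np.arange(0, d_model, 2) * np.log(10000.0) / d_model)
    enc = np.zeros((len(pos), d_model))
    enc[:, 0::2] = np.sin(pos * div)
    enc[:, 1::2] = np.cos(pos * div)
    return enc

def hawkes_nll(event_times, hidden, w, b, beta, n_mc=100, rng=None):
    """Negative log-likelihood of a temporal point process whose intensity
    between events is parameterized by per-event hidden states:
        lambda(t) = softplus(w . h_i + b + beta * (t - t_i)),  t in (t_i, t_{i+1}].
    The compensator integral is approximated by Monte Carlo sampling
    inside each inter-event interval."""
    rng = np.random.default_rng(0) if rng is None else rng
    event_times = np.asarray(event_times, dtype=float)
    base = hidden @ w + b                                        # (n,)
    # Log-intensity term: intensity at each event, conditioned on the
    # hidden state produced after the previous event.
    lam_at_events = softplus(base[:-1] + beta * np.diff(event_times))
    log_term = np.log(lam_at_events).sum()
    # Compensator term: integral of the intensity over each interval.
    integral = 0.0
    for i in range(len(event_times) - 1):
        dt = event_times[i + 1] - event_times[i]
        taus = rng.uniform(0.0, dt, size=n_mc)
        integral += dt * softplus(base[i] + beta * taus).mean()
    return integral - log_term
```

In a full model the `hidden` states would come from the UT encoder over the embedded event sequence, and `w`, `b`, `beta` would be learned jointly by minimizing `hawkes_nll`.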

Details

Language :
English
ISSN :
1868-8071
Volume :
14
Issue :
7
Database :
Complementary Index
Journal :
International Journal of Machine Learning & Cybernetics
Publication Type :
Academic Journal
Accession number :
163726803
Full Text :
https://doi.org/10.1007/s13042-023-01780-2