
Wasserstein Learning of a Self-Attention Generative Model for Temporal Point Processes.

Authors :
芦佳明
李晨龙
魏毅强
Source :
Application Research of Computers / Jisuanji Yingyong Yanjiu. Feb 2022, Vol. 39, Issue 2, p456-460. 5p.
Publication Year :
2022

Abstract

At present, temporal point processes are generally described by modeling the intensity function with a recurrent neural network (RNN). However, such models cannot capture long-range dependencies among events in a sequence, and the specific parametric form of the intensity function limits the model's ability to generalize. To address these problems, this paper proposed an intensity-free self-attention generative model for temporal point processes. The model used the Wasserstein distance to construct the objective function, which made it convenient to measure the deviation between the model distribution and the real distribution, and used the self-attention mechanism to describe the influence of historical events on the current event, so that the model was interpretable and more robust. Comparative experiments show that, without prior knowledge of the intensity function, the QQ-plot slope deviation and the empirical intensity deviation of this method are reduced by 35.125% and 24.200%, respectively, compared with the RNN generative model and the maximum-likelihood model, which demonstrates the effectiveness of the proposed model. [ABSTRACT FROM AUTHOR]
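The intensity-free objective rests on measuring a Wasserstein distance between generated and observed event sequences. As a rough illustration only (not the authors' implementation), the sketch below computes a Wasserstein-1 distance between two event-time sequences on a fixed observation window, following a common point-process formulation in which the shorter sequence is padded with the window end; the function name, the padding convention, and the toy data are all assumptions for illustration.

import numpy as np

def wasserstein_event_distance(seq_real, seq_gen, horizon):
    # Wasserstein-1 distance between two event-time sequences on [0, horizon].
    # Assumed convention: pad the shorter sequence with the horizon T, sort,
    # and sum absolute differences of the aligned event times.
    n = max(len(seq_real), len(seq_gen))
    a = np.sort(np.concatenate([seq_real, np.full(n - len(seq_real), horizon)]))
    b = np.sort(np.concatenate([seq_gen, np.full(n - len(seq_gen), horizon)]))
    return float(np.abs(a - b).sum())

# Toy usage: an "observed" sequence and a generated one on the window [0, 10].
real = np.array([0.8, 2.1, 4.5, 7.3])
gen = np.array([1.0, 2.5, 5.0])
print(wasserstein_event_distance(real, gen, horizon=10.0))

In a generative setup of the kind the abstract describes, a distance of this form would be minimized over model parameters, while the self-attention encoder supplies the history representation from which the next event time is generated.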

Details

Language :
Chinese
ISSN :
1001-3695
Volume :
39
Issue :
2
Database :
Academic Search Index
Journal :
Application Research of Computers / Jisuanji Yingyong Yanjiu
Publication Type :
Academic Journal
Accession number :
154958778
Full Text :
https://doi.org/10.19734/j.issn.1001-3695.2021.08.0298