Graph Contrastive Pre-training for Anti-money Laundering
- Source :
- International Journal of Computational Intelligence Systems, Vol 17, Iss 1, Pp 1-16 (2024)
- Publication Year :
- 2024
- Publisher :
- Springer, 2024.
-
Abstract
- Anti-money laundering (AML) is vital to maintaining financial markets, social stability, and political authority. Many studies model the AML task as a graph and leverage graph neural networks (GNNs) for node/edge classification. Although these studies have made notable progress, they struggle with label scarcity in real-world scenarios. In this paper, we propose a graph contrastive pre-training framework for anti-money laundering (GCPAL), which mines supervised signals from the label-free transaction network to significantly reduce the dependence on annotations. Specifically, we construct three augmented views (i.e., two stochastically perturbed views and a KNN view). The perturbed views help the model learn invariant information and improve its robustness against noise, while the KNN view provides implicit interactions that mitigate link sparsity in the transaction network. Moreover, we extend the positive sample set with connected neighbors and node pairs with similar features to further enhance the expressiveness of the model. We evaluate GCPAL on two datasets, and extensive experimental results demonstrate that GCPAL is consistently superior to other SOTA baselines, especially when labels are scarce.
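- To make the view-construction step concrete, the sketch below shows one plausible way to build the two kinds of augmented views the abstract mentions: a KNN view from node-feature similarity and a stochastically perturbed view via random edge dropping. This is an illustrative reading of the abstract, not the authors' implementation; the function names, the cosine-similarity choice, and the drop probability are all assumptions.

```python
import numpy as np

def knn_view(features, k=2):
    """Build an implicit-interaction view: connect each node to its
    k most similar neighbors by cosine similarity of node features.
    (Illustrative sketch; the paper's exact similarity is not specified here.)"""
    norm = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = norm @ norm.T
    np.fill_diagonal(sim, -np.inf)  # exclude self-loops
    n = features.shape[0]
    adj = np.zeros((n, n), dtype=int)
    for i in range(n):
        nbrs = np.argsort(sim[i])[-k:]  # indices of top-k similar nodes
        adj[i, nbrs] = 1
    return np.maximum(adj, adj.T)  # symmetrize the KNN graph

def perturbed_view(adj, drop_prob=0.2, rng=None):
    """Build a stochastically perturbed view by dropping each
    undirected edge independently with probability drop_prob."""
    rng = rng if rng is not None else np.random.default_rng(0)
    keep = rng.random(adj.shape) >= drop_prob
    keep = np.triu(keep, 1)       # decide each undirected edge once
    keep = keep | keep.T          # mirror the decision
    return adj * keep
```

Under this reading, a GNN encoder would embed each view and a contrastive loss would pull together the extended positive set (connected neighbors and feature-similar pairs) across views.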
Details
- Language :
- English
- ISSN :
- 18756883
- Volume :
- 17
- Issue :
- 1
- Database :
- Directory of Open Access Journals
- Journal :
- International Journal of Computational Intelligence Systems
- Publication Type :
- Academic Journal
- Accession number :
- edsdoj.b0310cb42b1a4061ac5c6b2fc5b1df20
- Document Type :
- article
- Full Text :
- https://doi.org/10.1007/s44196-024-00720-4