Fast Global Self-Attention for Seismic Image Fault Identification
- Author
- Wang, Shenghou, Si, Xu, Cai, Zhongxian, Sun, Leiming, Wang, Wei, and Jiang, Zirun
- Abstract
Fault identification is a challenging task in reservoir characterization for seismic exploration and development. With the widespread adoption of deep learning, automatic fault identification has developed rapidly. Recently, researchers have begun applying Transformer-based neural networks, originally developed for language tasks, to image recognition, with promising results across various research fields. However, large 3-D seismic data pose challenges for conventional self-attention, such as the large memory footprint and expensive computational cost of pixel-level dense identification tasks. We present a new scheme called fast global self-attention (FGSA), whose computation is achieved by cyclic multiplication. Implemented through the fast Fourier transform (FFT), cyclic multiplication brings greater efficiency and a smaller memory footprint: instead of generating large attention matrices, the FFT-based computation weighs the features with attention directly. Compared with windowed attention (e.g., the Swin Transformer), the FGSA architecture offers flexible global attention, saves nearly 50% of the memory, and has lower computational complexity for the fault identification tasks in this article. These advantages allow FGSA to produce more distinct and interpretable fault identification results than conventional fault identification neural networks. Several synthetic and field seismic data examples show that a neural network based on our FGSA architecture has better applicability than several baseline methods in fault identification tasks.
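The abstract describes weighing features with attention through cyclic multiplication computed by the FFT, rather than forming a dense attention matrix. As a minimal illustration only (the abstract does not detail how FGSA builds its attention kernel, so the function and variable names below are hypothetical and assume a circulant, i.e., shift-invariant, attention pattern), the NumPy sketch below shows why this saves memory and computation: multiplying by an N x N circulant attention matrix equals a circular convolution with its first column, which the FFT evaluates in O(N log N) time and O(N) memory.

```python
import numpy as np

def cyclic_attention(x, kernel):
    """Hypothetical sketch: weigh features with a circulant attention pattern via the FFT.

    Multiplying by an N x N circulant attention matrix equals circular
    convolution with its first column ("kernel"); the FFT computes this in
    O(N log N) time and O(N) memory, so the N x N matrix is never formed.

    x      : (N, C) feature sequence (N positions, C channels)
    kernel : (N,)   attention weights (first column of the circulant matrix)
    """
    Xf = np.fft.fft(x, axis=0)                # feature spectrum
    Kf = np.fft.fft(kernel)[:, None]          # kernel spectrum, broadcast over channels
    return np.fft.ifft(Xf * Kf, axis=0).real  # inverse FFT = circular convolution

# Usage: global attention over N = 8 positions without an 8 x 8 matrix.
rng = np.random.default_rng(0)
N, C = 8, 4
x = rng.standard_normal((N, C))
kernel = np.exp(rng.standard_normal(N))
kernel /= kernel.sum()                        # rows sum to 1, like softmax weights

y_fft = cyclic_attention(x, kernel)

# Reference check: explicit circulant matrix A with A[i, j] = kernel[(i - j) % N].
A = np.stack([np.roll(kernel, j) for j in range(N)], axis=1)
assert np.allclose(y_fft, A @ x)
```

Each output position here attends globally to all N input positions, yet only length-N FFTs are stored, which is consistent with the memory and complexity advantages the abstract claims over windowed attention.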
- Published
- 2024