DSAGAN: A generative adversarial network based on dual-stream attention mechanism for anatomical and functional image fusion
- Author
- Jiao Du, Liming Xu, Weisheng Li, and Jun Fu
- Subjects
Image fusion, Fusion, Information Systems and Management, Forcing (recursion theory), Discriminator, Computer science, business.industry, ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION, Process (computing), Pattern recognition, DUAL (cognitive architecture), Computer Science Applications, Theoretical Computer Science, Transformation (function), Artificial Intelligence, Control and Systems Engineering, Artificial intelligence, business, Software, Generator (mathematics)
- Abstract
In recent years, numerous multimodal medical image fusion algorithms have been proposed. However, existing methods are primarily built on specific transformation theories and suffer from problems such as poor adaptability, low efficiency, and blurred details. To address these problems, this paper proposes a generative adversarial network based on a dual-stream attention mechanism (DSAGAN) for anatomical and functional image fusion. A dual-stream architecture with multiscale convolutions is used to extract deep features, and an attention mechanism further enhances the fused features. The fused images and the multimodal input images are then fed into the discriminator. During the discriminator update, the discriminator is trained to judge the multimodal input images as real and the fused images as fake; during the generator update, the fused images are expected to be judged as real, which forces the generator to improve fusion quality. Training continues until the generator and discriminator reach a Nash equilibrium. After training, fused images are obtained directly by feeding in the anatomical and functional images. Compared with the reference algorithms, DSAGAN consumes less fusion time and achieves better objective metrics in terms of Q_AG, Q_EN, and Q_NIQE. (An illustrative sketch of this adversarial training scheme follows the record below.)
- Published
- 2021
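The abstract describes the architecture and the adversarial update only at a high level. As a minimal sketch of that scheme, the PyTorch code below pairs a dual-stream, attention-gated generator with a single discriminator and alternates their updates. The layer widths, the 3x3/5x5 multiscale branches, the 1x1-convolution attention gate, the single-channel inputs, and the binary cross-entropy objective are illustrative assumptions, not the authors' DSAGAN implementation.

```python
# Minimal PyTorch sketch of a dual-stream, attention-gated fusion generator and a
# single discriminator trained adversarially, loosely following the scheme summarized
# in the abstract. All architectural details here are illustrative assumptions.
import torch
import torch.nn as nn


class DualStreamGenerator(nn.Module):
    """Two convolutional streams (one per modality) whose concatenated multiscale
    features are re-weighted by an attention gate before reconstruction."""

    def __init__(self, channels: int = 32):
        super().__init__()

        def stream() -> nn.ModuleDict:
            # Multiscale feature extractor for one modality: parallel 3x3 and 5x5 branches.
            return nn.ModuleDict({
                "k3": nn.Sequential(nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(inplace=True)),
                "k5": nn.Sequential(nn.Conv2d(1, channels, 5, padding=2), nn.ReLU(inplace=True)),
            })

        self.anat = stream()  # anatomical image stream (single-channel input assumed)
        self.func = stream()  # functional image stream (single-channel input assumed)
        self.attention = nn.Conv2d(4 * channels, 4 * channels, kernel_size=1)  # gate logits
        self.reconstruct = nn.Sequential(
            nn.Conv2d(4 * channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, 1, 3, padding=1), nn.Tanh(),
        )

    def forward(self, anat: torch.Tensor, func: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([self.anat["k3"](anat), self.anat["k5"](anat),
                           self.func["k3"](func), self.func["k5"](func)], dim=1)
        gate = torch.sigmoid(self.attention(feats))  # attention weights in (0, 1)
        return self.reconstruct(feats * gate)        # fused single-channel image


class Discriminator(nn.Module):
    """Scores an image: high logit = drawn from the source modalities, low = generated."""

    def __init__(self, channels: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, channels, 3, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(channels, 2 * channels, 3, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(2 * channels, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)  # raw logit; BCEWithLogitsLoss applies the sigmoid


def training_step(G, D, opt_g, opt_d, anat, func):
    """One adversarial round: D learns to call source images real and fused images fake,
    then G learns to make its fused images be judged real."""
    bce = nn.BCEWithLogitsLoss()

    # Discriminator update (generator output detached so only D is trained here).
    fused = G(anat, func).detach()
    real_logits = torch.cat([D(anat), D(func)])
    fake_logits = D(fused)
    d_loss = bce(real_logits, torch.ones_like(real_logits)) + \
             bce(fake_logits, torch.zeros_like(fake_logits))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator update: the fused image should now be judged real.
    gen_logits = D(G(anat, func))
    g_loss = bce(gen_logits, torch.ones_like(gen_logits))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```

To use the sketch, construct `G = DualStreamGenerator()` and `D = Discriminator()`, give each its own `torch.optim.Adam` optimizer, and call `training_step` on batches of co-registered anatomical/functional slices. After training, only the generator is needed: a single forward pass produces the fused image, which is what the abstract means by obtaining fusion results directly at inference time.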