
An efficient parallel self-attention transformer for CSI feedback.

Authors :
Liu, Ziang
Song, Tianyu
Zhao, Ruohan
Jin, Jiyu
Jin, Guiyue
Source :
Physical Communication; Oct 2024, Vol. 66
Publication Year :
2024

Abstract

In massive multiple-input multiple-output (MIMO) systems, user equipment (UE) must feed downlink channel state information (CSI) back to the base station (BS). As the number of antennas grows, CSI feedback consumes a significant share of the uplink bandwidth. To reduce this overhead, we propose an efficient parallel attention transformer, called EPAformer, a lightweight network that combines the transformer architecture with efficient parallel self-attention (EPSA) for the CSI feedback task. EPSA effectively expands the attention area of each token within a transformer block by dividing the attention heads into parallel groups and performing self-attention within horizontal and vertical stripes, yielding better feature compression and reconstruction. Simulation results show that EPAformer surpasses previous deep-learning-based approaches in both reconstruction performance and complexity. [ABSTRACT FROM AUTHOR]
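The abstract does not give the EPSA equations, but the stripe mechanism it describes can be sketched roughly: split the attention heads into two parallel groups, let one group attend within horizontal stripes (rows of the feature map) and the other within vertical stripes (columns), then concatenate the results. The NumPy sketch below illustrates this idea under strong simplifying assumptions (query/key/value projections are omitted, and one "group" is modeled as half the channels); it is not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def stripe_attention(tokens):
    # tokens: (num_stripes, stripe_len, d).
    # Scaled dot-product self-attention within each stripe.
    # Q = K = V = tokens here: projection weights are omitted
    # for brevity (a hypothetical simplification, not EPSA itself).
    d = tokens.shape[-1]
    scores = tokens @ tokens.transpose(0, 2, 1) / np.sqrt(d)
    return softmax(scores) @ tokens

def parallel_stripe_self_attention(x):
    # x: (H, W, C) feature map. One head group (first C/2 channels)
    # attends within horizontal stripes (rows); the other group
    # attends within vertical stripes (columns). The two outputs
    # are concatenated along the channel axis.
    H, W, C = x.shape
    xh, xv = x[..., : C // 2], x[..., C // 2:]
    out_h = stripe_attention(xh)                     # rows:    (H, W, C/2)
    out_v = stripe_attention(xv.transpose(1, 0, 2))  # columns: (W, H, C/2)
    return np.concatenate([out_h, out_v.transpose(1, 0, 2)], axis=-1)
```

Because the two stripe directions run on disjoint head groups, they can be computed in parallel, and every output token aggregates context from its full row or column rather than a local window only.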

Details

Language :
English
ISSN :
1874-4907
Volume :
66
Database :
Supplemental Index
Journal :
Physical Communication
Publication Type :
Academic Journal
Accession number :
180174479
Full Text :
https://doi.org/10.1016/j.phycom.2024.102483