
Linear MIMO Precoders With Finite Alphabet Inputs via Stochastic Optimization and Deep Neural Networks (DNNs).

Authors :
Jing, Shusen
Xiao, Chengshan
Source :
IEEE Transactions on Signal Processing; 11/1/2021, p4269-4281, 13p
Publication Year :
2021

Abstract

In this paper, we investigate the design of linear precoders for vector Gaussian channels via stochastic optimization and deep neural networks (DNNs). We assume that the channel inputs are drawn from practical finite alphabets, and we search for precoders that maximize the mutual information between channel inputs and outputs. Although the problem is generally non-convex, we prove that when the right singular matrix of the precoder is fixed, any local optimum of the problem is a global optimum. Based on this fact, an efficient projected stochastic gradient descent (PSGD) algorithm is designed to search for the optimal precoders. Moreover, to reduce the complexity of computing the a posteriori means involved in the gradient calculation, the K-best algorithm is adopted to approximate the a posteriori means with negligible loss of accuracy. Furthermore, to avoid explicit calculation of the mutual information and its gradients, DNN-based autoencoders (AEs) are constructed for the precoding task, and an efficient training algorithm is proposed. We also prove that AEs with the ‘softmax’ activation function and ‘categorical cross entropy’ loss maximize the mutual information under reasonable assumptions. Then, to extend the AE methods to large-scale systems, the ‘sigmoid’ activation function and ‘binary cross entropy’ loss are used so that the size of the AEs does not grow prohibitively large; we prove that this maximizes a lower bound on the mutual information under reasonable assumptions. Finally, to make the precoders practical for high-speed wireless scenarios, we propose an offline training paradigm that trains DNNs to infer the optimal precoder from channel state information, instead of training online for every channel realization. Simulation results show that all the proposed methods work well in maximizing the mutual information and improving the bit error rate (BER) performance. [ABSTRACT FROM AUTHOR]
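The projected stochastic gradient step mentioned in the abstract can be sketched as follows. This is an illustrative sketch only, not the paper's implementation: the mutual-information gradient is left as an abstract callable (`grad_fn`), and the projection onto the transmit-power constraint tr(PPᴴ) ≤ P_T is assumed here to be a simple rescaling.

```python
import numpy as np

def project_power(P, P_T):
    """Project precoder P onto the power constraint tr(P P^H) <= P_T.

    Assumption: a rescaling projection is used when the constraint is
    violated; the paper's actual projection step may differ.
    """
    power = np.real(np.trace(P @ P.conj().T))
    if power > P_T:
        P = P * np.sqrt(P_T / power)
    return P

def psgd(grad_fn, P0, P_T, lr=0.01, steps=100):
    """Generic projected (stochastic) gradient ascent loop.

    grad_fn(P) should return a (possibly stochastic) estimate of the
    gradient of the objective, e.g. mutual information, at P.
    """
    P = project_power(P0, P_T)
    for _ in range(steps):
        P = project_power(P + lr * grad_fn(P), P_T)
    return P
```

In practice, `grad_fn` would be the stochastic estimate of the mutual-information gradient (whose a posteriori means the paper approximates with the K-best algorithm); any surrogate objective can be plugged in to exercise the loop.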

Details

Language :
English
ISSN :
1053-587X
Database :
Complementary Index
Journal :
IEEE Transactions on Signal Processing
Publication Type :
Academic Journal
Accession number :
153880519
Full Text :
https://doi.org/10.1109/TSP.2021.3096466