
Multi-Layer Bilinear Generalized Approximate Message Passing.

Authors :
Zou, Qiuyun
Zhang, Haochuan
Yang, Hongwen
Source :
IEEE Transactions on Signal Processing; 11/1/2021, p4529-4543, 15p
Publication Year :
2021

Abstract

In this paper, we extend the bilinear generalized approximate message passing (BiG-AMP) approach, originally proposed for high-dimensional generalized bilinear regression, to the multi-layer case so as to handle cascaded problems such as the matrix-factorization problem arising in relay communication, among others. Assuming statistically independent matrix entries with known priors, the new algorithm, called ML-BiGAMP, approximates the general sum-product loopy belief propagation (LBP) in the high-dimensional limit while enjoying a substantial reduction in computational complexity. We demonstrate that, in the large-system limit, the asymptotic MSE performance of ML-BiGAMP can be fully characterized via a set of simple one-dimensional equations termed state evolution (SE). We establish that the asymptotic MSE predicted by ML-BiGAMP's SE matches perfectly the exact MMSE predicted by the replica method, which is well known to be Bayes-optimal but infeasible in practice. This consistency indicates that ML-BiGAMP may retain the same Bayes-optimal performance as the MMSE estimator in high-dimensional applications, while its computational burden is far lower. As an illustrative example of the general ML-BiGAMP, we provide a detector design that estimates the channel fading and the data symbols jointly with high precision for two-hop amplify-and-forward relay communication systems. [ABSTRACT FROM AUTHOR]
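
To make the cascaded bilinear structure mentioned in the abstract concrete, the sketch below simulates a toy two-hop amplify-and-forward relay observation model: the destination sees a noisy bilinear mixing of an unknown relay-destination channel with a signal that is itself a noisy bilinear mixing of an unknown source-relay channel and the data. This is only an assumed illustration of the problem setup; the symbols (H, G, amplification factor a, noise variances) and dimensions are not taken from the paper, and the ML-BiGAMP estimator itself is not reproduced here.

```python
import numpy as np

# Hedged sketch of a two-hop amplify-and-forward (AF) relay observation model,
# the kind of cascaded bilinear problem the abstract says ML-BiGAMP targets.
# All symbols and parameters below are illustrative assumptions.

rng = np.random.default_rng(0)

N_s, N_r, N_d, T = 8, 8, 8, 64           # source/relay/destination dimensions, block length
sigma_r2, sigma_d2 = 0.01, 0.01          # assumed relay / destination noise variances

X = rng.choice([-1.0, 1.0], size=(N_s, T))          # i.i.d. BPSK data symbols (known prior)
H = rng.normal(0, np.sqrt(1 / N_s), (N_r, N_s))     # source-to-relay fading (unknown)
G = rng.normal(0, np.sqrt(1 / N_r), (N_d, N_r))     # relay-to-destination fading (unknown)

# First hop (layer 1): the relay receives a noisy bilinear observation H @ X.
R = H @ X + rng.normal(0, np.sqrt(sigma_r2), (N_r, T))

# AF relay: scale to satisfy an (assumed) unit average transmit-power constraint.
a = 1.0 / np.sqrt(np.mean(R ** 2))

# Second hop (layer 2): the destination observes another bilinear mixing, G @ (a * R).
Y = G @ (a * R) + rng.normal(0, np.sqrt(sigma_d2), (N_d, T))

# A multi-layer bilinear solver such as ML-BiGAMP would take Y together with the
# priors on H, G, and X and return joint estimates of the channels and the data.
print(Y.shape)   # (N_d, T) observation handed to the joint detector
```

Under these assumptions, the destination observation Y depends on the product of two unknown channel matrices and the unknown data, which is exactly the cascaded, multi-layer bilinear structure the algorithm is designed to handle.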

Details

Language :
English
ISSN :
1053-587X
Database :
Complementary Index
Journal :
IEEE Transactions on Signal Processing
Publication Type :
Academic Journal
Accession number :
153880529
Full Text :
https://doi.org/10.1109/TSP.2021.3100305