BER and Outage Probability Approximations for LMMSE Detectors on Correlated MIMO Channels
Published in | arXiv.org |
---|---|
Main Authors | |
Format | Paper |
Language | English |
Published | Ithaca: Cornell University Library, arXiv.org, 16.10.2008 |
Subjects | |
Summary: This paper studies the performance of the Linear Minimum Mean-Square Error (LMMSE) receiver for (receive) correlated Multiple-Input Multiple-Output (MIMO) systems. From random matrix theory, it is well known that the Signal-to-Noise Ratio (SNR) at the output of this receiver behaves asymptotically like a Gaussian random variable as the numbers of receive and transmit antennas converge to \(+\infty\) at the same rate. However, because this approximation is inaccurate for estimating performance metrics such as the Bit Error Rate (BER) and the outage probability, especially for small system dimensions, Li et al. convincingly proposed to assume that the SNR follows a generalized Gamma distribution whose parameters are tuned by computing the first three asymptotic moments of the SNR. In this article, that technique is generalized to (receive) correlated channels, and closed-form expressions for the first three asymptotic moments of the SNR are provided. These results are obtained with a random matrix theory technique adapted to matrices with Gaussian elements, which is believed to be simple, efficient, and of broad interest in wireless communications. Simulations show that the proposed technique generally yields good accuracy, even for small system dimensions.
ISSN: 2331-8422
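The moment-matching step described in the summary can be illustrated with a minimal sketch. The snippet below is not the authors' code: it assumes placeholder values for the first three SNR moments `m1`, `m2`, `m3` (in the paper these come from closed-form asymptotic expressions), uses SciPy's parametrization of the generalized Gamma distribution, and evaluates the resulting outage-probability and BER approximations for a Gray-mapped BPSK/QPSK example by numerical integration.

```python
import numpy as np
from scipy.special import gammaln, erfc
from scipy.optimize import fsolve
from scipy.stats import gengamma
from scipy.integrate import quad

# Placeholder first three raw moments of the output SNR (hypothetical values;
# the paper supplies closed-form asymptotic expressions for these).
m1, m2, m3 = 10.0, 130.0, 2100.0

def gg_raw_moment(k, a, c):
    """k-th raw moment of a unit-scale generalized Gamma(a, c) variable
    in SciPy's parametrization: E[X^k] = Gamma(a + k/c) / Gamma(a)."""
    return np.exp(gammaln(a + k / c) - gammaln(a))

def moment_equations(params):
    a, c = params
    # Scale-free ratios m2/m1^2 and m3/m1^3 determine the two shape parameters.
    r2 = gg_raw_moment(2, a, c) / gg_raw_moment(1, a, c) ** 2
    r3 = gg_raw_moment(3, a, c) / gg_raw_moment(1, a, c) ** 3
    return [r2 - m2 / m1**2, r3 - m3 / m1**3]

# Solve for the shape parameters; the initial guess matters for convergence.
a, c = fsolve(moment_equations, x0=[2.0, 1.0])
# The scale then follows directly from the first moment.
scale = m1 / gg_raw_moment(1, a, c)

# Outage probability: P(SNR < gamma_th) under the fitted distribution.
gamma_th = 4.0
p_out = gengamma.cdf(gamma_th, a, c, scale=scale)

def q_func(x):
    """Gaussian Q-function."""
    return 0.5 * erfc(x / np.sqrt(2.0))

# BER per bit for BPSK / Gray-mapped QPSK: E[Q(sqrt(2*SNR))] over the fit.
ber, _ = quad(lambda g: q_func(np.sqrt(2.0 * g)) * gengamma.pdf(g, a, c, scale=scale),
              0.0, np.inf)

print(f"fitted (a, c, scale) = ({a:.3f}, {c:.3f}, {scale:.3f})")
print(f"outage probability   = {p_out:.4e}")
print(f"approximate BER      = {ber:.4e}")
```

Only the two shape parameters need a numerical solve: they are fixed by the scale-free ratios of the second and third moments to powers of the first, after which the scale parameter follows in closed form from the first moment.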