Derivatives of Mutual Information in Gaussian Vector Channels with Applications
Authors
Anke Feiten, Stephen Hanly, Rudolf Mathar
Abstract
In this paper, derivatives of mutual information for
a general linear Gaussian vector channel are considered.
It is first shown how the corresponding gradient relates to
the minimum mean squared error (MMSE) estimator and its error matrix.
We then determine the directional derivative of mutual information
and use this geometrically intuitive concept to characterize
the capacity-achieving input distribution
of the above channel subject to certain power constraints.
The well-known water-filling solution is revisited and
obtained as a special case. Explicit solutions are also derived for
shaping constraints on the maximum and on the Euclidean norm of the
mean powers. Moreover, uncorrelated sum power constraints are
considered; here the optimum input can always be achieved by linear
precoding.
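The water-filling solution referred to in the abstract allocates a sum power budget across parallel Gaussian subchannels, giving more power to stronger ones. A minimal NumPy sketch of the standard allocation rule is shown below; the function name and implementation are illustrative and not taken from the paper.

```python
import numpy as np

def water_filling(gains, total_power):
    """Water-filling power allocation over parallel Gaussian subchannels.

    Each subchannel i with power gain g_i receives
        p_i = max(mu - 1/g_i, 0),
    where the water level mu is chosen so that sum(p_i) == total_power.
    """
    g = np.asarray(gains, dtype=float)
    inv = 1.0 / g                      # inverse gains ("floor heights")
    inv_sorted = np.sort(inv)
    n = len(g)
    # Try activating the k best subchannels, for k = n down to 1,
    # until the implied water level sits above all k floors.
    for k in range(n, 0, -1):
        mu = (total_power + inv_sorted[:k].sum()) / k
        if mu > inv_sorted[k - 1]:     # all k active powers are positive
            break
    return np.maximum(mu - inv, 0.0)

# Example: two strong subchannels and one weak one; the weak
# subchannel receives no power under a tight budget.
p = water_filling([2.0, 1.0, 0.1], total_power=1.0)
```

In this example the computed water level is 1.25, so the allocation is [0.75, 0.25, 0.0]: the budget is spent entirely on the two strong subchannels.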
BibTEX Reference Entry
@inproceedings{FeHaMa07,
  author    = {Anke Feiten and Stephen Hanly and Rudolf Mathar},
  title     = {Derivatives of Mutual Information in {G}aussian Vector Channels with Applications},
  booktitle = {{IEEE} ISIT 2007},
  address   = {Nice},
  year      = {2007},
  hsb       = {RWTH-CONV-223563},
}
This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by the authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.