Derivatives of mutual information in Gaussian vector channels with applications

Anke Feiten*, Stephen Hanly, Rudolf Mathar

*Corresponding author for this work

Research output: Conference proceeding contribution (peer-reviewed), chapter in book/report/conference proceeding

4 Citations (Scopus)

Abstract

In this paper, derivatives of mutual information for a general linear Gaussian vector channel are considered. Two applications are presented. First, it is shown how the corresponding gradient relates to the minimum mean squared error (MMSE) estimator and its error matrix. Second, we determine the directional derivative of mutual information and use this geometrically intuitive concept to characterize the capacity-achieving input distribution of the above channel subject to certain power constraints. The well-known water-filling solution is revisited and obtained as a special case. Explicit solutions are also derived for shaping constraints on the maximum and on the Euclidean norm of the mean powers. Moreover, uncorrelated sum power constraints are considered; in this case the optimum input can always be achieved by linear precoding.
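The abstract refers to the classical water-filling solution for parallel Gaussian subchannels. A minimal sketch (not the paper's own code; subchannel gains `gains` and budget `total_power` are illustrative names) allocates power p_i = max(0, mu - 1/g_i), with the water level mu found by bisection so that the powers sum to the budget:

```python
import numpy as np

def water_filling(gains, total_power, tol=1e-10):
    """Maximize sum(log(1 + p_i * g_i)) subject to
    sum(p_i) = total_power and p_i >= 0."""
    inv = 1.0 / np.asarray(gains, dtype=float)  # per-subchannel "floor" 1/g_i
    lo, hi = inv.min(), inv.max() + total_power  # bracket the water level mu
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        # power spent at this water level; monotone increasing in mu
        if np.maximum(mu - inv, 0.0).sum() > total_power:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.5 * (lo + hi) - inv, 0.0)

# Example: the weakest subchannel (gain 0.5) receives no power.
p = water_filling([2.0, 1.0, 0.5], total_power=1.0)  # ≈ [0.75, 0.25, 0.0]
```

Subchannels whose floor 1/g_i lies above the water level are switched off, which is the behavior the shaping-constraint results in the paper generalize.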

Original language: English
Title of host publication: Proceedings - 2007 IEEE International Symposium on Information Theory, ISIT 2007
Place of publication: Piscataway, NJ
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 2296-2300
Number of pages: 5
ISBN (Print): 1424414296, 9781424414291
Publication status: Published - 2007
Externally published: Yes
Event: 2007 IEEE International Symposium on Information Theory, ISIT 2007 - Nice, France
Duration: 24 Jun 2007 to 29 Jun 2007


