Article quick-view

Research paper by

Bernhard C. Geiger, Tobias Koch

Indexed on

2nd Feb 2017

Published on

2nd Feb 2017

Published in

arXiv - Computer Science - Information Theory


Jalali and Poor ("Universal compressed sensing," arXiv:1406.7807v3, Jan. 2016) have recently proposed a generalization of Rényi's information dimension to stationary stochastic processes by defining the information dimension rate as the information dimension of $k$ samples divided by $k$ in the limit as $k\to\infty$. This paper proposes an alternative definition of information dimension rate as the entropy rate of the uniformly-quantized stochastic process divided by minus the logarithm of the quantizer step size $1/m$ in the limit as $m\to\infty$. It is demonstrated that both definitions are equivalent for stochastic processes that are $\psi^*$-mixing, but may differ in general. In particular, it is shown that for Gaussian processes with essentially-bounded power spectral density (PSD), the proposed information dimension rate equals the Lebesgue measure of the PSD's support. This is in stark contrast to the information dimension rate proposed by Jalali and Poor, which is $1$ if the process's PSD is positive on any set with positive Lebesgue measure, irrespective of its support size.
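The quantization-based definition can be illustrated in the simplest scalar case: for a random variable that mixes a point mass with a continuous component, Rényi's information dimension equals the weight of the continuous part, and it emerges as the limit of $H(\lfloor mX \rfloor)/\log m$. The following is a minimal Monte Carlo sketch of that limit (an illustrative example, not code from the paper; the mixture and the plug-in entropy estimator are assumptions for demonstration):

```python
import math
import random
from collections import Counter

def info_dimension_estimate(samples, m):
    """Plug-in estimate of H(floor(m*X)) / log2(m): the entropy of the
    uniformly quantized samples divided by the log of the bin count m."""
    n = len(samples)
    counts = Counter(int(x * m) for x in samples)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return entropy / math.log2(m)

random.seed(0)
# Mixture: point mass at 0 with probability 1/2, Uniform(0,1) with
# probability 1/2.  Renyi's information dimension of this law is 1/2.
samples = [random.random() if random.random() < 0.5 else 0.0
           for _ in range(200_000)]

# The estimate decreases toward 1/2 as m grows; convergence is slow
# because the mass-point bin contributes an O(1/log m) excess term.
for m in (16, 64, 256):
    print(m, round(info_dimension_estimate(samples, m), 3))
```

Since the discrete component contributes a bounded number of bits while the continuous component contributes roughly $\tfrac{1}{2}\log_2 m$ bits, the ratio tends to $1/2$; the same division-by-$\log m$ device, applied to entropy *rates*, underlies the definition proposed in the paper.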