
On the Information Dimension of Stochastic Processes


Jalali and Poor ("Universal compressed sensing," arXiv:1406.7807v3, Jan. 2016) have recently proposed a generalization of Rényi's information dimension to stationary stochastic processes by defining the information dimension rate as the information dimension of $k$ samples divided by $k$ in the limit as $k\to\infty$. This paper proposes an alternative definition of information dimension rate as the entropy rate of the uniformly-quantized stochastic process divided by minus the logarithm of the quantizer step size $1/m$ in the limit as $m\to\infty$. It is demonstrated that both definitions are equivalent for stochastic processes that are $\psi^*$-mixing, but may differ in general. In particular, it is shown that for Gaussian processes with essentially-bounded power spectral density (PSD), the proposed information dimension rate equals the Lebesgue measure of the PSD's support. This is in stark contrast to the information dimension rate proposed by Jalali and Poor, which is $1$ if the process's PSD is positive on any set with positive Lebesgue measure, irrespective of its support size.
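To build intuition for the quantity behind both definitions, the scalar (single-sample) case is instructive: Rényi's information dimension of a random variable $X$ is $\lim_{m\to\infty} H(\lfloor mX \rfloor)/\log m$, which equals $p$ for a mixture that is continuous with probability $p$ and discrete with probability $1-p$. The sketch below (illustrative code, not from the paper; the function name and the specific mixture are assumptions) evaluates this entropy ratio exactly for $X = B\cdot U$ with $B \sim \mathrm{Bernoulli}(p)$ and $U \sim \mathrm{Uniform}(0,1)$, and shows the slow convergence of the ratio toward $p$:

```python
import math

def quantized_entropy_mixture(p: float, m: int) -> float:
    """Exact entropy (in nats) of floor(m*X), where X = B*U with
    B ~ Bernoulli(p) and U ~ Uniform(0,1) independent.

    floor(m*X) equals 0 with probability (1-p) + p/m, and equals each
    of 1, ..., m-1 with probability p/m.
    """
    q0 = (1 - p) + p / m        # mass of the quantizer cell containing the atom at 0
    qk = p / m                  # mass of each remaining cell
    return -q0 * math.log(q0) - (m - 1) * qk * math.log(qk)

# The ratio H(floor(m*X)) / log(m) tends to p as m grows:
p = 0.4
for m in (10, 10**3, 10**6):
    ratio = quantized_entropy_mixture(p, m) / math.log(m)
    print(f"m = {m:>8}: H/log(m) = {ratio:.4f}")
```

For $p = 1$ (purely continuous, uniform) the ratio is exactly $1$ for every $m$, while for $p = 0$ (purely discrete) it vanishes as $m\to\infty$; the abstract's two definitions differ in how this per-sample quantity is extended to a stationary process, either by normalizing the block information dimension by $k$, or by taking the entropy *rate* of the quantized process divided by $\log m$.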