Approximating the noise sensitivity of a monotone Boolean function

Research paper by Ronitt Rubinfeld, Arsen Vasilyan

Indexed on: 16 Apr '19
Published on: 14 Apr '19
Published in: arXiv - Computer Science - Data Structures and Algorithms


The noise sensitivity of a Boolean function $f: \{0,1\}^n \rightarrow \{0,1\}$ is one of its fundamental properties. It depends on a positive noise parameter $\delta$ and is denoted $NS_{\delta}[f]$. Here we study the algorithmic problem of approximating it for monotone $f$ satisfying $NS_{\delta}[f] \geq 1/n^{C}$ for a constant $C$, where $\delta$ satisfies $1/n \leq \delta \leq 1/2$.

For such $f$ and $\delta$, we give a randomized algorithm that performs $O\left(\frac{\min(1,\sqrt{n} \delta \log^{1.5} n) }{NS_{\delta}[f]} \text{poly}\left(\frac{1}{\epsilon}\right)\right)$ queries and approximates $NS_{\delta}[f]$ to within a multiplicative factor of $(1\pm \epsilon)$.

Under the same constraints on $f$ and $\delta$, we also prove a lower bound of $\Omega\left(\frac{\min(1,\sqrt{n} \delta)}{NS_{\delta}[f] \cdot n^{\xi}}\right)$ on the query complexity of any algorithm that approximates $NS_{\delta}[f]$ to within any constant factor, where $\xi$ can be any positive constant. Thus, our algorithm's query complexity is close to optimal in terms of its dependence on $n$.

We introduce a novel descending-ascending view of noise sensitivity, and use it as a central tool in the analysis of our algorithm. To prove lower bounds on query complexity, we develop a technique that reduces computational questions about query complexity to combinatorial questions about the existence of "thin" functions with certain properties. The existence of such "thin" functions is proved using the probabilistic method. These techniques also yield previously unknown lower bounds on the query complexity of approximating other fundamental properties of Boolean functions: the total influence and the bias.
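For intuition, the quantity being approximated has a simple sampling interpretation: $NS_{\delta}[f]$ is the probability that $f(x) \neq f(y)$ when $x$ is uniform in $\{0,1\}^n$ and $y$ is obtained from $x$ by flipping each bit independently with probability $\delta$. The sketch below is the naive Monte Carlo estimator implied by this standard definition, not the paper's query-efficient algorithm; the function `maj3` is an illustrative example of a monotone function.

```python
import random

def noise_sensitivity(f, n, delta, samples=100_000, rng=None):
    """Naive Monte Carlo estimate of NS_delta[f]: the probability that
    f(x) != f(y), where x is uniform in {0,1}^n and y flips each bit of x
    independently with probability delta."""
    rng = rng or random.Random(0)
    disagreements = 0
    for _ in range(samples):
        x = [rng.randint(0, 1) for _ in range(n)]
        # Flip each coordinate of x independently with probability delta.
        y = [b ^ (rng.random() < delta) for b in x]
        disagreements += f(x) != f(y)
    return disagreements / samples

# Example: 3-bit majority, a monotone Boolean function.
maj3 = lambda x: int(sum(x) >= 2)
est = noise_sensitivity(maj3, n=3, delta=0.1)
```

This estimator uses two queries per sample and so needs on the order of $1/NS_{\delta}[f]$ samples just to observe disagreements at all, which is what makes the paper's improvement by a $\min(1,\sqrt{n}\delta\log^{1.5} n)$ factor nontrivial.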