A Universal Scaling Theory for Complexity of Analog Computation

Research paper by Yaniv S. Avizrats, Joshua Feinberg, Shmuel Fishman

Indexed on: 05 Aug '05
Published on: 05 Aug '05
Published in: Physics - Other



Abstract

We discuss the computational complexity of solving linear programming problems by means of an analog computer. The latter is modeled by a dynamical system that converges to the optimal vertex solution. We analyze various probability ensembles of linear programming problems, and for each ensemble we numerically obtain the probability distribution functions of certain quantities that measure the complexity. Remarkably, in the asymptotic limit of very large problems, each of these probability distribution functions reduces to a universal scaling function, depending on a single scaling variable and independent of the details of its parent probability ensemble. These functions are reminiscent of the scaling functions familiar from the theory of phase transitions. The results reported here extend analytical and numerical results obtained recently for the Gaussian ensemble.
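To make the idea of an analog LP solver concrete, the following is a minimal sketch, not the vector field analyzed in the paper: it integrates a replicator-type flow for a toy linear program over the probability simplex, where the flow converges from an interior starting point to the vertex maximizing the objective. The objective vector, step size, and the use of convergence time as a complexity proxy are illustrative assumptions only.

```python
import numpy as np

def replicator_flow(c, x0, dt=1e-3, tol=1e-8, max_steps=200_000):
    """Integrate dx_i/dt = x_i * (c_i - c.x) with explicit Euler steps.

    This simple flow stays on the simplex and, from an interior start,
    converges to the vertex maximizing c (illustrative analog dynamics,
    not the specific dynamical system studied in the paper).
    """
    x = np.array(x0, dtype=float)
    for step in range(max_steps):
        avg = c @ x
        dx = x * (c - avg)
        x = x + dt * dx
        if np.linalg.norm(dx) < tol:      # flow has essentially halted at a vertex
            return x, step * dt           # final state and "analog" convergence time
    return x, max_steps * dt

# Toy instance: maximize c.x over the probability simplex (hypothetical example).
rng = np.random.default_rng(0)
n = 8
c = rng.normal(size=n)                    # random objective, a Gaussian-like draw
x0 = np.full(n, 1.0 / n)                  # start at the barycenter of the simplex
x_star, t_conv = replicator_flow(c, x0)

print("vertex reached:", np.argmax(x_star), "expected:", np.argmax(c))
print("convergence time (complexity proxy):", t_conv)
```

In this sketch the convergence time of the flow plays the role of a complexity measure; repeating the experiment over many random draws of the objective would yield an empirical distribution of such times, which is the kind of quantity whose ensemble statistics and scaling behavior the paper studies.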