Indexed on: 17 Nov '09 · Published on: 17 Nov '09 · Published in: Quantum Physics
In previous work, we showed that a quantum algorithm is the sum over the histories of a classical algorithm that knows in advance 50% of the information about the solution of the problem; each history is a possible way of acquiring the advanced information together with a possible result of computing the missing information. We gave a theoretical justification of this 50% advanced-information rule and checked that it holds for a large variety of quantum algorithms. Here we discuss the theoretical justification in further detail and counter a possible objection. We show that the rule generalizes a simple, well-known explanation of quantum nonlocality, in which the logical correlation between measurement outcomes is physically backed by a causal, deterministic, local process whose causality is allowed to run backward in time through backdated state-vector reduction. The possible objection is that quantum algorithms often produce the solution of the problem in an apparently deterministic way: their unitary part produces an eigenstate of the observable to be measured, and measurement then yields the corresponding eigenvalue (the solution) with probability 1, while the present explanation of the speed-up relies on the nondeterministic character of quantum measurement. We show that this objection mistakes the nondeterministic production of a definite outcome for a deterministic production.