Sharpening Occam’s Razor with Quantum Mechanics

Arxiv – Sharpening Occam’s Razor with Quantum Mechanics

Researchers have discovered a new way in which computers based on quantum physics could beat the performance of classical computers. The work, by researchers based in Singapore and the UK, implies that a Matrix-like simulation of reality would require less memory on a quantum computer than on a classical computer. Researchers know how to calculate the amount of information that any stochastic process inherently carries from its past to its future. In theory, this sets the minimum amount of information needed to simulate the process. In practice, however, classical simulations of stochastic processes require more storage than this.
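For readers who want the quantities behind that claim: in computational mechanics, the information a process inherently carries from its past to its future is its excess entropy E, and the memory an optimal classical simulator must hold is its statistical complexity C_mu. The perturbed-coin figures below are a worked illustration using those standard definitions; the flip probability p is an illustrative parameter, not a number from the paper.

E = I(\overleftarrow{X} ; \overrightarrow{X}), \qquad C_\mu = H[\Pr(S)]

For a coin that is flipped with probability p at each time step, the two causal states ("last face was heads" / "last face was tails") are equally likely, so

C_\mu = 1 \text{ bit}, \qquad E = 1 - h(p), \qquad h(p) = -p \log_2 p - (1-p) \log_2 (1-p).

For p near 1/2, E approaches zero while C_mu stays at a full bit: that gap is the classical waste described above.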

Gu, Wiesner, Rieper and Vedral (the last of whom is also affiliated with the University of Oxford, UK) showed that quantum simulators need to store less information than the optimal classical simulators. That is because a quantum simulation can encode the relevant probabilities in a “superposition”, where one quantum bit of information can represent more than one classical bit.
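A small numerical sketch makes the gap concrete. The script below is my own illustration of the perturbed-coin construction described in the paper: it builds the two quantum causal states as non-orthogonal qubit states, computes the von Neumann entropy of their equal mixture, and compares it with the classical statistical complexity and the excess entropy. The function and variable names are mine, not the authors'.

import numpy as np

def binary_entropy(p):
    """Shannon entropy (in bits) of a biased coin."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

p = 0.3  # probability that the coin is flipped at each step (illustrative value)

# Classical epsilon-machine: two causal states ("last outcome was 0/1"),
# each occupied half the time, so the stored memory is a full bit.
C_classical = 1.0

# Excess entropy: the information the past actually carries about the future.
E = 1.0 - binary_entropy(p)

# Quantum causal states for the perturbed coin:
# |S_0> = sqrt(1-p)|0> + sqrt(p)|1>,  |S_1> = sqrt(p)|0> + sqrt(1-p)|1>.
S0 = np.array([np.sqrt(1 - p), np.sqrt(p)])
S1 = np.array([np.sqrt(p), np.sqrt(1 - p)])

# The quantum simulator holds an equal mixture of the two causal states;
# its memory is the von Neumann entropy of that mixture.
rho = 0.5 * np.outer(S0, S0) + 0.5 * np.outer(S1, S1)
eigenvalues = np.linalg.eigvalsh(rho)
C_quantum = -sum(ev * np.log2(ev) for ev in eigenvalues if ev > 1e-12)

print(f"E           = {E:.3f} bits")
print(f"C_quantum   = {C_quantum:.3f} bits")
print(f"C_classical = {C_classical:.3f} bits")
# Expected ordering for 0 < p < 1/2:  E <= C_quantum < C_classical

Running it with p = 0.3 gives roughly E = 0.12 bits, C_quantum = 0.25 bits and C_classical = 1 bit, exactly the ordering described in the article: the quantum memory beats the classical bound but still exceeds the excess entropy.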

What surprised the researchers is that the quantum simulations are still not as efficient as they could be: they still have to store more information than the process would seem to need.

That suggests quantum theory might not be the most efficient theory possible. “What’s fascinating to us is that there is still a gap. It makes you think, maybe here’s a way of thinking about a theory beyond quantum physics,” says Vedral.

Mathematical models are an essential component of quantitative science. They generate predictions about the future, based on information available in the present. In the spirit of Occam’s razor, simpler is better; should two models make identical predictions, the one that requires less input is preferred. Yet, for almost all stochastic processes, even the provably optimal classical models waste information. For each bit of predictive output, they generally require more than a single bit of input. We systematically construct quantum models that break this classical bound. This indicates that the system of minimal entropy that simulates such processes must necessarily feature quantum dynamics, and that many observed phenomena could be significantly simpler than classically possible should quantum effects be involved.
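Stated compactly, with C_q denoting the entropy of the quantum simulator's internal memory and using the notation introduced above, the result as summarized here is

E \;\le\; C_q \;<\; C_\mu

for any process whose optimal classical model is not reversible; and for the construction in the paper the left inequality is also generally strict, which is the residual gap discussed above.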

In this article, we have demonstrated that any stochastic process with no reversible classical model can be further simplified by quantum processing. Such stochastic processes are almost ubiquitous. Even the statistics of perturbed coins can be simulated by a quantum system of reduced entropy. In addition, the quantum reconstruction can be remarkably simple. The capacity to make single-photon measurements, for example, allows construction of a quantum epsilon machine that simulates a perturbed coin with reduced energy expenditure. This allows potential for experimental validation with present-day technology.
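To see why a single qubit suffices for the perturbed coin, here is a classical Monte-Carlo sketch of a measure-and-reprepare loop for such a quantum epsilon machine. It is my own illustration: the simple "measure in the computational basis, then reprepare" protocol and the names used are assumptions about one way the construction can be run, not a description of the authors' experimental proposal.

import random

p = 0.3  # flip probability of the perturbed coin (illustrative)

def quantum_causal_state(last_outcome):
    """Amplitudes of |S_j> = sqrt(1-p)|j> + sqrt(p)|1-j> for the perturbed coin."""
    amp_same = (1 - p) ** 0.5   # amplitude of emitting the same face again
    amp_flip = p ** 0.5         # amplitude of emitting the flipped face
    return {last_outcome: amp_same, 1 - last_outcome: amp_flip}

def step(last_outcome):
    """Measure the causal state in the computational basis and emit the outcome."""
    amplitudes = quantum_causal_state(last_outcome)
    # Born rule: each outcome occurs with probability equal to its squared amplitude.
    return 0 if random.random() < amplitudes[0] ** 2 else 1

# Run the machine and check that flips occur with probability close to p.
outcome = 0
flips = 0
steps = 100_000
for _ in range(steps):
    new_outcome = step(outcome)
    flips += (new_outcome != outcome)
    outcome = new_outcome

print(f"observed flip rate: {flips / steps:.3f} (target {p})")

The point of the quantum version is that the state stored between steps is one of two non-orthogonal qubit states rather than a full classical bit, which is where the entropy saving in the previous sketch comes from.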

This result has significant implications. Stochastic processes play a ubiquitous role in the modeling of dynamical systems that permeate quantitative science, from climate fluctuations to chemical reaction processes.

Any computation is physical. To implement the mathematical model of a particular observable phenomenon, we (a) encode the inputs of the model within a suitable physical system, (b) evolve the system according to some physical laws and (c) retrieve the predictions of the model by appropriate measurement. Thus, if a model demands that a particular piece of data be input, the physical implementation of that model must have the capacity to store that data.

Classically, the statistical complexity C_μ is employed as a measure of how much structure a given process exhibits. The rationale is that the optimal simulator of such a process requires at least this much memory. The fact that this memory can be reduced quantum mechanically implies the counterintuitive conclusion that quantizing such simulators can reduce their complexity beyond this classical bound, even if the process they’re simulating is purely classical. Many organisms and devices operate on their ability to predict, and thus react to, the environment around them; the fact that identical predictions can be made with less memory by exploiting quantum dynamics implies that such systems need not be as complex as one originally thought.

This improved efficiency has direct physical consequences. Landauer’s principle implies that recording information incurs a fundamental energy cost (since this information will eventually need to be erased). In addition, we may exploit the non-orthogonality of quantum causal states to directly reduce the cost of encoding the past in certain physical implementations (such as the quantum optical regime). Thus, the potential for quantum systems to simulate a certain statistical output with reduced system entropy lowers the fundamental energy requirements of a stochastic computation.
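As a rough sense of scale (my own back-of-the-envelope numbers, not from the paper): Landauer's bound puts the minimum cost of erasing one bit at k_B T ln 2, so a reduction in the simulator's memory entropy translates directly into joules saved per simulation step.

from math import log

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K
landauer_bound = k_B * T * log(2)   # minimum energy to erase one bit, in joules

# Entropy saving for the p = 0.3 perturbed coin from the earlier sketch:
# C_classical = 1 bit versus C_quantum of about 0.25 bits.
saved_bits = 1.0 - 0.25
print(f"Landauer bound per bit: {landauer_bound:.2e} J")
print(f"Minimum energy saved per memory update: {saved_bits * landauer_bound:.2e} J")

A couple of zeptojoules per step is tiny, but the principle is what matters: the entropy of the simulator sets a fundamental floor on its energy budget, and that floor is lower for the quantum simulator.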

This leads to an open question: is it always possible to find an ideal predictive model? Certainly, the results above show that our construction, while superior to any classical alternative, is still not wholly reversible. While this irreversibility may indicate that more efficient, ideal quantum models exist, it is also possible that ideal models remain forbidden within quantum theory. Both cases are interesting. The former would indicate that the notion of stochastic processes ‘hiding’ information from the present is merely a construct of inefficient classical probabilistic models, while the latter hints at a source of temporal asymmetry within the framework of quantum mechanics: that it is fundamentally impossible to simulate certain observable statistics reversibly.
