Can Quantum Physics Compute Everything?
Why Quantum Mechanics might be the ultimate algorithmic boundary
Let me start by wishing you a great beginning to the year.
The final months of last year pulled me, almost against my will, back into questions I thought I had already filed away as “foundational but settled”.
Long nights rereading old papers, bouncing between physics and computer science, circling again around quantum mechanics: not as a toolbox, but as a worldview.
What does reality actually do?
Not metaphorically. Not operationally. But at its core.
My fascination with quantum mechanics has always been less about weirdness and more about structure: the rigid logic beneath the strangeness. Hilbert spaces, operators, amplitudes, interference.
Then quantum computation entered the picture: not as science fiction, but as a brutally precise theory of what physical law allows computation to be.
At the same time, I found myself drifting outward. Pulsars ticking with absurd regularity.
Gamma-ray bursts dumping, in a matter of seconds, as much energy as the Sun will emit over its entire lifetime.
Cosmic rays arriving at Earth as silent witnesses of violent, distant processes.
Everywhere I looked, the universe appeared not just dynamic but algorithmic. Rule-driven. Constrained. Relentlessly lawful.
And that’s where this thought experiment begins.
Forget simulations.
Forget consciousness.
Forget metaphors about the universe “being like” a computer.
Ask something more precise, and much more dangerous:
Is every physically possible process also computationally possible?
Not in the engineering sense. Not “can we build it.”
But in the strict theoretical sense of what kinds of functions nature itself can compute.
Because if physics can compute more than a Turing machine, then computer science is not the theory of computation; it is merely a local approximation.
And if physics cannot, even with quantum mechanics, even with infinite precision, even with infinite patience, then there exist truths about the universe that are, in principle, unreachable.
No algorithm.
No experiment.
No clever encoding.
Just hard limits.
That is the terrain I want to explore with you in the next months.
The Church–Turing Thesis
Every computer scientist grows up with the Church–Turing thesis. Roughly stated:
Anything that can be effectively computed can be computed by a Turing machine.
This statement often gets treated as a law, but it isn’t one. It’s not even a theorem. It’s a convergence claim, an empirical observation about formal systems.
Church proposed it in the language of lambda calculus. Turing proposed it via abstract machines.
Gödel, Kleene, Post, and others arrived at equivalent notions through recursion theory and formal logic.
Different motivations.
Different formalisms.
Same computational power.
That convergence is the real content of the thesis. It suggests that “effective computation” is a robust concept, largely insensitive to the details of representation.
If a procedure is mechanical, finitely describable, and stepwise, then it collapses into the same equivalence class.
So far, the thesis has survived every purely mathematical attempt to escape it.
But here’s the subtle point that often gets glossed over: Church–Turing is not a statement about reality. It is a statement about what follows from a certain idea of algorithmic procedure.
It assumes:
discrete symbols
finite programs
local, step-by-step execution
exact rules applied indefinitely
None of these assumptions are derived from physics. They are imposed a priori. A Turing machine is, in fact, not a model of nature. It’s a model of formal reasoning.
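To make those assumptions concrete, here is a minimal sketch of the machine they describe: a finite rule table, discrete symbols on a tape, and strictly local, step-by-step execution. The particular program (a toy that appends one symbol to a unary string) is purely illustrative.

```python
# A minimal Turing machine sketch: discrete symbols, a finite rule table,
# and strictly local, step-by-step execution. The example program appends
# a single '1' to a unary string (it computes n -> n + 1).

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    tape = list(tape)
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            return "".join(tape).strip(blank)
        symbol = tape[head] if head < len(tape) else blank
        if head >= len(tape):
            tape.append(blank)
        new_symbol, move, new_state = rules[(state, symbol)]
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
        if head < 0:
            tape.insert(0, blank)
            head = 0
        state = new_state
    raise RuntimeError("step budget exhausted")

# Rule table: (state, read symbol) -> (write symbol, move, next state)
rules = {
    ("start", "1"): ("1", "R", "start"),   # scan right over the input
    ("start", "_"): ("1", "R", "halt"),    # write one more '1', then halt
}

print(run_turing_machine(rules, "111"))  # -> '1111'
```

Every ingredient of the thesis is visible here: a finite program, a discrete alphabet, one local move at a time.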
The moment you ask whether the universe itself computes, something shifts.
You are no longer asking what functions are definable.
You are asking what functions are physically realizable.
This is where the Church–Turing thesis quietly mutates into a much stronger claim:
Anything that can be physically computed can be computed by a Turing machine.
This is the Physical Church–Turing Thesis.
And unlike its mathematical ancestor, it is neither obvious nor stable.
Because now you’re no longer protected by abstraction. You inherit all the messiness of the real world: continuity, noise, quantum uncertainty, relativistic spacetime, infinite state spaces, singularities.
Physics does not operate in discrete steps.
It does not respect clean symbol boundaries.
It does not come with a clocked execution model.
So the real question becomes: does nature secretly implement something equivalent to a Turing machine, or does it do something strictly more powerful?
Historically, many physicists assumed nature was no more powerful than a Turing machine, mostly by default.
Classical mechanics seemed simulable.
Quantum mechanics, despite its strangeness, was still linear, unitary, and describable by differential equations. Even randomness could be treated as probabilistic computation.
Then cracks started to appear.
General relativity allows spacetimes with extreme causal structures. Some theoretical solutions, the so-called Malament–Hogarth spacetimes, let one observer witness in finite proper time the entire infinite history of another worldline.
In such spacetimes, the usual assumptions about step-by-step computation break down.
Analog systems raise similar problems. Differential equations over real numbers can, in principle, encode infinite information in initial conditions.
If nature truly instantiated real numbers with infinite precision, then it could compute functions that no discrete machine ever could.
And quantum mechanics complicates things even further.
Measurement, entanglement, and nonlocal correlations don’t map cleanly onto classical algorithmic models. They don’t obviously violate Turing limits—but they don’t obviously respect them either.
The Physical Church–Turing thesis survives here not as a theorem, but as a working hypothesis.
A very strong one. But why does this matter?
Because if the Physical Church–Turing thesis is false, then computation theory is incomplete as a theory of nature. There would exist physically realizable processes whose behavior no algorithm can predict, simulate, or decide.
Not because they’re chaotic.
Not because they’re complex.
But because they compute outside the Turing boundary.
And if the thesis is true, if even the universe itself cannot escape those limits, then undecidability and incompleteness are not quirks of formal systems.
They are woven into reality.
That’s not just a technical distinction.
It’s a statement about what can be known, predicted, or explained, even in principle.
And once you accept that framing, the question is no longer whether computers can simulate physics.
The question is whether physics itself is bound by computation.
Quantum Mechanics Enters the Chat
Quantum computation forced us to confront a subtle but profound idea: the universe may compute, but it computes differently than classical intuition predicts.
A quantum computer does not increase the set of computable functions beyond what a classical Turing machine can compute.
In particular, it still cannot solve undecidable problems such as the Halting Problem or determine the truth of arbitrary Gödelian statements. Quantum mechanics does not overthrow the Church–Turing thesis in principle.
But it radically reshapes computational complexity, the resources required to perform computation.
Problems long believed to be classically intractable, such as integer factorization and discrete logarithms, collapse into polynomial time under Shor's algorithm, and certain simulations of quantum systems become efficient on quantum hardware.
Even unstructured search, which classically requires O(N) queries, is accelerated to O(√N) by Grover's algorithm.
These improvements are not mere mathematical tricks; they reflect a deep physical fact: the structure of quantum mechanics itself is a computational resource.
Superposition allows a single quantum system to “exist” in many states at once. Interference allows amplitudes to reinforce correct answers and cancel incorrect ones.
Entanglement links distant subsystems in ways that classical bits cannot. In other words, the universe itself, at its core, does more than sequential symbol manipulation; it allows parallelism, coherence, and interference to shape computation directly.
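To see interference doing computational work, here is a small numerical sketch of Grover's algorithm on an unstructured search over N = 16 items, simulated directly on the statevector with NumPy. The marked index and the iteration count are illustrative choices; the point is that after roughly (π/4)·√N iterations the amplitude of the marked item dominates.

```python
import numpy as np

# Toy Grover search over N = 16 items, simulated directly on the statevector.
# Interference boosts the amplitude of the marked index and suppresses the rest.

N = 16
marked = 11                           # the item we are "searching" for

state = np.full(N, 1 / np.sqrt(N))    # uniform superposition over all indices

def oracle(psi):
    """Flip the sign of the amplitude on the marked item."""
    psi = psi.copy()
    psi[marked] *= -1
    return psi

def diffusion(psi):
    """Reflect all amplitudes about their mean (the Grover diffusion step)."""
    return 2 * psi.mean() - psi

iterations = int(round(np.pi / 4 * np.sqrt(N)))   # about 3 for N = 16
for _ in range(iterations):
    state = diffusion(oracle(state))

probabilities = state ** 2
print(f"{iterations} iterations, P(marked) = {probabilities[marked]:.3f}")
# Roughly 0.96 after 3 iterations, versus 1/16 = 0.0625 for a blind guess.
```

Nothing here computes an uncomputable function; the gain is entirely in how few queries interference needs.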
This raises the unavoidable question:
if nature already bends computational complexity in such a profound way, why should it stop at what quantum mechanics allows?
Could there be deeper physical processes that are, in principle, even more computationally powerful than quantum mechanics itself?
Analog Computation and Infinite Precision
One obvious escape route from classical Turing limits is analog computation. If a physical system could represent real numbers with infinite precision, it could encode solutions to non-recursive problems.
For example, an infinite-precision real number could encode the solution to the Halting Problem or an undecidable statement in its infinite decimal expansion.
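One standard way to make that precise is to pack the halting set into a single real number. This is a definitional sketch, not a claim that any physical system instantiates such a number:

```latex
% A "halting real": the n-th binary digit records whether the n-th Turing
% machine halts on empty input. The number is perfectly well defined,
% yet no algorithm can output its digits.
h \;=\; \sum_{n=1}^{\infty} \frac{h_n}{2^{\,n}},
\qquad
h_n \;=\;
\begin{cases}
1 & \text{if the } n\text{-th Turing machine halts on empty input},\\
0 & \text{otherwise.}
\end{cases}
```

Any process that could read out the digits of h to arbitrary depth would decide the Halting Problem, which is exactly the power the rest of this section argues nature does not provide.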
Historically, this idea appears in several forms:
Continuous dynamical systems, where trajectories can encode computation in phase space.
Real-number computational models, studied by Blum, Shub, and Smale, which formalize computation over the real numbers ℝ.
Differential equation solvers with unbounded precision, where initial conditions carry infinite information.
At first glance, these models seem to suggest that physics could, in principle, compute beyond Turing limits.
But nature pushes back. Quantum mechanics imposes the Heisenberg uncertainty principle, limiting simultaneous precision of conjugate variables.
Thermodynamics limits information density: no finite energy system can encode arbitrarily many bits. Noise, decoherence, and classical chaos amplify tiny perturbations, making infinite precision operationally impossible.
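The thermodynamic ceiling can be stated quantitatively. The Bekenstein bound limits the entropy, and hence the number of reliably distinguishable states, of any system with energy E confined to a region of radius R:

```latex
% Bekenstein bound: maximum entropy of a system with energy E inside radius R,
% and the corresponding cap on the number of storable bits.
S \;\le\; \frac{2\pi k_B R E}{\hbar c}
\quad\Longrightarrow\quad
\text{bits} \;\lesssim\; \frac{2\pi R E}{\hbar c \ln 2}
```

Finite energy in a finite region means finitely many distinguishable configurations, which is precisely what rules out storing an infinite-precision real.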
So while hypercomputation via naïve continuity is mathematically plausible, it appears physically unattainable. Real numbers may exist in equations, but nature never hands us all their digits.
Quantum Measurement as Computation?
A more subtle possibility arises from quantum measurement itself. Unlike unitary evolution, measurement is irreversible and probabilistic. The wavefunction collapses in ways that are not fully formalized as a deterministic algorithm.
Some physicists and philosophers have speculated: could measurement itself perform computation beyond a Turing machine?
In other words, could a cleverly prepared quantum system “decide” problems that no algorithm can resolve? Could it act as a hypercomputer embedded in the laws of nature?
So far, evidence points to no. Quantum mechanics does provide genuine randomness, but randomness is not hypercomputation.
A random oracle samples outcomes; it does not guarantee solutions to undecidable problems. You can’t force the universe to give you the answer to the Halting Problem just by measuring entangled particles.
Quantum mechanics gives us new tools, but within strict rules. There is no free lunch, no hidden oracle tucked inside a wavefunction.
Measurement is very powerful, but still Turing-bound.
Complexity as a Physical Law
This leads to a subtle but profound shift in perspective: perhaps the lesson is not that physics can compute more, but that physics itself enforces computational limits.
Traditionally, we think of complexity as an abstract mathematical notion, a way of measuring how hard it is to solve a problem on paper or with a computer.
But if computation is grounded in physical law, then complexity becomes a physical quantity. It is limited by energy, entropy, spacetime geometry, and the fundamental constants of nature.
Laws of physics are no longer merely differential equations describing how systems evolve; they are constraints on what can be physically computed, and how efficiently.
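Perhaps the simplest quantitative expression of this shift is the Margolus–Levitin theorem: a system with average energy E above its ground state cannot pass through more than about 2E/πħ mutually orthogonal (that is, reliably distinguishable) states per second, so energy itself caps the rate of elementary operations.

```latex
% Margolus-Levitin bound: maximum number of orthogonal state transitions
% per second for a system with average energy E above its ground state.
\nu_{\max} \;\le\; \frac{2E}{\pi \hbar}
```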
Concrete examples of this principle are abundant:
Black holes appear to be the ultimate information scramblers. The “fast-scrambling conjecture” suggests that black holes mix information as quickly as any quantum system allowed by physics.
In practical terms, this means that even if a black hole encodes a computation perfectly, the information becomes effectively irretrievable almost immediately. This is a limit on computation imposed by spacetime and quantum mechanics simultaneously: physics dictates how fast and efficiently information can propagate and mix.
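For orientation, the conjectured scrambling time for a black hole of temperature T and entropy S grows only logarithmically with the number of degrees of freedom (the Sekino–Susskind estimate, quoted here up to order-one factors):

```latex
% Conjectured fast-scrambling time for a black hole of temperature T and entropy S.
t_{*} \;\sim\; \frac{\hbar}{k_B T} \, \ln S
```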
Quantum gravity seems to enforce hard upper bounds on information density. The Bekenstein-Hawking formula for black hole entropy shows that the maximum entropy, or information content, of a region of space scales with the area of its boundary, not its volume. This is not just a curiosity: it implies a fundamental ceiling on how much computation can occur per unit of spacetime, a limit written directly into the geometry of the universe itself. There is no way to “cheat” by packing more bits into a smaller volume; physics forbids it.
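Written out, the formula says the entropy is the horizon area measured in Planck units, divided by four:

```latex
% Bekenstein-Hawking entropy: information content scales with horizon area.
S_{\mathrm{BH}} \;=\; \frac{k_B c^3 A}{4 G \hbar} \;=\; k_B \, \frac{A}{4 \ell_P^2},
\qquad \ell_P^2 = \frac{G \hbar}{c^3}
```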
Spacetime geometry and holography hint at even deeper computational constraints. The AdS/CFT correspondence suggests that the dynamics of an entire volume of spacetime can be fully described by degrees of freedom on its boundary.
In other words, the “bulk” of the universe is computationally reducible to its “skin,” implying that space itself encodes the limits of computation. This is not merely metaphorical: causal structure, curvature, and horizon dynamics all contribute to how information is processed, stored, and transmitted.
Viewed through this lens, undecidability is not a bug of mathematics; it is a feature of reality.
Some questions are unknowable not because we lack clever algorithms, computing power, or patience, but because physical law itself forbids the existence of a process that could answer them.
No matter how powerful a quantum computer, how intricate a simulation, or how infinite our patience, some computations are simply beyond nature’s ability to realize.
For instance, there may be dynamical evolutions of quantum fields that are provably irreducible: no sequence of operations could predict them faster than waiting for them to unfold.
Chaotic systems may be intrinsically computationally irreducible: predicting their future states exactly may require simulating every step in real time.
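A toy illustration of why exact prediction is so costly: in the chaotic logistic map, two initial conditions that differ by one part in 10^12 disagree completely after a few dozen iterations, so staying exact means tracking every digit at every step. The map and parameters below are a standard textbook choice, not anything specific to the physics discussed here.

```python
# Two trajectories of the chaotic logistic map x -> 4x(1 - x), started
# a mere 1e-12 apart. The gap grows roughly exponentially with each step,
# so long-range prediction effectively requires simulating every step.

def logistic(x, steps):
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
    return x

x0 = 0.3
for steps in (10, 25, 40, 55):
    a = logistic(x0, steps)
    b = logistic(x0 + 1e-12, steps)
    print(f"after {steps:2d} steps: |difference| = {abs(a - b):.3e}")
```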
And even the universe’s most extreme conditions, like the interiors of black holes or the very early inflationary epoch, may encode patterns and outcomes forever hidden from any observer, because the computational processes required to extract them cannot exist in our spacetime.
In short, the universe may be a computation, but it is a computation under strict physical rules, and those rules set absolute boundaries on what can ever be known or predicted.
Understanding these limits is not merely an academic exercise; it reshapes our conception of physics, complexity, and even the ultimate scope of science itself.
The Nightmare Scenario
The implications are, frankly, unsettling.
If the universe respects Turing limits, there exist well-defined physical questions that cannot be answered, even in principle.
Not because we are technologically limited, not because we lack intelligence, but because no physical process exists that could compute the answer.
Some patterns may be irreducible, some laws may be incompressible, and some predictions are fundamentally out of reach.
Reality itself may be computationally opaque: vast regions of possible knowledge are forever hidden from any observer, regardless of ingenuity, energy, or patience.
Where Does This Leave Physics?
If physics is constrained by computation, the dream of a “Theory of Everything” appears naive.
Equations may exist that perfectly describe reality, but solving them, predicting exact outcomes, may be impossible, even in principle. We could simulate approximations indefinitely without ever converging to exact truth.
Understanding in the classical sense, as a guarantee of full predictability, may be fundamentally unattainable.
In Part II, I want to push these ideas further:
Could undecidability appear directly in physical law?
Is chaos just computational irreducibility in disguise?
And if prediction itself is bounded, what does that mean for science as an enterprise?
Because if nature computes, but only within strict algorithmic limits, then the universe is not just mysterious.
It is provably unknowable in places.
And that is not philosophy. That is computer science catching up with physics.