Can Time Be Computed? Part I
Why causality might be the real computational primitive
I didn’t arrive at this question through theoretical physics. I arrived through debugging.
Late afternoon. Half-broken distributed system. A race condition that only appeared under load, vanished under tracing, and refused to reproduce deterministically. A value read before it had causally stabilized. A future leaking into the present.
Not a logic bug. A temporal one.
And I remember thinking… not dramatically, just with the quiet unease of someone who has seen this movie too many times:
Every serious failure mode in computation is a failure of ordering, not of truth.
Deadlocks.
Non-termination.
Eventual consistency anomalies.
Heisenbugs.
They’re not about wrong values, they’re about wrong partial orders.
Which slowly pushed me toward a question that felt, at first, like a category error:
Is time something computation happens inside?
Or is it something physics itself must compute?
Because every algorithm presupposes a causal structure.
And physics, increasingly, does not guarantee one.
That’s the fault line I want to explore.
Not metaphorically.
Not operationally.
But where computation stops being a tool, and starts being a claim about reality.
The unspoken axiom of computation
Every formal model of computation begins with a hidden axiom:
There exists a well-founded causal ordering over computational steps.
A Turing machine defines a sequence of configurations, (c_0 \vdash c_1 \vdash c_2 \vdash \dots), indexed by the natural numbers.
A circuit defines a directed acyclic graph with layers. Lambda calculus assumes β-reduction sequences.
Even concurrent and nondeterministic models rely on partial orders: Mazurkiewicz traces, event structures, causal cones.
But nowhere in computation theory do we define time.
We assume it, treating temporal succession as primitive rather than derived.
Which is fine, unless we believe computation is physical, because physics does not give you a global ordering for free.
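To make the hidden axiom concrete, here is a minimal sketch (the three-state toy machine is my own invention, not any particular model): every formalism above boils down to a transition function iterated along a well-founded order, and the loop counter is the implicit clock.

```python
# A minimal sketch of the hidden axiom: a model of computation reduced to a
# transition function iterated along a well-founded order (here, plain ℕ).
# The concrete machine is an arbitrary toy example (a 3-state counter).

def delta(state: int) -> int:
    """One computational step: the 'next' state. 'Next' presupposes an ordering."""
    return (state + 1) % 3

def run(s0: int, steps: int) -> list[int]:
    """Execution = the orbit s0, delta(s0), delta(delta(s0)), ... indexed by ℕ."""
    trace = [s0]
    for _ in range(steps):
        trace.append(delta(trace[-1]))
    return trace

print(run(0, 7))   # [0, 1, 2, 0, 1, 2, 0, 1]
```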
When physics revoked the clock
Classical mechanics offers a global time parameter (t \in \mathbb{R}) and evolution equations of the form:
(\dot{x}(t) = f(x(t), t))
This fits computation beautifully. Discretize time. Simulate. Iterate.
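A minimal sketch of that recipe (the harmonic oscillator and the forward Euler scheme are illustrative choices of mine): the loop index n is exactly the global clock the simulation takes for granted.

```python
import numpy as np

# Discretize time. Simulate. Iterate. The loop index is the assumed global clock.
# Illustrative system: a harmonic oscillator, dx/dt = v, dv/dt = -x,
# stepped with forward Euler.

def f(state, t):
    x, v = state
    return np.array([v, -x])

dt = 1e-3
state = np.array([1.0, 0.0])          # initial condition at t = 0
for n in range(10_000):               # n indexes the assumed well-founded ordering
    t = n * dt
    state = state + dt * f(state, t)  # state_{n+1} = state_n + dt * f(state_n, t_n)

print("state after 10 time units:", state)
```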
But relativity removes global simultaneity. Spacetime is actually a Lorentzian manifold ((M,g)) with only local causal structure: lightcones, not layers.
There exists no preferred foliation into spacelike hypersurfaces. Different observers disagree on temporal order for spacelike-separated events.
Still, maybe computation survives by picking a frame.
Quantum mechanics destabilizes that further. Time evolution is unitary:
(|\psi(t)\rangle = e^{-iHt/\hbar}\,|\psi(0)\rangle)
but measurement is discontinuous, stochastic, and not generated by the Schrödinger equation. There is no closed-form dynamical law for collapse; already a fracture between time and dynamics.
Then quantum gravity removes time entirely. In canonical quantum gravity, the Wheeler–DeWitt equation reads:
(\hat{H}\,\Psi[h_{ij}, \phi] = 0)
with no time derivative. The universe is described by a stationary wavefunctional over spatial geometries and matter fields.
No evolution parameter.
No (t).
No “next”.
Which leaves us staring at something structural:
Computation assumes time as a primitive. Fundamental physics does not.
So either computation is not fundamental, or time is not. Either way, something we thought primitive isn’t.
Can you compute without temporal order?
At first glance, this seems incoherent. Computation is ordered state transition:
(s_{n+1} = \delta(s_n), \quad n \in \mathbb{N})
Remove well-founded succession, and execution collapses. But physics forces us to consider precisely that regime.
In quantum gravity and emergent spacetime models:
Geometry emerges from entanglement structure (AdS/CFT, tensor networks).
Causal relations fluctuate.
The distinction between “earlier” and “later” may not exist at Planck scale.
Which means there may be physically admissible regions of reality where no global or even local temporal ordering exists.
And in such regions, computation, as defined by state transition systems indexed by (\mathbb{N}), is simply undefined.
Not because machines fail.
Because the semantic preconditions of “execution” are absent. This pushed me toward an unsettling reframing:
Maybe computation is not primitive.
Maybe causality is.
Not clocks.
Not time parameters.
Not sequences.
Just the partial order relation:
(x \prec y)
And everything else (time, algorithms, dynamics) is emergent structure layered on top of that relation when it happens to be acyclic, well-founded, and stable.
Which is not guaranteed by physics.
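Here is a small sketch of that point (the event names and dependencies are invented for illustration): an execution schedule can be extracted from a causal relation only when that relation is acyclic; add a single cycle and “execution order” is no longer defined at all.

```python
from graphlib import TopologicalSorter, CycleError

# Two "causal structures" over a handful of events. Execution order is not a
# given; it exists only if the dependency relation happens to be acyclic.
well_behaved = {"b": {"a"}, "c": {"a"}, "d": {"b", "c"}}   # a ≺ b, a ≺ c, {b, c} ≺ d
cyclic       = {"a": {"c"}, "b": {"a"}, "c": {"b"}}        # a ≺ b ≺ c ≺ a

for name, deps in [("well-behaved", well_behaved), ("cyclic", cyclic)]:
    try:
        order = list(TopologicalSorter(deps).static_order())
        print(f"{name}: schedule exists: {order}")
    except CycleError:
        print(f"{name}: no schedule exists, so 'execution' is undefined")
```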
Maybe time is computation
There’s a competing thesis.
Maybe time doesn’t underlie computation. Maybe time is computation.
This idea recurs across physics:
Thermodynamics: entropy increase corresponds to irreversible information erasure.
Quantum information: unitary evolution corresponds to reversible computation.
Holography: spacetime geometry emerges from entanglement patterns.
Tensor network models: geometry is literally computational wiring.
From this view, temporal ordering is nothing but dependency ordering between informational degrees of freedom.
Time is not background, it is output.
Computation presupposes time, and time emerges from computation, which creates a loop that suggests neither is truly fundamental, leaving only consistency as the primitive.
Not evolution, not process, not execution, but the satisfaction of constraints over relational structures, exactly as modern physics is written.
The universe is not something that unfolds like a movie but a solution that exists because all conditions fit together.
Causal structure as a computational resource
In standard computational models, causal structure is trivial: a total or partial order on steps.
But in physics, causal structure is dynamical.
In relativity, causal order depends on metric structure:
(p \preceq q \iff q \in J^+(p))
and (J^+(p)) depends on spacetime geometry.
In quantum mechanics, entangled systems violate classical factorization:
(\rho_{AB} \neq \sum_i p_i \, \rho_A^{(i)} \otimes \rho_B^{(i)})
producing nonlocal correlations without signaling.
In quantum gravity, even the causal order relation may not be well-defined.
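To make the quantum-mechanical point concrete, here is a small numerical check (the measurement angles and the CHSH setup are my own illustrative choice): the singlet state produces correlations that exceed the CHSH bound of 2 obeyed by every classically factorized description, yet none of this can be used to signal.

```python
import numpy as np

# Pauli operators
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def spin(theta):
    """Spin measurement along a direction at angle theta in the x-z plane."""
    return np.cos(theta) * Z + np.sin(theta) * X

# Singlet state (|01> - |10>) / sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(theta_a, theta_b):
    """Correlation <psi| A(theta_a) ⊗ B(theta_b) |psi>."""
    M = np.kron(spin(theta_a), spin(theta_b))
    return np.real(psi.conj() @ M @ psi)

a, a2 = 0.0, np.pi / 2               # Alice's two settings
b, b2 = np.pi / 4, 3 * np.pi / 4     # Bob's two settings

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(f"CHSH value: {S:.3f}  (any classical factorization obeys |S| <= 2)")  # ≈ -2.828
```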
Which means that causality is not free.
It is constrained by physical law.
And if computation is structured causality (state transitions respecting dependency) then physics is not merely the substrate of computation.
Physics defines what computation means.
Which leads to a sharper version of the Physical Church–Turing Thesis:
Every physically realizable causal structure admits an equivalent computational representation.
And that claim is radically non-obvious.
Because some causal structures are cyclic.
Some are indefinite.
Some are globally inconsistent.
And in such worlds, computation (in the Turing sense) does not merely become inefficient. It becomes ill-posed.
Closed timelike curves and the collapse of execution
General relativity admits solutions with closed timelike curves (CTCs), where causal order contains cycles:
(p \prec q \prec \dots \prec p)
In such spacetimes, execution semantics fail. There is no acyclic dependency graph. No notion of “before state” and “after state”.
But when computation in the presence of CTCs was studied, notably by Deutsch (1991) and later by Aaronson and Watrous, something unexpected happened.
Computational power increased, not incrementally, but qualitatively.
With access to a CTC, every problem in PSPACE becomes solvable in polynomial time.
Not because you can iterate faster, but because computation no longer proceeds sequentially.
Instead, the system must satisfy a fixed-point constraint:
(\rho_{\mathrm{CTC}} = \Phi(\rho_{\mathrm{CTC}}))
where (\Phi) is a completely positive trace-preserving map representing the circuit interacting with its own past state.
The computation is not:
(s_0 \to s_1 \to s_2 \to \dots \to s_{\mathrm{halt}})
It is:
(\text{find } \rho \ \text{such that}\ \rho = \Phi(\rho))
Execution is replaced by global consistency, which is precisely how fundamental physics operates.
Einstein’s equations:
(G_{\mu\nu} = 8\pi G \, T_{\mu\nu})
do not evolve geometry: they constrain the entire spacetime manifold.
Quantum path integrals:
(Z = \int \mathcal{D}\phi \; e^{\,i S[\phi]/\hbar})
do not generate trajectories; they sum over all histories consistent with boundary conditions.
Physics does not run but satisfies, suggesting that the deepest form of “computation” in nature is not algorithmic but static, constraint-based, and atemporal.
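Here is a minimal numerical sketch of Deutsch’s consistency condition (the interaction unitary and the input state are arbitrary illustrative choices, and the plain iteration used to find the fixed point is just one convenient method; degenerate circuits would call for solving the eigenvalue-1 problem of the superoperator instead). Nothing is executed step by step; we only search for a state consistent with its own past.

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary fixed interaction unitary on (chronology-respecting qubit) ⊗ (CTC qubit),
# obtained by orthonormalizing a random complex matrix. Purely illustrative.
U, _ = np.linalg.qr(rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4)))
rho_in = np.array([[1, 0], [0, 0]], dtype=complex)   # chronology-respecting qubit in |0><0|

def deutsch_map(rho_ctc):
    """Phi(rho) = Tr_in[ U (rho_in ⊗ rho_ctc) U† ]: the channel whose fixed point
    the CTC qubit's state must be (Deutsch's consistency condition)."""
    joint = U @ np.kron(rho_in, rho_ctc) @ U.conj().T
    joint = joint.reshape(2, 2, 2, 2)          # [in_row, ctc_row, in_col, ctc_col]
    return np.einsum('abac->bc', joint)        # partial trace over the input qubit

# No time steps to "run": just search for a self-consistent state.
rho = np.eye(2, dtype=complex) / 2             # start from the maximally mixed state
for _ in range(1000):
    nxt = deutsch_map(rho)
    if np.linalg.norm(nxt - rho) < 1e-12:
        break
    rho = nxt

print("fixed point rho_CTC:\n", np.round(rho, 4))
print("consistency residual:", np.linalg.norm(deutsch_map(rho) - rho))
```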
The day I stopped thinking the universe executes
I used to imagine the universe as a machine.
Initial conditions.
Evolution law.
Future states.
A cosmic program. Then I suddenly noticed: that’s not how the equations are written.
They don’t say:
(\text{state}(t + \delta t) = F(\text{state}(t)))
They say instead:
(G_{\mu\nu}(x) = 8\pi G \, T_{\mu\nu}(x) \quad \text{for all } x \in M,)
a global constraint over fields on spacetime.
The universe does not step forward; it satisfies equations over four-dimensional structures.
What we call time is not execution but traversal, a worldline slicing a static solution, and what we call dynamics is not computation but projection.
Time is not the medium of physics; it is the interface exposed to embedded observers.
What does “halting” mean without time?
Now things become truly unstable: the Halting Problem assumes a machine starts in state (s_0), executes a sequence of transitions, and may or may not reach a halting state (s_h).
In a timeless universe, there is no start, no sequence, no eventually: only the question of whether a globally consistent configuration exists.
Halting becomes satisfiability, non-halting becomes inconsistency, and there is no semi-decidability, no asymmetry, no waiting: just the existence or non-existence of fixed points.
This suggests something radical: classical undecidability, like the Halting Problem, may not be a fundamental feature of physical law.
This is not because physics violates Turing limits, but because physics does not instantiate the temporal semantics those limits presuppose.
Undecidability may emerge only from timeful computation, not from reality itself.
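As a toy illustration of that reframing (the transition table is my own, and the finiteness of the state space is doing real work here; for genuine Turing machines the corresponding fixed point is not finitely computable, which is exactly where semi-decidability lives): for a finite-state system, halting can be settled as the least fixed point of a backward-reachability operator, with no notion of running or waiting.

```python
# "Halting as a fixed-point question" for a finite-state system: instead of
# running the machine and waiting, compute the least fixed point of a
# backward-reachability operator. The toy transition table is illustrative.

STATES = range(6)
HALTING = {5}
delta = {0: 1, 1: 2, 2: 5, 3: 4, 4: 3, 5: 5}   # 3 <-> 4 is a loop that never halts

def halts_from(s0: int) -> bool:
    """s0 halts iff s0 lies in the least fixed point of
    F(S) = HALTING ∪ { s : delta(s) ∈ S }. No notion of 'waiting' is involved."""
    S = set(HALTING)
    while True:
        grown = S | {s for s in STATES if delta[s] in S}
        if grown == S:                 # fixed point reached
            return s0 in S
        S = grown

print(halts_from(0))   # True  (0 -> 1 -> 2 -> 5)
print(halts_from(3))   # False (3 <-> 4 forever)
```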
The limits of execution
Where does this leave us?
If computation presupposes time, and time is not fundamental, then “algorithm” becomes provisional.
It works inside pockets of reality where causality is well-behaved, decoherence stabilizes information, and entropy defines an arrow.
Outside those pockets, stepwise execution may simply not exist.
The universe may not “run” at all.
It may simply exist: a globally consistent solution to relational constraints, where evolution, dynamics, and causality are emergent features perceived by observers.
Execution is projection. Halting is satisfiability. Complexity is constraint.
Classical computer science tells us what can be computed if time flows like a river.
Physics tells us what is consistent if time is undefined.
Between the two lies a gap: one side speaks about algorithms, the other about existence.
No algorithm, however clever, can escape the possibility that time itself is emergent, and computation exists only because the universe provides the scaffolding to support it.
In Part II, we will explore whether undecidability, chaos, and computational irreducibility are artifacts of timeful computation; or whether they emerge from the very structure of reality itself.




Edit: the Substack LaTeX editor renders formulas quite badly on phone screens, sorry for that; I recommend reading this article on a larger screen. Thanks so much, my friends.