Abstract
Computation separates time from space: nondeterministic problems demand exponential time (the "Time Dragon") yet are polynomially simulable in space (the "Space Dragon"), as formalized by Savitch's theorem (NPSPACE = PSPACE). We propose that the brain physically instantiates this theorem through Recursive Condensation, a topological mechanism that converts intractable high-dimensional search into efficient low-dimensional navigation. Drawing on Urysohn's Lemma, we demonstrate that separability is a property of connectivity, not volume: a stable decision boundary exists independently of ambient dimension, provided the underlying manifolds are topologically disjoint. To manufacture this disjointness, the cortex employs a parity-alternation strategy: it alternates between odd-parity metric expansion (exploratory search), which untangles local geometry, and even-parity topological contraction (closure/condensation), which locks in validated invariants. This cycle acts as a biological "Savitch Machine," mediating a Topological Trinity Transformation (TTT), Search → Closure → Navigation, that compiles high-entropy exploration paths into low-energy quotient tokens. Under Memory-Amortized Inference (MAI), the cortex slays the Space Dragon by collapsing vast state spaces into compact metric singularities, and tames the Time Dragon by memoizing these traversals as structural priors. Evolution's "cheat code" (linear cortical growth yielding exponential cognitive gain) thus emerges as a physical law of topological inference: exponential search in time becomes polynomial reuse in space via recursive metric collapse.
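The space-time trade invoked above is concrete: the proof of Savitch's theorem replaces an exponential-time nondeterministic search with a divide-and-conquer reachability recursion whose depth, not breadth, bounds memory use. The sketch below (an illustrative toy, not part of the paper's model; the graph `g` and function names are ours) shows the midpoint recursion: reachability within 2^k steps reduces to two subproblems of 2^(k-1) steps, so only O(k) stack frames are ever live at once.

```python
def reach(graph, u, v, k):
    """True iff v is reachable from u in at most 2**k steps.

    Recursion depth is k = O(log T) for a time-T search, and each
    frame stores O(1) vertices, so the exponential-time search is
    simulated in polynomial (here, logarithmic-depth) space.
    """
    if u == v:
        return True
    if k == 0:
        return v in graph.get(u, ())
    # Guess a midpoint w: nondeterministic choice becomes a
    # deterministic loop that reuses the same stack space per guess.
    return any(
        reach(graph, u, w, k - 1) and reach(graph, w, v, k - 1)
        for w in graph
    )

# Toy configuration graph: a -> b -> c -> d
g = {"a": ["b"], "b": ["c"], "c": ["d"], "d": []}
print(reach(g, "a", "d", 2))  # path of length 3 <= 2**2, prints True
print(reach(g, "d", "a", 2))  # no path back, prints False
```

The "Savitch Machine" framing in the abstract maps onto this shape: exploratory search corresponds to enumerating midpoints, while condensation corresponds to collapsing a verified sub-path into a single reusable fact.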