The two dragons of cognition: recursive condensation for predictive processing


Abstract

Computation separates time from space: problems solvable nondeterministically appear exponential in time (the "Time Dragon") but are polynomially simulable in space (the "Space Dragon"), as formalized by Savitch's theorem (NPSPACE ⊆ PSPACE). We propose that the brain physically instantiates this theorem through Recursive Condensation, a topological mechanism that converts intractable high-dimensional search into efficient low-dimensional navigation. Drawing on Urysohn's Lemma, we argue that separability is a property of connectivity, not volume: a stable decision boundary exists independent of ambient dimension, provided the underlying manifolds are topologically disjoint. To manufacture this disjointness, the cortex employs a parity-alternation strategy: it alternates between odd-parity metric expansion (exploratory search), which untangles local geometry, and even-parity topological contraction (closure/condensation), which locks in validated invariants. This cycle acts as a biological "Savitch Machine," mediating a Topological Trinity Transformation (TTT), Search→Closure→Navigation, that compiles high-entropy exploration paths into low-energy quotient tokens. Under Memory-Amortized Inference (MAI), the cortex slays the Space Dragon by collapsing vast state spaces into compact metric singularities, and tames the Time Dragon by memoizing these traversals as structural priors. Evolution's "cheat code" (linear cortical growth yielding exponential cognitive gain) thus emerges as a physical law of topological inference: exponential search in time becomes polynomial reuse in space via recursive metric collapse.
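The time-for-space trade at the heart of Savitch's theorem can be made concrete with its standard proof construction, which the abstract's "Savitch Machine" metaphor invokes. The sketch below is illustrative only (the graph, function names, and data are our assumptions, not the paper's code): it decides s-t reachability by recursively guessing a path midpoint, so it retries exponentially many midpoint combinations over time but holds only O(log k) recursion frames in space at once.

```python
# Minimal sketch of Savitch-style reachability (illustrative assumptions,
# not the authors' implementation). A path of length <= k from u to v exists
# iff some midpoint m splits it into halves of length <= ceil(k/2) and
# <= floor(k/2); recursing on both halves reuses the same small stack.

def reachable(nodes, adj, u, v, k):
    """True if v is reachable from u in at most k edges (Savitch recursion)."""
    if k == 0:
        return u == v
    if k == 1:
        return u == v or v in adj.get(u, ())
    half = (k + 1) // 2  # ceil(k/2): length bound for the first leg
    # Deterministically try every vertex as the guessed midpoint:
    # exponential in time, but only logarithmic recursion depth in space.
    return any(
        reachable(nodes, adj, u, m, half) and reachable(nodes, adj, m, v, k - half)
        for m in nodes
    )

# Tiny demo graph (illustrative): a -> b -> c -> d
nodes = ["a", "b", "c", "d"]
adj = {"a": ["b"], "b": ["c"], "c": ["d"]}
```

In the abstract's terms, the recursion replaces an explicit frontier of visited states (the Space Dragon) with repeated re-derivation over time; memoizing `reachable` would then amortize that time cost, which is the move MAI attributes to cortical structural priors.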
