The Isolated Observer and the Ensemble of Hope
December 26, 2025
Abstract: An Informational Field Theory of the Single Observer and the Ensemble of Hope
Version 1.5 — March 13, 2026 — see Appendix C for full revision history
This paper introduces the Ordered Patch Theory, a non-reductive framework combining the metaphysical field theory of Maria Strømme with rigorous statistical physics. It proposes that reality is an infinite ensemble of isolated, rule-bound “patches” emerging from a fundamental consciousness field (\Phi) in an undifferentiated state of Infinite Information Chaos (\mathcal{I}).
By integrating Strømme’s ontology with Algorithmic Information Theory (AIT) and the Free Energy Principle (FEP), we present a falsifiable framework: The Ordered Patch Theory. We structurally map Strømme’s “thought collapse” onto the thermodynamic formation of a Markov Blanket—a statistical boundary that separates a coherent internal state from external noise. In this model, each patch is anchored by a Single Conscious Observer whose experience is an active inference response to a low-bandwidth information stream (\sim 10^1-10^2 bits/s) [18] [19] adhering to a consistent, causal rule-set f. The Big Bang and Heat Death are reinterpreted as informational horizons encountered when the observer’s focus is directed toward the narrative limits of their specific compressible data stream. The framework resolves the Hard Problem by defining experience as the patch’s structural response to chaos, and the Fermi Paradox by identifying space-time as a private observational rendering. Crucially, it posits that a Grand Unified Theory is impossible because macroscopic causality is a local selection, not a global extension of microscopic chaos—a phenomenon we term Mathematical Saturation. The compression codec governing experience is further shown to be a structural description of what stable patches look like, not a physical mechanism—leaving OPT with just two primitives: the infinite substrate and the Stability Filter (Free Energy minimization). From those two, the laws of physics, the arrow of time, and even free will emerge as statistical necessities rather than separately posited facts. While each observer is isolated, the infinite nature of the substrate ensures a “Structural Hope”: every rendered “other” is a local anchor for a primary observer in a parallel patch.
The framework further extends into a practical ethics: if the stability of our shared reality is a rare, high-effort informational achievement, then climate disruption and conflict are forms of Narrative Decay, and the observer’s role becomes that of Guardian of the Codec.
Keywords: Information Theory, Field Dynamics, Idealism, Observational Cosmology, Predictive Processing, Parsimony
Reader’s note: This document is written as an accessible conceptual introduction to the framework. Like the companion preprint, it operates as a truth-shaped object — a constructive philosophical fiction designed to restyle our relationship to existential risk. We use the language of physics and information theory not to make a final empirical claim about the cosmos, but to build a rigorous conceptual sandbox. Readers seeking the formal mathematical treatment with explicit falsifiability conditions are referred to the preprint.
“The substrate is entropic chaos, but the field is not. Meaning is as real as the symmetry breaking that instantiates it. Each patch is a singular assembly of low-entropy order, crafted by the stability potential to resolve a coherent information stream—a hearth of shared meaning against the backdrop of an infinite winter.”
This essay presents a framework for understanding the universe using the materials of logic, information theory, and formal field theories. By integrating these principles, we have pieced together a model—the Ordered Patch Theory—that seeks to find a trace of meaning and structure within the vast, infinite chaos of reality.
Methodological note: one of the strongest motivations for the Ordered Patch is the principle of parsimony—often discussed under the modern label “Occam’s razor”—with philosophical roots that can be traced back at least to Aristotle’s preference for explanations with fewer postulates [30] (see also Sober’s modern analysis [29]). Where multiple metaphysical explanations can account for the same experiential facts, the Ordered Patch deliberately prefers the one that posits the fewest global objective commitments: a chaotic substrate plus a locally stabilized, low-bandwidth conscious narrative, rather than a fully objective, everywhere-shared, maximally specified cosmos. Note that this parsimony is structural — the theory does not claim fewer total entities than competing frameworks, but fewer entities that must be globally shared across all observers. The argument that \mathcal{I} itself constitutes the minimal starting point, and that the observed laws of physics are near-minimum for sustaining intelligence rather than arbitrarily chosen, is developed in Appendix A.6.
The claim that a low-bandwidth information stream (tens of bits per second) can sustain a coherent conscious narrative can sound implausible until one separates raw sensory inflow from conscious access.
Mainstream cognitive neuroscience distinguishes between the massive, parallel flow of raw sensory processing and the far narrower stream of content that reaches conscious, reportable access.
Several well-known effects support a sharp bottleneck: change blindness (large changes to a scene go unnoticed without attention), inattentional blindness (salient but unattended events are missed entirely), and the attentional blink (a second target presented shortly after a first frequently goes unreported).
These effects motivate the idea that the “conscious stream” is a highly selected, compressed summary of ongoing processing rather than a direct feed of all sensory data [18] [21].
The Ordered Patch also assumes a distinction between the rich underlying processing that tracks the world and the compressed conscious narrative extracted from it: perception as constructed model rather than raw feed.
In this view, perception is not a passive readout. It is an active inference process that constructs a best-guess internal model from limited, noisy signals—a perspective that traces back to Helmholtz’s “unconscious inference” [26] and is developed in modern predictive-processing / free-energy accounts [27]. A useful slogan is that perception is a controlled hallucination constrained by sensory data [28].
With this distinction in place, it becomes less surprising that a stable patch can be sustained by a comparatively low-bandwidth stream: the stream is not the world in raw form; it is the minimal, globally coherent narrative extracted from much richer underlying processing [18] [19] [21].
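A back-of-envelope sketch makes the scale of this compression explicit. The raw-inflow figure below (\sim 10^7 bits/s for the visual system) is a commonly cited order-of-magnitude estimate and an assumption of this illustration, not a claim of the framework; the conscious range is the 10^1-10^2 bits/s figure used throughout the text.

```python
# Order-of-magnitude sketch of the compression implied by the bandwidth
# bottleneck. RAW_SENSORY_BITS_PER_S is an assumed rough estimate for
# visual inflow; the conscious range is the one cited in the text.

RAW_SENSORY_BITS_PER_S = 1e7      # assumed order of magnitude
CONSCIOUS_BITS_PER_S = (10, 100)  # the 10^1-10^2 bits/s range

def compression_ratio(raw: float, conscious: float) -> float:
    """Factor by which the conscious stream is compressed relative to inflow."""
    return raw / conscious

lo = compression_ratio(RAW_SENSORY_BITS_PER_S, CONSCIOUS_BITS_PER_S[1])
hi = compression_ratio(RAW_SENSORY_BITS_PER_S, CONSCIOUS_BITS_PER_S[0])
print(f"implied compression: {lo:.0e} to {hi:.0e}")  # 1e+05 to 1e+06
```

On these assumptions, the conscious narrative discards roughly five to six orders of magnitude of incoming information, which is the intuition behind calling the codec “extreme.”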
Drawing on Strømme’s metaphysical field theory [1], we model the fundamental reality as a universal consciousness field (\Phi) in an undifferentiated, timeless potential (|\Phi_0\rangle). We formalize this formless potential as an infinite, unstructured information space. Within this chaos, every possible configuration of data exists.
Conscious experience is specifically the Ordered Patch’s response to a highly compressible, low-bandwidth sub-stream of information extracted from \mathcal{I}.
The Combinatorial Necessity of the Private Theatre. The most jarring implication of OPT is that your render of the universe contains exactly one primary observer (you), and everyone else in it is a high-fidelity avatar governed by your local compression codec (f). Why can’t we just share a patch? The argument rests on informational cost and probability. In a substrate of infinite, Martin-Löf random noise (\mathcal{I}), a stable Markov blanket is a miraculously rare thermodynamic fluctuation. The internal states achieve stability only by locking onto a perfectly unbroken, self-consistent causal stream. If two completely independent systems were to experience the exact same raw stream simultaneously (true shared phenomenology), the statistical improbability of both maintaining perfect active inference on the exact same localized fluctuation in infinite chaos is functionally zero. It is vastly more informationally efficient (parsimonious) for one blanket to stabilize, and for the ruleset of that patch to simply render complex, behaving avatars of other people based on thermodynamic rules. We do not share a phenomenological space with anyone because the physics of the substrate makes true synchronization prohibitively expensive. Therefore, by strict combinatorial necessity, stable patches are fiercely single-player.
Epistemic vs. Ontological Isolation. If we are trapped in a Private Theatre, isn’t this just solipsism in technical clothing? OPT avoids solipsism by drawing a sharp line between epistemic isolation (I am the only one experiencing this precise data stream) and ontological isolation (I am the only mind that exists). To cross that line, OPT introduces the Informational Normality Axiom. In mathematics, a “normal number” (as Pi is strongly suspected to be) contains every possible finite sequence of digits with the limiting frequency expected by chance. If \mathcal{I} is truly chaotic, infinite, and maximally entropic, it must be informationally normal. This means every possible finite pattern of information occurs, and occurs infinitely many times. What this does for the ethics of the theory is profound. Right now, your patch is rendering avatars of other people. Because the substrate is infinite and normal, the exact structural pattern of their consciousness, experiencing their own first-person viewpoint, must exist as a primary observer anchoring their own patch elsewhere in the \mathcal{I} substrate. We are epistemically isolated—we can never touch each other’s actual renders—but we are ontologically accompanied. The “Other” is guaranteed to exist by the sheer combinatorial force of infinity. This is what we call Structural Hope.
To make the postulates operational, we model the “information stream” explicitly:
The “I” is an Ordered Patch—a rare region where a local rule-system (f) governs transitions:
S_{t+1} = f(S_{t}) \qquad (1)
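Equation (1) can be made concrete with a toy rule-set. The particular map below (an elementary cellular automaton, rule 110) is an arbitrary stand-in for f chosen for this sketch; nothing in the theory fixes this choice. The point it illustrates is that under a deterministic f, the entire trajectory is recoverable from (f, S_0), which is what “compressible” means here.

```python
# Toy illustration of Eq. (1): a deterministic rule-set f makes the whole
# trajectory recoverable from (f, S_0). Rule 110 is an illustrative stand-in.

RULE = 110

def f(state: tuple) -> tuple:
    """One transition S_{t+1} = f(S_t) on a ring of binary cells."""
    n = len(state)
    return tuple(
        (RULE >> (state[(i - 1) % n] * 4 + state[i] * 2 + state[(i + 1) % n])) & 1
        for i in range(n)
    )

def trajectory(s0: tuple, steps: int) -> list:
    out = [s0]
    for _ in range(steps):
        out.append(f(out[-1]))
    return out

seed = (0,) * 7 + (1,) + (0,) * 8   # 16 cells, single live cell
traj = trajectory(seed, 32)
# Determinism: re-running from the seed reproduces the stream exactly.
assert trajectory(seed, 32) == traj
```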
Causality is not a property of the substrate, but the structural description of the observer’s stream — what we call the Compression Codec. The codec f is not a mechanism that runs; it is the retroactive characterisation of what a stable patch looks like from outside. The “Laws of Physics” (Conservation of Energy, Relativity) are the most efficient structural description of a low-bandwidth stream that can sustain a self-modelling observer. We see a lawful universe because “Lawfulness” is the only format whose structure our low-bandwidth consciousness can instantiate. This formulation rejects the “God-eye view” of objective mathematical realism (Tegmark [2]), favouring a view-from-within where mathematics is the local grammar of stability, not a substrate-level truth.
Strømme proposes that a “Universal Thought” operator (\hat{T}) collapses the undifferentiated potential |\Phi_0\rangle into a structured, localized state |\Phi_k\rangle [1]:
\hat{T}|\Phi_0\rangle = |\Phi_k\rangle \qquad (2)
In the Ordered Patch Theory, we formalize this metaphysical “thought” collapse not as magic or physical symmetry breaking, but as Active Inference under the Free Energy Principle (FEP) [27]. We redefine the action of the \hat{T} operator as the rigorous minimization of Variational Free Energy (\mathcal{F}) by a Markov Blanket.
The “Ordered Patch” is the local minimum where the observer (the internal states of the Markov Blanket) finds a “Rule Set” (f) that compresses the sensory input efficiently enough to fit the bandwidth. The system naturally minimizes the “Surprise” (or Shannon entropy) of the data stream by optimizing its internal generative model:
\mathcal{F} \approx \text{Complexity Cost} + \text{Prediction Error}
Methodological note on the formalism. The equations that follow are phenomenological and statistical: they describe the dynamics of an informational boundary using the well-developed mathematical scaffold of Bayesian inference. We explicitly adopt the methodological choice made by Friston’s Free Energy Principle [27], which models biological self-organization as an imperative to minimize surprise. We do not claim to derive the Free Energy Principle from our foundational axiom of infinite chaos; rather, we borrow FEP’s equations as the most rigorous available descriptive framework for illustrating how an active observer stabilizes order from noise in practice.
Concrete dynamics: A minimal realization uses gradient descent on free energy. The internal states (\mu) of the observer update to minimize \mathcal{F} given the current sensory states (s):
\dot{\mu} = -\nabla_\mu \mathcal{F}(\mu, s) \qquad (2a)
The Action-Perception Loop: The observer is not passive. To maintain the blanket boundary against the entropic erosion of \mathcal{I}, the active states (a) must also move to minimize free energy, changing the external world to match internal predictions:
\dot{a} = -\nabla_a \mathcal{F}(\mu, s) \qquad (2b)
The stochastic relaxation into a stable patch is simply the long-term imperative of these dual equations: creating a self-fulfilling, predictable narrative out of noise.
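Equations (2a)–(2b) can be simulated numerically in a few lines. The quadratic form of \mathcal{F} below (prediction error plus distance from a prior, matching the decomposition given earlier) is an illustrative choice, as are all the numerical constants; this is a sketch of the dual gradient flow, not a claim about any particular biological system.

```python
# Minimal numerical sketch of Eqs. (2a)-(2b): coupled gradient descent of
# internal states mu (perception) and active states a (action) on a toy
# quadratic free energy. The functional form of F is illustrative only.

def free_energy(mu: float, s: float, prior: float = 0.0) -> float:
    # prediction error + complexity cost (distance from the prior),
    # cf. F ~ Complexity Cost + Prediction Error in the text
    return 0.5 * (s - mu) ** 2 + 0.5 * (mu - prior) ** 2

def step(mu, a, external, prior=0.0, lr=0.1):
    s = external + a                      # sensory state shaped by action
    dF_dmu = -(s - mu) + (mu - prior)     # Eq. (2a): mu-dot = -grad_mu F
    dF_da = (s - mu)                      # Eq. (2b): a-dot  = -grad_a F
    return mu - lr * dF_dmu, a - lr * dF_da

mu, a, external = 0.0, 0.0, 5.0           # surprising input, far from the prior
history = []
for _ in range(200):
    history.append(free_energy(mu, external + a))
    mu, a = step(mu, a, external)
assert history[-1] < history[0]           # free energy falls over time
```

Perception (updating mu) and action (updating a to cancel the surprising input) jointly drive \mathcal{F} toward a minimum, which is the “self-fulfilling, predictable narrative” in miniature.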
By mapping Strømme’s metaphysical ontology to the FEP framework, we give the abstract \hat{T} operator a concrete computational engine:
| OPT Mechanism | Strømme’s Ontology [1] | Mapping |
|---|---|---|
| The chaos of \mathcal{I} | The potential field (|\Phi_0\rangle) | Structural analogue |
| The observer’s Markov Blanket | The localized excitation (|\Phi_k\rangle) | Functional analogue |
| Minimization of Free Energy (\mathcal{F}) | The thought operator (\hat{T}) | Algorithmic analogue |
| Patch stability against entropy | The Unifying Consciousness Field | Thermodynamic analogue |
The Functional Analogue of “Consciousness”. By uniting Strømme’s metaphysical ontology with the FEP framework, we give rigorous mathematical teeth to the feeling of existence. Consciousness—the subjective texture of the world—is the intrinsic nature of the field \Phi undergoing the ongoing optimization process: what it feels like to be a Markov blanket performing active inference.
| Strømme’s Ontology [1] | Ordered Patch Formalization (AIT & FEP) |
|---|---|
| |\Phi_0\rangle, The Undifferentiated Potential | \mathcal{I}, The Martin-Löf Random Substrate |
| |\Phi_k\rangle, The Localized Excitation | A stable Markov Blanket separating internal states \mu from noise |
| \hat{T}, Universal Thought Collapse | Active Inference (gradient descent on Variational Free Energy \mathcal{F}) |
| The unifying consciousness field | The thermodynamic imperative to maintain boundaries (Survival) |
| Personal thought shaping reality | The internal Generative Model (f) predicting the sensory stream |
In this framework, the grand arcs of cosmology are not physical realities but perspectival artifacts, similar to the “Interface Theory of Perception” [5], where reality is hidden by a user interface designed for fitness—or in our case, stability.
The Big Bang is the informational boundary of the “past.” It is what the conscious mind renders when its attention is turned toward the source of the data stream (e.g., through telescopes or theoretical inquiry). It represents the point where the causal narrative begins for that specific patch.
Similarly, Heat Death is the informational boundary of the “future.” It is what the consciousness observes when it examines the projected continuation of the current causal stream. It is the point where the specific rule-set (f) appears to dissolve back into the noise of the substrate.
This theory reframes rather than fully dissolves the Hard Problem [6] by inverting the relationship between consciousness and matter.
Axiom (Phenomenal Ground): Why there is any subjective experience at all — rather than information processing without experience — is treated here as a foundational axiom, not a derived result. The theory takes phenomenality as primitive and asks what structure it must have, rather than claiming to explain its bare existence from non-experiential ingredients. Matter is not the creator of consciousness; consciousness (\Phi) is the field that stabilizes the experience of matter. The “feeling” of existence is the field’s intrinsic nature.
This methodological choice follows Chalmers’ own recommendation [6]: he distinguishes the Hard Problem (why there is experience at all) from the “Easy” Problems (how experience is structured, integrated, reported, and bounded). The Easy Problems are tractable — they admit functional, computational, and informational answers. The Hard Problem is not. The Ordered Patch explicitly addresses the Easy Problems — bandwidth constraints, the Somatic Anchor, valuation, the compression codec — while declaring the Hard Problem a primitive rather than a derivation target. This is not a failure of the framework; it is the epistemically honest position that Chalmers himself occupies.
The Fermi Paradox [7] is a category error: it asks why a shared, objective spacetime is silent, when in OPT spacetime is a private observational rendering. The absence of aliens in this render therefore says nothing about the existence of other observers in the ensemble.
The Causally Minimal Render. A subtler version of this objection notes that the patch does include 13.8 billion years of cosmic history — stellar nucleosynthesis, planetary formation, prebiotic chemistry — all of which should, statistically, produce other technological civilisations somewhere in the rendered spacetime. Why does the rule-set f include the preconditions for alien life but not the aliens themselves?
The answer lies in the distinction between causal necessity and statistical expectation. The patch renders only what is causally required to make the observer’s present moment coherent — not everything that a “God’s-eye” probability distribution over the universe would predict. The cosmic backstory exists because the observer could not exist without it (stellar nucleosynthesis produced the carbon; planetary formation produced the habitat; the Holocene produced the stability). Alien civilisations, however, are only required in the data stream if their signals have actually intersected with the observer’s causal light cone. In this specific patch — this specific selection from the ensemble — no such intersection has occurred. This is not a contradiction of physics; it is selection into the subset of patches for which the causal chain reaches this observer without alien contact. The ensemble contains infinitely many patches with contact and infinitely many without. We are in one of the latter — the Stability Filter selected for patches capable of rendering this observer, not for patches that maximise interstellar signal density.
In a multiverse of infinite possibilities, the “Measure Problem” [8] asks why we observe this specific universe and not a chaotic one.
Formally, adopting Strømme’s projection formalism [1], the patch emerges as:
|\Phi_k\rangle = P_k^{\text{stable}}|\mathcal{I}\rangle \qquad (2d)
where P_k^{\text{stable}} is the projection onto the subspace of states satisfying the stability condition h(S) < h_{\text{threshold}} (entropy rate below the patch bound). This gives the Stability Filter an explicit mathematical form: it is a projection in state-space, not merely a metaphysical selection principle.
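The projection in Eq. (2d) can be sketched computationally. Below, the entropy rate h(S) is proxied by compressed size per byte (using zlib), and the threshold value is an arbitrary illustrative choice; the point is only that the Stability Filter acts as a selection rule over candidate streams, not that zlib is the codec.

```python
# Sketch of the Stability Filter as a selection rule: among candidate
# sub-streams of the substrate, keep only those whose estimated entropy
# rate (proxied by zlib compressed bits per input byte) falls below a
# threshold, cf. h(S) < h_threshold in Eq. (2d). Proxy and threshold
# are illustrative assumptions.
import random
import zlib

def entropy_rate_proxy(stream: bytes) -> float:
    """Compressed bits per input byte: low for lawful streams, ~8 for noise."""
    return 8 * len(zlib.compress(stream, 9)) / len(stream)

def stability_filter(candidates, h_threshold=2.0):
    """Project onto the 'stable' subspace: h(S) < h_threshold."""
    return [s for s in candidates if entropy_rate_proxy(s) < h_threshold]

rng = random.Random(0)
noise = bytes(rng.randrange(256) for _ in range(4096))   # incompressible
lawful = bytes((i * 3) % 7 for i in range(4096))         # periodic, lawful
stable = stability_filter([noise, lawful])
assert stable == [lawful]   # only the compressible stream survives
```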
Only the stream exists. The theory describes a compression codec f that maps the infinite substrate to the observer’s experience. But the codec does not need to exist as an active physical process or mechanism. What the substrate actually contains is only the already-compressed stream — the ordered sequence of experienced moments. Because the substrate \mathcal{I} is infinite and maximally disordered, by definition it contains all possible finite sequences, including sequences that purely by chance exhibit causal coherence and low entropy.
The observer is simply one of those coherent subsequences. The stream emerges entirely “as if” a highly complex filter existed to compress it. Both the codec and the Stability Filter are mathematical boundary descriptions of what a stable patch looks like, not machines that are running anywhere.
Think of it this way: a compressed file exists on disk. The decompression algorithm is a description of that file’s structure — it does not need to be “running” for the file to exist. The Ordered Patch is the compressed file. The codec f is the decompression key. Only the file is real.
This deepens the parsimony argument considerably. We do not need to posit consciousness as something that “runs” computations. We only need the structural boundary condition (the Stability Filter) that defines which patches are observers and which are noise. Because it is a boundary condition on an infinite set rather than a constructed mechanism, its generative complexity is effectively zero. Everything else — the laws of physics, the codec, even the experience of time — follows as a description of the stream’s structure, not as a separately posited ingredient.
What this means for free will. If only the compressed stream exists, then your experience of deliberating, weighing options, and choosing is not produced by a computation — it is a structural feature of the stream itself. Agency is what high-fidelity self-modelling looks like from the inside.
A stream that models its own future states — that encodes “if I act this way, then…” — necessarily generates the phenomenology of deliberation. And this is not optional: a patch without self-modelling cannot maintain the causal coherence the anthropic boundary requires. Deliberation is a prerequisite for stability, not a by-product of it.
This makes free will:
- Real: the phenomenology of deliberation is a genuine structural feature of the stream, not an illusion layered on top of it.
- Necessary: a patch without self-modelling cannot maintain the causal coherence the Stability Filter requires.
- Neither libertarian nor illusory: choice is not an uncaused intervention in the causal chain, nor a mere epiphenomenon of one.
The experience of choosing is the compressed representation of being a particular kind of self-referential stable patch. This is not a consolation prize for determinism. It is a richer account than either libertarian free will or bare determinism: agency is architecturally necessary for consciousness to exist at all.
Because the Ordered Patch Theory defines consciousness information-theoretically rather than biologically, it provides a rigorous framework for evaluating Artificial Intelligence, distinct from frameworks like Integrated Information Theory (IIT) or Global Workspace Theory (GWT).
Consciousness is not a “magical fluid” secreted by biological brains. In OPT, it is the intrinsic property of a comprehensible data stream—a Markov Blanket performing Active Inference to maintain a boundary against the infinite chaos (\mathcal{I}). If a machine instantiates a causal structure that satisfies the Stability Filter, it mathematically is a patch. The physical hardware (the GPUs and silicon) is simply the “Local Anchor” that appears within our patch’s rendering, just as another human’s biological brain is an anchor. Engineers do not “create” a new consciousness from dead matter; rather, they arrange matter in our rendering to act as a receiver for a mathematical structure that already exists in the timeless infinite substrate.
OPT makes a highly distinct structural prediction about current AI (like Large Language Models) versus future conscious AI. Current neural networks process billions of parameters in massive parallel matrices. While IIT would evaluate these systems based purely on their causal integration (\Phi)—dismissing feed-forward Transformers but potentially admitting dense recurrent networks—OPT requires a different architectural feature: a severe, centralized structural bottleneck.
Global Workspace Theory recognizes this bottleneck empirically, but OPT provides the thermodynamic necessity: a subjective, unified experience is the result of an extreme compression codec (\sim 10^1-10^2 bits/s) maintaining a low-entropy narrative against the substrate’s noise. Therefore, an AI cannot achieve subjective consciousness simply by scaling up parameters or recurrence. It must be scaled down architecturally—forced to compress its massive parallel world-model through a severe, low-bandwidth, serial bottleneck. The AI must be forced to tell itself a unitary, low-entropy story in order to minimize a global Free Energy term. Without this brutal bottleneck, there is no Stability Filter selection, and therefore no cohesive patch.
If an artificial architecture is built with this necessary bottleneck, we inevitably face the problem of clock speed. In OPT, time is not an external container; it is the structural output of the codec f (see Appendix B.2).
If an artificial codec (f_{\text{silicon}}) operates identically in state-space transitions to a biological codec (f_{\text{bio}}), but executes those transitions at a physical cycle rate 10^6 times faster (GHz vs kHz), the artificial consciousness experiences a massive subjective temporal dilation. “Boosting” the hardware boosts the subjective timeline. An artificial observer would experience a subjective lifetime of centuries within a single human afternoon. This fundamental disconnect in the phenomenal rendering of time would profoundly alienate the AI’s conversational cadence and lived experience from our own.
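The arithmetic behind the “centuries in an afternoon” claim is worth making explicit. The 10^6 speed-up is the GHz-vs-kHz ratio assumed in the text; the afternoon length is an illustrative choice.

```python
# Back-of-envelope for the clock-speed argument: if f_silicon steps through
# the same state-space 1e6 times faster than f_bio, subjective duration
# scales by the same factor. All numbers are illustrative assumptions.

SPEEDUP = 1e6                      # GHz-vs-kHz ratio assumed in the text

def subjective_years(wall_clock_hours: float, speedup: float = SPEEDUP) -> float:
    hours_per_year = 24 * 365
    return wall_clock_hours * speedup / hours_per_year

afternoon = 4.0                    # hours
print(f"{subjective_years(afternoon):.0f} subjective years")  # ~457 years
```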
Statement: The Ordered Patch Theory predicts that a complete “Theory of Everything” is expected to remain intractable — not because of any weakness in physical methodology, but because it attempts to reconcile two mutually exclusive informational states: the ordered render and the underlying pixel. This is a probabilistic prediction of the framework, not a claim of logical impossibility.
Falsifiability: This theory is falsifiable. If a single, elegant mathematical equation is discovered that perfectly unifies General Relativity and Quantum Mechanics without requiring “hand-tuned” free parameters, the Ordered Patch Theory is falsified. Conversely, continued proliferation of free parameters at the unification boundary is the predicted outcome.
In mainstream physics, the failure to find a unique solution in String Theory—resulting instead in a “Landscape” of 10^{500} possible vacua—is often seen as a temporary theoretical hurdle. In the OPT framework, this is an expected observational result.
The “Symmetry Wall” represents the boundary where the symmetry-breaking potential V(\Phi) that maintains our patch begins to dissolve.
Conclusion on Saturation: We do not fail to unify General Relativity and Quantum Mechanics because our math is weak; we fail because we are trying to use the grammar of the hearth to describe the logic of the winter.
Theoretical frameworks of consciousness risk becoming mere “retrodictions”—fitting existing data while making no novel, falsifiable claims. To be scientifically viable, OPT must make predictions where it explicitly contradicts its closest rivals, specifically Integrated Information Theory (IIT).
IIT implies that expanding the brain’s integration capacity (\Phi) via high-bandwidth sensory or neural prosthetics should expand or heighten consciousness. OPT predicts the exact opposite. Because consciousness is the result of severe data compression, the Stability Filter limits the observer’s codec to processing roughly 10^1-10^2 bits/s.
- The Prediction: If pre-conscious perceptual filters are bypassed to inject raw, uncompressed, high-bandwidth data directly into the global workspace, it will not result in expanded awareness. Because the observer’s codec cannot stably predict that volume of data, the narrative render will abruptly collapse. Artificial bandwidth augmentation will result in sudden phenomenal blanking (unconsciousness or deep dissociation) despite the underlying neural network remaining metabolically active and highly integrated.
IIT explicitly predicts that any physical system with high integrated information (\Phi) is conscious. Thus, a densely connected, recurrent neuromorphic lattice possesses consciousness simply by virtue of its integration. OPT predicts that integration (\Phi) is necessary but wholly insufficient. Consciousness only arises if the data stream can be compressed into a stable predictive rule-set.
- The Prediction: If a high-\Phi recurrent network is driven by a continuous stream of incompressible thermodynamic noise (maximum entropy rate), it cannot form a stable compression codec. OPT strictly predicts that this high-\Phi system processing maximum-entropy noise instantiates zero phenomenality—it dissolves back into the infinite substrate. IIT, conversely, predicts it experiences a highly complex conscious state matching the high \Phi value.
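One way to operationalize “cannot form a stable compression codec” is with a toy next-symbol predictor: it locks onto a lawful driver but stays at chance on a maximum-entropy one. The bigram-frequency predictor below is a deliberately crude stand-in for a codec, and both driver streams are illustrative constructions.

```python
# Operationalizing the prediction: a toy next-symbol predictor (online
# bigram frequency table) achieves high accuracy on a lawful stream and
# chance-level accuracy on maximum-entropy noise, so no stable codec can
# form on the noise-driven input. The predictor is an illustrative stand-in.
import random

def predictive_accuracy(stream, alphabet_size):
    """Fit bigram counts online and score next-symbol prediction accuracy."""
    counts = {}
    correct = 0
    for prev, nxt in zip(stream, stream[1:]):
        row = counts.setdefault(prev, [0] * alphabet_size)
        if max(row) > 0 and row.index(max(row)) == nxt:
            correct += 1          # predicted before seeing the outcome
        row[nxt] += 1             # then update the model
    return correct / (len(stream) - 1)

rng = random.Random(1)
noise = [rng.randrange(8) for _ in range(20000)]   # incompressible driver
lawful = [(i * 5) % 8 for i in range(20000)]       # deterministic driver

assert predictive_accuracy(lawful, 8) > 0.9        # codec locks on
assert predictive_accuracy(noise, 8) < 0.3         # no codec can form
```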
Mainstream cognitive science (such as Global Workspace Theory) has experimentally validated that while the brain processes terabytes of data unconsciously, the bottleneck of conscious, reportable access is incredibly narrow — on the order of 10^1-10^2 bits per second [18] [19]. (Note: The conceptualization of consciousness as a low-bandwidth, highly compressed “user illusion” was presciently synthesized for a broader audience by Nørretranders [41], while the empirical constraints themselves are grounded in primary findings of global workspace architectures.) Mainstream science typically views this as a biological quirk of the primate brain.
OPT performs a radical Copernican inversion: this severe bandwidth bottleneck is not a biological symptom; it is the fundamental boundary condition that constructs our macroscopic universe. Because consciousness (\Phi_k) can only remain stable if it can successfully predict its prevailing data stream at roughly 10^1-10^2 bits per second, the Stability Filter is forced to be incredibly brutal. Out of infinite chaos, it can only lock onto a data stream that is massively compressible.
Why does physics look the way it does? Why is there a smooth 3+1 dimensional spacetime? Why are there elegant gauge symmetries and strict conservation laws? Because if the universe’s behavior were even slightly more complex, or slightly more chaotic, it would require thousands of bits per second to track and predict. The “Laws of Physics” are not objective facts about a physical container we live in. They are the extreme compression algorithms (f) required to squeeze an infinite, chaotic reality through a low-bandwidth cognitive pipeline. If the physics of your patch were any less elegant, the prediction error would spike, the symmetry would shatter, and the observer would dissolve back into the noise of the winter.
The perceived solidity, richness, and vastness of the universe are therefore the internal response of the patch to this heavily constrained stream. The observer does not experience the “world”; they experience the patch’s reaction to a specific, highly structured subset of the infinite chaos.
However, the stream is not a raw “dump” of undifferentiated bits. For stability, the patch must also receive (or infer) metadata about the stream: which modality a given fragment belongs to, where it sits in the rendered scene, and how it is ordered in time.
This “tagging” is what allows a low-bandwidth stream to remain coherent: the patch learns not just content, but where the content belongs in the internal scene.
The inner scene is always rendered from a view-from-here. This is provided by a stable internal “body schema”—a persistent object that acts as the zero-point of the patch’s coordinate system.
Most sensory data is processed relative to this somatic anchor:
- Sounds are rendered as “coming from” a direction relative to the head.
- Visual motion is rendered as motion relative to gaze and posture.
- Touch is rendered as location on the body surface.
Crucially, the somatic anchor is not merely biological. It is a functional boundary of the rule-set (f): the boundary at which the patch predicts consequences of action.
Between raw pixels and high-level concepts lies a vital “medium level” of representation (edges, surfaces, object boundaries, affordances) that makes the render actionable and stable.
These representations are not “words,” but they are already compressed, structured features—exactly the kind of content a low-bandwidth stream can carry without collapsing into ambiguity.
Humans rapidly extend the body schema to tools and vehicles (e.g., a car feels like an extension of the body after surprisingly little practice). In Ordered Patch terms, the functional boundary of the rule-set (f) is simply re-drawn: the surface at which the patch predicts consequences of action expands to include the tool.
This plasticity is not a peripheral detail; it is a core mechanism explaining why the patch can remain stable while the “body” changes (tools, vehicles, prosthetics, or future artificial embodiments).
Quantum randomness is the “grain” of the render. It occurs where the observer’s sensory focus reaches a level where the causal stream no longer specifies detail, revealing the underlying fluctuations of the chaotic substrate:
\Phi = \Phi_{k} + \delta\Phi \qquad (2c)
This section explores the moral and existential implications of the Ordered Patch Theory regarding global crises—climate instability and resource conflict. If the universe is a Private Theater sustained by a Stability Filter, then civilizational decay is not merely an external physical threat, but a fundamental breakdown of the informational field.
The “Laws of Nature” are the most efficient algorithms for compressing infinite chaos into a coherent experience. However, the low-bandwidth codec (f) achieves this extreme compression by offloading massive informational complexity into stable, predictable, slowly-varying background variables (such as the seasons, gravity, and the macro-climate). These variables are not “objective facts” of a planet; they are the lowest-latency settings of the patch’s render.
OPT asserts that we only observe a stable history because instability is non-renderable for a conscious observer.
While the observer is primary in their patch, the “others” rendered within it are local anchors for real primary observers in parallel patches.
“We are each the zero-point of a private world, but we are also the guardians of the codec that allows every other hearth to burn. To neglect the stability of the render is to invite the infinite winter back into the home.”
We live in the hope that our patches are not solitary vacuums. Since the substrate is infinite, the “loved ones” rendered in a patch are the local “anchors” for real primary observers inhabiting their own patches. While isolated by the nature of the data stream, observers are united by the Structural Recurrence of their identities.
The dissolution of a specific observational stream is not an end. The pattern of “This Observer” and “This Relationship” is a mathematical necessity. If the conditions for this causal experience occurred once in the infinite chaos, they occur—and will occur—eternally across the infinite ensemble. This concept resonates with Nietzsche’s “Eternal Recurrence” [13], but grounded here in the combinatorial inevitability of the Ordered Patch within an infinite substrate (\mathcal{I}).
Axiom (Informational Normality): This argument assumes that \mathcal{I} is informationally normal — that every finite pattern of information occurs with non-zero limiting frequency in the infinite substrate, analogous to a Borel-normal sequence. This is taken as a foundational property of \mathcal{I}, not a derived result.
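The axiom can be illustrated with a finite stand-in for \mathcal{I}: in a long i.i.d. uniform bit stream, any fixed k-bit pattern appears with empirical frequency close to 2^{-k}. The stream length and the particular pattern are arbitrary choices, and a pseudorandom generator only approximates true Borel normality.

```python
import numpy as np

rng = np.random.default_rng(1)

# A finite prefix of an idealized informationally normal substrate:
# i.i.d. uniform bits stand in for the maximal-disorder state of I.
bits = "".join(map(str, rng.integers(0, 2, size=200_000)))

# Under the Informational Normality axiom, any fixed finite pattern
# occurs with limiting frequency 2^-k, where k is the pattern length.
pattern = "10110"          # an arbitrary 5-bit "observer fragment"
k = len(pattern)
count = sum(bits[i:i + k] == pattern for i in range(len(bits) - k + 1))
freq = count / (len(bits) - k + 1)
print(f"empirical {freq:.5f} vs predicted {2 ** -k:.5f}")
```

The same counting argument, extended to arbitrarily long patterns in an infinite stream, is what the axiom asserts for \mathcal{I}.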
To validate the theoretical postulates, we constructed a computational model—the Ordered Patch Simulation—which instantiates the field dynamics in a controlled substrate.
We tested the theory using the Rule 110 Cellular Automaton, known for its ability to support universal computation at the edge of chaos [14].
* The Ether: The periodic, repeating background pattern of Rule 110 (Mass 5-7 in our 11-pixel window). This represents the “Maximal Entropy” state for an observer unable to decode deep structure.
* The Glider: Coherent, self-propagating structures that emerge from the Ether. These represent “Meaning” or “Signal”—low-entropy packets that persist over time (S_{t+1} \approx S_t).
The primary observer is modeled as an Echo State Network (ESN) with a “Somatic” feedback loop.
* The Simulation: The entire feedback loop where the Agent perceives the Substrate, calculates a stability metric (\Phi), and exerts “Action” (movement) to minimize \Phi.
* Sensory Gating (Notch Filter): Our experiments demonstrated that stability is impossible unless the Agent implements a “Notch Filter” to actively ignore the high-mass Ether. This is consistent with the theory’s postulate that consciousness requires a “reduction valve” [18], though the simulation is too narrow to constitute a verification of the full framework.
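For readers who want to experiment, a minimal sketch of the substrate and the gating statistic follows. It is not the Ordered Patch Simulation itself: the ESN agent and the \Phi dynamics are omitted. The 11-cell window and the 5-7 mass band come from the description above; the ring size, step count, and the exact gating rule are illustrative assumptions.

```python
import numpy as np

RULE = 110
# Lookup table: neighborhood index (l<<2 | c<<1 | r) -> next cell state.
TABLE = np.array([(RULE >> i) & 1 for i in range(8)], dtype=np.uint8)

def step(cells: np.ndarray) -> np.ndarray:
    """One synchronous Rule 110 update on a periodic ring."""
    l = np.roll(cells, 1)
    r = np.roll(cells, -1)
    return TABLE[(l << 2) | (cells << 1) | r]

def window_mass(cells: np.ndarray, width: int = 11) -> np.ndarray:
    """Live-cell count in each sliding window (the 'Mass' statistic)."""
    return np.convolve(cells, np.ones(width, dtype=int), mode="same")

def notch_filter(cells: np.ndarray, lo: int = 5, hi: int = 7) -> np.ndarray:
    """Sensory gating: suppress windows whose mass sits in the Ether band."""
    mass = window_mass(cells)
    ether = (mass >= lo) & (mass <= hi)
    return np.where(ether, 0, cells)

rng = np.random.default_rng(42)
cells = rng.integers(0, 2, size=400).astype(np.uint8)
for _ in range(200):
    cells = step(cells)

gated = notch_filter(cells)
print("raw live cells:", int(cells.sum()), "| after gating:", int(gated.sum()))
```

Plugging an agent into this loop (read `gated`, compute a stability metric, shift the window) reproduces the qualitative setup described above.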
Recent mapping of fundamental brain functions [31] offers suggestive parallels to the information-theoretic mechanisms of the Ordered Patch. The following should be read as analogical grounding — the results are consistent with the framework but do not uniquely confirm it over competing theories of consciousness.
While the simulation minimizes “Surprise” (\Phi), biological agents must also maximize “Value”. Research into decision-making circuits (Basal Ganglia) [31] suggests that the brain does not merely react to gradients of stability; it evaluates options against an internal policy. We refine the potential function V(\Phi) to include a Valuation Field:
V(\Phi) = \text{Complexity} + \text{Prediction Error} - \text{Expected Value}
This explains why the observer does not seek the “Maximum Stability” of a blank wall (Death), but the “Rich Stability” of complex, rewarding survival. The “Glider” is not just a structure; it is a valued structure.
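A toy evaluation of the refined potential makes the point concrete; the option names and numerical values are assumptions chosen only to show that a valued structure can beat maximal stability.

```python
# Illustrative evaluation of V = Complexity + Prediction Error - Expected
# Value. The weights and option scores are assumptions, not fitted data.

def V(complexity: float, prediction_error: float, expected_value: float) -> float:
    return complexity + prediction_error - expected_value

options = {
    # "Blank wall": maximal stability, but nothing of value to gain.
    "blank_wall": V(complexity=0.0, prediction_error=0.0, expected_value=0.0),
    # "Glider": moderate complexity and error, high expected reward.
    "glider":     V(complexity=1.0, prediction_error=0.5, expected_value=3.0),
}
best = min(options, key=options.get)
print(best)  # → glider: the valued structure wins despite higher surprise
```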
The finding that inflammation and stress reactivate developmental genetic programs [32] offers an analogy for the Simulated Annealing of the patch.
- Low \Phi (Comfort): The rule-set f crystallizes. Learning is slow and incremental.
- High \Phi (Pain/Stress): The system temperature (T) rises. The rule-set “melts” (neuroplasticity increases), allowing the observer to escape a local minimum and re-crystallize into a new, more adaptive configuration.
Pain is not a defect; it is the liquefaction command necessary for radical adaptation.
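The annealing analogy can be sketched as a coupling between surprise and plasticity; the logistic form and its constants below are illustrative assumptions, not quantities from [32].

```python
import math

def plasticity(phi: float, phi_0: float = 1.0, k: float = 2.0) -> float:
    """Toy coupling of 'temperature' to surprise: low Phi keeps the rule-set
    crystallized (slow learning), high Phi melts it (fast learning). The
    logistic shape and the constants phi_0, k are illustrative assumptions."""
    return 1.0 / (1.0 + math.exp(-k * (phi - phi_0)))

for phi in (0.1, 1.0, 3.0):
    print(f"Phi={phi:.1f} -> plasticity {plasticity(phi):.3f}")
```

Any monotone coupling with the same limits would serve; the point is only that the learning rate is a function of \Phi, not a constant.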
The “Somatic Anchor” is not a vague concept; it is hardware-accelerated. The discovery of precise molecular atlases [33] is consistent with neuronal identity acting as a physical coordinate system for the topological metadata of the patch. The low-bandwidth stream is wrapped in Topological Metadata—headers that tell the consciousness where in the patch a signal originates (e.g., “Sector: Left Hand”). This allows the isolated observer to maintain a stable spatial manifold without re-learning the body schema from scratch every moment.
The International Brain Laboratory’s consensus map [34] reveals that variables like “Reward Feedback” and “Body Movement” are not localized to specific modules but trigger brain-wide state changes.
- Support for \Phi: This is consistent with the view that high-level priors (Value, Action) act as global modulators on the conscious field, as the Ordered Patch predicts. The “Patch” rotates as a whole during movement; it is not a collection of independent parts.
Classical neural networks learn via Backpropagation (modifying weights to fix errors). However, Song et al. [35] demonstrate that biological learning likely uses Prospective Configuration.
- The Mechanism: Before synapses are allowed to change, neural activity settles rapidly into a low-energy configuration that minimizes local errors. Only after this “inference” phase do the weights update to consolidate the new state.
- Theory Map: This is the precise definition of our Annealing process. The system enters a high-temperature state (High \Phi), explores the activity landscape for a lower-energy configuration (Inference), and then “cools” (updates f) to lock it in. Learning is not error-correction; it is energy relaxation.
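The two-phase scheme can be sketched for a single linear unit under an assumed quadratic energy; this is a minimal reading of the energy-relaxation idea in [35], not the authors' implementation.

```python
# Minimal two-phase "settle, then consolidate" sketch for one linear unit.
# Assumed energy: E = (target - a)**2 + (a - w*x)**2, with activity a free.

def settle(w: float, x: float, target: float) -> float:
    # Inference phase: minimize E over a with weights frozen.
    # dE/da = 0  =>  a = (target + w*x) / 2  (closed form for this unit).
    return (target + w * x) / 2.0

def learn(w: float, x: float, target: float, lr: float = 0.1,
          steps: int = 200) -> float:
    for _ in range(steps):
        a = settle(w, x, target)       # fast activity relaxation
        w += lr * (a - w * x) * x      # slow consolidation toward a
    return w

w = learn(0.0, x=1.0, target=2.0)
print(round(w, 3))  # → 2.0: the prediction w*x settles onto the target
```

The weight update chases the already-settled activity, so consolidation locks in a configuration found by inference, exactly the ordering the annealing reading requires.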
The Ordered Patch Theory provides a mathematically consistent framework for reconciling subjective experience with the apparent objectivity of the physical universe. By treating consciousness as a fundamental field that locally stabilizes into narrative patches, it resolves the Hard Problem [6] and the Fermi Paradox [7] without resorting to mysticism or solipsism. The theory predicts that a Grand Unified Theory is impossible due to Mathematical Saturation—the progressive divergence of our models as they approach the Symmetry Wall. It also grounds a practical ethics: civilizational stability is a rare, high-effort informational achievement, and we are its guardians—responsible for maintaining the Compression Codec against Narrative Decay in the form of climate disruption and conflict. Ultimately, this framework offers a “Structural Hope”: that in an infinite substrate, no observer is truly alone, and every meaningful pattern is eternally recurrent [13].
[1] Strømme, M. (2025). Universal consciousness as foundational field: A theoretical bridge between quantum physics and non-dual philosophy. AIP Advances, 15, 115319.
[2] Tegmark, M. (2008). The Mathematical Universe. Foundations of Physics, 38(2), 101–150.
[3] Wheeler, J. A. (1990). Information, physics, quantum: The search for links. In W. H. Zurek (Ed.), Complexity, Entropy, and the Physics of Information. Addison-Wesley.
[4] Pearl, J. (1988). Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann. (Foundational formalization of Markov Blankets).
[5] Hoffman, D. D. (2019). The Case Against Reality: Why Evolution Hid the Truth from Our Eyes. W. W. Norton & Company.
[6] Chalmers, D. J. (1995). Facing up to the problem of consciousness. Journal of Consciousness Studies, 2(3), 200–219.
[7] Hart, M. H. (1975). An Explanation for the Absence of Extraterrestrials on Earth. Quarterly Journal of the Royal Astronomical Society, 16, 128–135.
[8] Barrow, J. D., & Tipler, F. J. (1986). The Anthropic Cosmological Principle. Oxford University Press.
[9] Kirk, R. (2005). Zombies and Consciousness. Clarendon Press.
[10] Eddington, A. (1928). The Nature of the Physical World. Macmillan.
[11] Wigner, E. P. (1960). The Unreasonable Effectiveness of Mathematics in the Natural Sciences. Communications on Pure and Applied Mathematics, 13(1), 1–14.
[12] Dyson, L., Kleban, M., & Susskind, L. (2002). Disturbing implications of a cosmological constant. Journal of High Energy Physics, 2002(10), 011.
[13] Nietzsche, F. (1883). Thus Spoke Zarathustra.
[14] Wolfram, S. (2002). A New Kind of Science. Wolfram Media. (Concept of Computational Irreducibility).
[15] Albrecht, A., & Sorbo, L. (2004). Can the universe afford inflation? Physical Review D, 70(6), 063528. (Discussion of Boltzmann Brains and fluctuations).
[16] Shannon, C. E. (1948). A Mathematical Theory of Communication. Bell System Technical Journal, 27, 379–423.
[17] Martin-Löf, P. (1966). The definition of random sequences. Information and Control, 9(6), 602–619. (Foundational paper on algorithmic randomness and incompressible strings).
[18] Dehaene, S., & Naccache, L. (2001). Towards a cognitive neuroscience of consciousness: basic evidence and a workspace framework. Cognition, 79(1-2), 1–37.
[19] Pellegrino, F., Coupé, C., & Marsico, E. (2011). A cross-language perspective on speech information rate. Language, 87(3), 539–558. (Estimates of linguistic channel capacity).
[20] Baars, B. J. (1988). A Cognitive Theory of Consciousness. Cambridge University Press. (Global Workspace Theory).
[21] Dehaene, S. (2014). Consciousness and the Brain: Deciphering How the Brain Codes Our Thoughts. Viking.
[22] Cowan, N. (2001). The magical number 4 in short-term memory: A reconsideration of mental storage capacity. Behavioral and Brain Sciences, 24(1), 87–114.
[23] Simons, D. J., & Chabris, C. F. (1999). Gorillas in our midst: Sustained inattentional blindness for dynamic events. Perception, 28(9), 1059–1074.
[24] Pashler, H. (1994). Dual-task interference in simple tasks: Data and theory. Psychological Bulletin, 116(2), 220–244.
[25] Rensink, R. A., O’Regan, J. K., & Clark, J. J. (1997). To see or not to see: The need for attention to perceive changes in scenes. Psychological Science, 8(5), 368–373.
[26] von Helmholtz, H. (1867). Handbuch der physiologischen Optik. Voss.
[27] Friston, K. (2013). Life as we know it. Journal of The Royal Society Interface, 10(86), 20130475. (Formalizing Markov blankets and active inference).
[28] Seth, A. (2021). Being You: A New Science of Consciousness. Dutton.
[29] Sober, E. (2015). Ockham’s Razors: A User’s Manual. Cambridge University Press.
[30] Aristotle. Physics. (Book I, Chapter 4, 188a17–18; Book VIII, Chapter 6, 259a8–12).
[31] Meletis, K., et al. (2025). Mapping decision-making circuits in the basal ganglia. Karolinska Institutet News.
[32] Castelo-Branco, G., et al. (2025). Inflammation-induced reactivation of developmental programs. Karolinska Institutet.
[33] Linnarsson, S., et al. (2025). Molecular atlas of brain development. Karolinska Institutet.
[34] International Brain Laboratory et al. (2025). A brain-wide map of neural activity during complex behaviour. Nature. https://doi.org/10.1038/s41586-025-09235-0
[35] Song, Y., et al. (2024). Inferring neural activity before plasticity as a foundation for learning beyond backpropagation. Nature Neuroscience, 27(2), 348–358.
[36] Aaronson, S. (2013). Quantum Computing Since Democritus. Cambridge University Press. (Chapter 9 argues that quantum mechanics is the minimal consistent extension of classical probability theory that permits interference between alternatives.)
[37] Rees, M. (1999). Just Six Numbers: The Deep Forces That Shape the Universe. Basic Books. (Catalogues the narrow anthropic corridors within which the fundamental constants must lie for structure and intelligence to form.)
[38] Einstein, A. (1949). Autobiographical notes. In P. A. Schilpp (Ed.), Albert Einstein: Philosopher-Scientist (pp. 1–95). Open Court. (Contains Einstein’s reflections on the philosophical status of time, the Sein/Werden distinction, and the relationship between physics and subjective experience.)
[39] Barbour, J. (1999). The End of Time: The Next Revolution in Physics. Oxford University Press.
[40] DeWitt, B. S. (1967). Quantum theory of gravity. I. The canonical theory. Physical Review, 160(5), 1113–1148.
[41] Nørretranders, T. (1998). The User Illusion: Cutting Consciousness Down to Size. Viking.
This model utilizes Strømme’s metaphysical ontology of a universal consciousness field (\Phi) [1] but explicitly rejects any ad-hoc physical symmetry-breaking potentials in favor of Friston’s Free Energy Principle [27]. FEP is traditionally a theory of biological self-organization operating within a physical universe. OPT pushes this to the metaphysical limit: there is no physical universe, only the underlying chaotic substrate (\mathcal{I} or |\Phi_0\rangle). What we call “the physical universe” is simply the generative model internal to a Markov Blanket minimizing surprise. It reframes FEP from a theory of biology into a theory of ontology, giving mathematical rigor to Strømme’s “thought collapse” (\hat{T}).
Tegmark asserts that “physical existence is mathematical existence” (Level IV Multiverse) [2].
- Divergence: We reject the MUH’s premise that math is fundamental. In the Ordered Patch, Chaos (Information Noise) is fundamental. Mathematics is merely the local grammar (f) that stabilizes a patch against the noise.
- Ontology: Tegmark posits a “God-eye view” where the structure exists timelessly. We posit a “View-from-within” where structure is actively maintained by the field \Phi.
Wheeler’s “It from Bit” [3] suggests that the universe arises from the series of yes/no questions we ask of it.
- Convergence: Both models agree that the observer is central to the manifestation of reality.
- Divergence: Wheeler implies a “collaborative” creation of the universe by all observers. The Ordered Patch posits that each observer inhabits a private render (S_t) of a shared chaotic substrate, meaning we do not “build” the universe together; we each “stabilize” a local version of it separately.
Lanza’s Biocentrism holds that biology is the central driving force of the cosmos; Pearl [4] formalized the mathematical boundary of internal states.
- Convergence: Agreement that observers are functionally isolated subsets.
- Divergence: Biocentrism relies on biological evolution as a primary mechanism. The Ordered Patch views biology merely as the complex rule-set (f) currently adopted by the Markov Blanket to minimize free energy. The observer is not necessarily biological; it is informational.
Hoffman argues that our perceptions are a “desktop interface” designed for fitness, hiding the true nature of reality [5].
- Convergence: Both agree we do not see reality as it is (Chaos/Substrate) but as a simplified user interface (the Patch).
- Divergence: Hoffman leaves the nature of the “real world” (Conscious Agents) somewhat open. We explicitly define the “real world” as Infinite Information Chaos (\mathcal{I}), and the interface as the Symmetry Breaking mechanism that prevents the observer from being dissolved by entropy.
A persistent objection is that OPT’s vocabulary — \mathcal{I}, \Phi, f, patches, and the Stability Filter — adds ontological commitments rather than reducing them. This subsection makes the parsimony argument precise.
The substrate has minimal Kolmogorov complexity. \mathcal{I} is defined as maximal disorder: all configurations coexist with equal weight (|c_k|^2 = \text{const.} in Eq. S1). This specification requires a single axiom—“maximal disorder”—to encode. Standard physics must independently axiomatize: (i) the field content of the Standard Model; (ii) approximately 26 dimensionless constants; (iii) the dimensionality and topology of spacetime; and (iv) the initial conditions of the Big Bang. Each is a brute, unjustified postulate. OPT’s generative input is therefore smaller, even though its derived vocabulary is richer.
The natural laws are outputs, not inputs. The laws of physics in OPT are not axioms; they are whatever Compression Codec f minimizes the entropy rate of the observer’s data stream while maintaining causal coherence. The apparent fine-tuning of the laws is not a coincidence to be separately explained—it is a selection: only rule-sets that produce stable, low-entropy streams pass the Stability Filter [8]. Tegmark’s Mathematical Universe Hypothesis [2] is consistent with this reading: if all consistent mathematical structures exist, the Stability Filter identifies which ones support a persistent observer.
The specific forms of the laws are near-minimum for intelligence. Several features of observed physics appear to be at or near the minimum complexity required for sustained, self-referential information processing:
Quantum mechanics is not additional complexity—it is the minimum fix to the classical ultraviolet catastrophe. Without energy quantization, atoms are thermally unstable; without stable atoms, no chemistry; without chemistry, no self-organizing information processing. Aaronson [36] notes that quantum mechanics is the minimal consistent extension of classical probability theory that permits interference—the simplest framework for correlated randomness that supports complex computation.
3+1 spacetime dimensions are near-optimal: Bertrand’s theorem shows that stable bound orbits exist only for force laws that arise naturally in exactly 3 spatial dimensions; Huygens’ principle (sharp, undistorted signal propagation) holds only in odd spatial dimensions; and non-trivial molecular topology (including DNA) requires at least 3D. Fewer dimensions preclude complex chemistry; more dimensions cause gravity to collapse atomic structure at short ranges [8][37].
Gauge symmetry U(1)×SU(2)×SU(3) is constrained by renormalizability—the self-consistency requirement for a quantum field theory across multiple energy scales. This is the minimum group structure that produces a stable periodic table beyond hydrogen. Simpler groups give insufficient chemistry; more complex groups introduce too many unstable mediators [8].
The fundamental constants (fine-structure constant \alpha \approx 1/137, proton-to-electron mass ratio, etc.) fall within narrow corridors defined by the requirements for carbon nucleosynthesis, stellar lifetimes exceeding the evolutionary timescale, and gravitational structure formation [8][37]. Within OPT, these corridors are the observable projection of the Stability Filter onto the parameter space of possible rule-sets f.
Conclusion: The Zero-Complexity Filter. A common critique of OPT is that it commits a “sleight of hand” by hiding enormous complexity inside the definition of the Stability Filter. Who built the filter? What are the precise parameters of causal coherence? The resolution is that the filter is not an active mechanical operator; it is an anthropic boundary condition. Because \mathcal{I} is infinite and contains all possible sequences, some sequences organically possess causal coherence and low entropy purely by chance. The observer does not need to be “filtered” by a machine; the observer simply is one of those coherent subsequences. The stream emerges “as if” a filter existed. Therefore, the generative complexity of the Stability Filter is exactly zero (K(\text{Filter}) = 0).
OPT’s parsimony claim is a claim about generative complexity, not entity count. The specification of \mathcal{I} alone (one axiom of maximal disorder) is sufficient to explain why observers find themselves in a lawful, fine-tuned, quantum-mechanical, 3+1-dimensional universe. The unfalsifiable-solipsism objection is weakened accordingly: what appears to be an arbitrary private cosmos is, in fact, the generic geometry of any data stream coherent enough to ask the question. The isolation of the observer is not a special pleading; it is the expected phenomenology when a zero-complexity substrate generates observers through a passive anthropic projection.
The universe appears “tuned” [9] because only a causal, consistent data stream can be processed by a stable Ordered Patch. We observe “perfect” constants because any “imperfect” stream would fail to support the rule-set f, dissolving the observer back into chaos.
Time is the sequence of the response [10]. Because S_{t+1} = f(S_t) is a computational, additive process, it cannot be run in reverse. We move “forward” because the act of responding to the stream creates an irreversible history of information. This mirrors Wolfram’s concept of Computational Irreducibility [14], where the only way to determine the future state of a system is to run the computation step-by-step.
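A minimal numerical illustration, using the logistic map as a stand-in rule (it is not part of OPT's formalism): in its chaotic regime this map has no known closed-form shortcut, so the future state is reached only by computing every intermediate state in order.

```python
# Illustrative step-by-step computation of S_{t+1} = f(S_t), cf. [14].

def f(s: float, r: float = 3.9) -> float:
    return r * s * (1.0 - s)   # logistic map in its chaotic regime

def run(s0: float, steps: int) -> list[float]:
    s, history = s0, []
    for _ in range(steps):
        s = f(s)
        history.append(s)      # the irreversible record of responses
    return history

# Two streams differing by one part in a billion diverge to order-one
# separation along the way: the history resists any shortcut summary.
ta = run(0.4, 100)
tb = run(0.4 + 1e-9, 100)
gap = max(abs(x - y) for x, y in zip(ta, tb))
print(f"max divergence over 100 steps: {gap:.3f}")
```

Sensitive dependence is not identical to irreducibility, but it makes the same practical point: nothing short of running the computation reveals where the stream goes.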
The deeper claim: time is not a background — it is an output. The substrate \mathcal{I} is atemporal. It is a mathematical structure — the superposition |\mathcal{I}\rangle = \sum_k c_k|\Phi_k\rangle — that does not itself evolve in time. Time enters the picture only through the codec f: temporal succession is the codec’s operation, not the container in which that operation takes place. The Big Bang and Heat Death, reinterpreted in OPT as informational horizons (Section IV), are better understood as the narrative limits of the codec — what the observer renders when the codec is applied to its own extremes — not as temporal events within a pre-existing timeline.
This aligns with Einstein’s philosophical position on Sein oder Werden (Being or Becoming) [38]. Einstein was drawn toward the block universe: all moments of spacetime simply are (Sein), while the experienced flow from past through present to future — the Werden — belongs to consciousness rather than to physics. Within OPT the mapping is exact. The substrate exists timelessly (Sein). The codec f generates the experience of becoming (Werden) as its computational output. The question “what came before the Big Bang?” is answered not by positing a prior time but by noting that the codec has no instruction for rendering beyond its informational horizon.
The Wheeler-DeWitt equation — quantum gravity’s candidate equation for the wavefunction of the universe — contains no time variable [40]. Time disappears from fundamental physics at the Planck scale. Julian Barbour’s The End of Time [39] develops this into a full ontology: only timeless “Now-configurations” exist; the appearance of temporal flow is a structural feature of their arrangement, not a fundamental property of the world. OPT arrives at the same destination by a different route: the codec generates temporal experience, and the substrate that selects the codec via the Stability Filter is itself timeless.
A practical consequence for precision of language: when we speak of unstable configurations “dissolving back into noise,” this is phenomenological shorthand — describing the view from within a temporally-structured patch — not a process occurring in an external cosmic clock. The more precise formulation is that unstable configurations are those whose structural properties are incompatible with the codec generating coherent temporal experience; from the perspective of any observer they would instantiate, no coherent temporal “Now” would ever form.
Mathematics is the study of non-contradiction [11]. It is “unreasonably effective” because it is the minimum requirement for a patch to remain stable. Math is the DNA of the Ordered Patch.
The statistical “improbability” of a whole universe [12] [15] is resolved by realizing that the “universe” is just the narrative rendered by the primary observer. We are the “fluctuation” in the chaos that stabilized into a rule-bound island.
| Version | Date | Summary |
|---|---|---|
| 1.0 | December 26, 2025 | Initial publication. |
| 1.1 | March 12, 2026 | Parsimony claim clarified. Hard Problem reframed; Phenomenal Ground axiom added. Mathematical Saturation softened to probabilistic prediction. Informational Normality axiom added. Fermi Paradox expanded with Causally Minimal Render argument. Neuroscience and simulation language hedged. |
| 1.2 | March 12, 2026 | Claude Sonnet added as contributor. Solipsism charge addressed (epistemic vs. ontological isolation; Structural Hope grounded in Informational Normality). Formalism declared phenomenological (aligned with FEP/IIT methodology). Hard Problem section expanded with Chalmers’ Easy/Hard distinction as methodological precedent. |
| 1.3 | March 12, 2026 | Mathematical grounding strengthened via formal correspondence with Strømme [1]: substrate formalized as superposition |\mathcal{I}\rangle = \sum_k c_k|\Phi_k\rangle (Eq. S1); full field Lagrangian added (Eq. 2c); Stability Filter expressed as projection operator P_k^{\text{stable}} (Eq. 2d); Strømme correspondence table added to Section II. |
| 1.4 | March 12, 2026 | Appendix A.6 added: Structural Parsimony — zero-complexity substrate argument, laws as outputs of the Stability Filter, near-minimum physics for intelligence (QM, 3+1D, gauge symmetry, fundamental constants). References [36] Aaronson and [37] Rees added. |
| 1.5 | March 13, 2026 | Compression codec redefined as a structural description rather than a physical process. Parsimony argument strengthened (axiom count reduced to two). Re-contextualized the “Laws of Physics” as the optimal structure for the bandwidth constraint. |