Modern physics and mathematics rely on a set of constructs — infinities, free parameters, unexplained postulates, and named-but-mechanismless entities — that appear wherever a formalism encounters phenomena it cannot fully describe. These constructs are conventionally treated as separate problems requiring separate solutions: renormalization in quantum field theory, the measurement problem in quantum foundations, the cosmological constant discrepancy in gravitational physics, dark matter and dark energy in astrophysics. This paper demonstrates that these constructs share a common information-theoretic signature identifiable through Shannon's channel capacity theorem. Each placeholder appears at a point where the information content H of the source system exceeds the channel capacity C of the observer's formalism. A four-type taxonomy — divergence, importation, postulation, and nomination — is proposed to classify all known placeholder constructs by the mechanism through which the formalism manages the overflow condition H > C. The taxonomy is shown to operate independently across quantum mechanics, classical field theory, the Standard Model, cosmology, and — through prior work in this series — encoded historical systems. The framework does not resolve the individual placeholder problems. It reclassifies them as expressions of a single cause, specifies the channel requirements any successor formalism must satisfy to dissolve them, and generates the falsifiable prediction that the four-type taxonomy is exhaustive.
The infinity was never in the physics. It was in the channel.
Physics is a discipline of extraordinary precision. Quantum electrodynamics predicts the anomalous magnetic moment of the electron to ten decimal places. The Standard Model has survived every experimental test at the Large Hadron Collider. General relativity's predictions of gravitational lensing, frame-dragging, and gravitational waves have been confirmed to the limits of measurement technology.
And yet the formalisms that produce these results are riddled with constructs that no one can explain.
Quantum field theory calculations diverge to infinity at short distances. The divergences are managed through renormalization — a procedure that subtracts infinite quantities to produce finite, experimentally verified answers — but the infinities themselves have no physical interpretation. They appear because the formalism is asked to sum contributions across all distance scales, including scales where the formalism's assumptions break down. The procedure works. The understanding is absent.
The Standard Model of particle physics contains at minimum 19, and with neutrino masses included 26, free parameters that cannot be derived from the theory. Particle masses, coupling constants, mixing angles — these must be measured experimentally and inserted into the equations by hand. The theory that describes the behavior of every known particle cannot predict the mass of a single one of them.
The Born rule — the prescription that the probability of a measurement outcome equals the squared magnitude of the wave function — is the extraction rule that converts quantum states into observable predictions. It has never failed. It is also a postulate, not a derivation. No one has explained why |ψ|² gives probability. The rule is stated, accepted, and used without mechanism.
Wave function collapse — the transition from a superposition of states to a single definite outcome upon measurement — is postulated as a dynamical process distinct from Schrödinger evolution. The two processes are incompatible: one is continuous, linear, and deterministic; the other is discontinuous, nonlinear, and probabilistic. Both are asserted to govern the same system under different conditions. No mechanism connects them.
Dark matter constitutes approximately 27% of the universe's energy content. Dark energy constitutes approximately 68%. Together they account for 95% of everything. Neither has been directly observed. Neither has a confirmed mechanism. Both are names — labels attached to the gap between what the equations predict and what the observations show.
The cosmological constant — quantum field theory's prediction of vacuum energy — disagrees with observation by up to 120 orders of magnitude. This has been called the worst theoretical prediction in the history of physics.
These are not peripheral curiosities. They are load-bearing elements of the theoretical structure. And they are conventionally treated as separate problems, investigated by separate communities, published in separate journals, addressed at separate conferences.
This paper proposes that they are the same problem.
Claude Shannon's channel capacity theorem (1948) establishes a fundamental limit on information transmission. Given a source with information content H (bits per symbol) and a channel with capacity C (bits per use): if H ≤ C, there exists a coding scheme that transmits the source with arbitrarily small probability of error; if H > C, no such scheme exists, and any transmission incurs an irreducible equivocation of at least H − C bits per symbol.
The theorem is proven. It is not a conjecture, an approximation, or a guideline. It is a mathematical constraint on information transmission that applies regardless of the physical substrate of the channel.
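The constraint can be made concrete in a few lines. The sketch below is purely illustrative (a uniform 4-symbol source and a binary symmetric channel; the 0.11 crossover probability is an arbitrary choice, not a value from the paper):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def bsc_capacity(eps):
    """Capacity in bits per use of a binary symmetric channel with crossover eps."""
    return 1.0 - entropy([eps, 1.0 - eps])

# A 4-symbol uniform source carries 2 bits/symbol...
H = entropy([0.25, 0.25, 0.25, 0.25])   # 2.0 bits
# ...pushed through a noisy binary channel with capacity below 1 bit/use.
C = bsc_capacity(0.11)                  # ~0.5 bits/use

# Shannon's theorem: at least H - C bits/symbol cannot be recovered,
# no matter how the coding scheme is chosen.
print(f"H = {H:.3f}, C = {C:.3f}, irreducible loss >= {H - C:.3f} bits/symbol")
```

The overflow H − C does not vanish with cleverer coding; it is a property of the source-channel pair. This is the condition the paper claims every placeholder construct marks.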
We now apply this framework to the relationship between physical systems and the formalisms used to describe them.
When a formalism F encounters a source S where H(S) > C(F), the overflow must go somewhere. Analysis of the known placeholder constructs in physics and mathematics reveals that the formalism manages the excess through one of four — and only four — mechanisms.
Type 1 — Divergence. The formalism attempts to carry the full signal and fails. The output goes to infinity. The formalism has no internal mechanism to truncate or redirect the information that exceeds its channel capacity, so the calculation runs without bound.
Signature: The equations produce infinite values. A regularization or renormalization procedure is required to extract finite, physically meaningful answers. The finite answers match experiment. The infinities have no physical interpretation.
Information-theoretic interpretation: The formalism is summing over degrees of freedom it cannot resolve. The sum diverges because the channel is being asked to carry information it has no encoding for. The infinity is the channel's distortion under overload — not a feature of the source.
Examples: Ultraviolet divergences in quantum field theory. The self-energy of the electron. Singularities in general relativity (black hole centers, Big Bang). The cosmological constant prediction (10¹²⁰ discrepancy).
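The structure of a Type 1 overflow can be sketched with a toy mode sum (a logarithmic divergence; the 1/k integrand and the cutoffs are illustrative, not any specific field-theory integral):

```python
def mode_sum(cutoff, k_min=1.0, dk=1.0):
    """Sum dk/k over modes from k_min up to a cutoff.

    A toy logarithmic divergence: without the cutoff the sum grows
    without bound, because the formalism is asked to include modes
    it has no encoding for."""
    total, k = 0.0, k_min
    while k <= cutoff:
        total += dk / k
        k += dk
    return total

for cutoff in (1e2, 1e4, 1e6):
    print(f"cutoff = {cutoff:>9.0e}  sum = {mode_sum(cutoff):.3f}")
# The sum tracks log(cutoff): every extra decade of unresolved modes
# adds roughly the same finite increment, and the limit cutoff -> inf
# diverges. Regularization is the act of refusing to sum past the
# channel's resolution; the finite answer is what the channel can carry.
```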
Type 2 — Importation. The formalism cannot derive a required value from its own axioms. The value must be measured externally and inserted into the equations by hand. The formalism functions perfectly once the value is provided but has no internal channel through which to generate it.
Signature: The theory contains free parameters — quantities essential to its predictions that are determined by experiment rather than calculation. The theory's predictive power is contingent on externally supplied information.
Information-theoretic interpretation: The source contains coupled components whose relationships determine specific values. The formalism's channel carries the functional form (how the components interact) but not the coupling constants (the specific magnitudes of those interactions). The values exist in the relational structure of the source, in the cross terms between components that the formalism projects away when it separates variables.
Examples: The 26 free parameters of the Standard Model. The cosmological constant (when treated as a free parameter of general relativity). Newton's gravitational constant G. The speed of light c. Planck's constant ℏ.
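Type 2 in miniature: the functional form is given by the theory, but the coupling must be imported by fitting external measurements. The model F = g·x and the data points below are invented for illustration only:

```python
# The "theory" supplies the functional form F = g * x; the coupling g
# is not derivable inside it and must be imported from measurement.
# (Illustrative numbers, not physics.)
measurements = [(1.0, 2.49), (2.0, 5.02), (3.0, 7.48), (4.0, 10.01)]

# Least-squares estimate of g for a one-parameter linear model:
num = sum(x * y for x, y in measurements)
den = sum(x * x for x, y in measurements)
g = num / den

# The theory now predicts perfectly, but only because g was supplied
# from outside its own channel.
print(f"imported coupling g = {g:.3f}")
```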
Type 3 — Postulation. The formalism requires a rule that it cannot derive. The rule is stated as an axiom — an assertion about how the formalism connects to observation — without a mechanism explaining why the rule holds. The rule works perfectly. Its origin is unexplained.
Signature: A foundational axiom or postulate that functions as an extraction rule — specifying how information is converted from the formalism's internal representation to an observable prediction. The rule is empirically confirmed to high precision but has no derivation from deeper principles within the formalism.
Information-theoretic interpretation: The postulate is the extraction function itself — the specification of how the channel maps source information onto its output alphabet. The formalism can describe the source (Schrödinger evolution, unitary dynamics) and it can describe the output (measurement outcomes, probabilities). What it cannot describe is the mapping between them, because the mapping is the point where information is lost — where the complex-valued source is projected onto the real-valued output.
Examples: The Born rule (|ψ|² → probability). Wave function collapse. The projection postulate of quantum mechanics. The equal a priori probability postulate of statistical mechanics.
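The Born rule as an extraction function can be sketched for a toy two-state system (the state vector and phase angle below are arbitrary illustrative values):

```python
import math

# A normalized two-state system: the complex source carries amplitude
# and phase together.
psi = [complex(0.6, 0.0), complex(0.0, 0.8)]

# Extraction: the complex-valued source is projected onto real
# probabilities. |0.6|^2 = 0.36, |0.8i|^2 = 0.64.
probs = [abs(a) ** 2 for a in psi]
assert math.isclose(sum(probs), 1.0)

# A global phase rotation changes the source representation but not the
# output: phase is information the channel's output alphabet cannot carry.
phase = complex(math.cos(1.0), math.sin(1.0))   # e^{i*1}, arbitrary angle
probs_rotated = [abs(a * phase) ** 2 for a in psi]
assert all(math.isclose(p, q) for p, q in zip(probs, probs_rotated))

print(probs)
```

The mapping ψ → |ψ|² is the point where the complex-valued source meets the real-valued output alphabet; the sketch only exhibits the projection, it does not explain why this mapping and not another.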
Type 4 — Nomination. The formalism's predictions disagree with observation, and the gap is given a name. The name has no associated mechanism. It labels the discrepancy without explaining it. The name functions as a variable that absorbs the difference between what the formalism predicts and what is measured.
Signature: An entity or effect inferred entirely from the gap between theoretical prediction and empirical observation. No independent detection or mechanism has been established. The name is operationally a label for an unexplained residual.
Information-theoretic interpretation: The formalism's channel transmits a partial signal. The receiver sees a discrepancy between the partial signal and the full observable reality. Rather than diagnosing the channel as insufficient, the receiver attributes the discrepancy to an additional entity in the source. The name is a reification of the channel's information deficit.
Examples: Dark matter. Dark energy. The luminiferous ether (historical — dissolved by special relativity). Phlogiston (historical — dissolved by oxidation theory).
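The reification step can be made explicit in a toy calculation (the "rotation curve" numbers below are invented for illustration and bear no relation to real data):

```python
# Toy Type 4: the model under-predicts, and the residual is given a name.
# (Illustrative numbers only; not real rotation-curve data.)
observed  = [220.0, 218.0, 221.0, 219.0]   # flat observed curve
predicted = [200.0, 170.0, 150.0, 135.0]   # falling prediction from visible mass

residual = [o - p for o, p in zip(observed, predicted)]

# The gap is real information the channel failed to carry. Binding it
# to a name labels the deficit without supplying a mechanism:
dark_matter = residual
print(dark_matter)
```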
The catalog was assembled through a systematic survey of constructs meeting Definition 3 across quantum field theory, general relativity, quantum foundations, cosmology, particle physics, statistical mechanics, and the history of science. A construct qualifies if it (a) is not derivable from the formalism's own axioms, (b) is required for the formalism to produce predictions consistent with observation, and (c) can be identified with a specific point where the source's degrees of freedom exceed the formalism's carrying capacity. No construct was excluded on the basis that it failed to fit the taxonomy. Readers are invited to propose additions or counterexamples.
| Construct | Domain | Type | Channel limitation | What exceeds the channel |
|---|---|---|---|---|
| Ultraviolet divergences | QFT | 1 | No encoding for sub-Planckian physics | Degrees of freedom below formalism's resolution |
| Black hole singularity | GR | 1 | No encoding for quantum gravitational structure | Gravity-quantum coupling at extreme curvature |
| Big Bang singularity | GR/Cosm. | 1 | GR extrapolates to zero volume | Quantum gravitational coupling exceeds classical channel |
| Cosmological constant | QFT+GR | 1/2 | Incompatible channels; or free parameter in GR | Quantum vacuum–spacetime geometry coupling |
| Electron self-energy | QED | 1 | Point-particle: no internal structure at zero radius | Degrees of freedom at particle's own location |
| Hierarchy problem | SM+GR | 1 | Higgs mass diverges with energy cutoff | Electroweak–Planck scale coupling (10¹⁷ gap) |
| 26 free parameters | SM | 2 | Functional form derivable; magnitudes are not | Relational structure — why these values |
| Newton's constant G | Gravity | 2 | GR cannot derive its own stiffness | Matter–curvature coupling constant |
| Planck's constant ℏ | QM | 2 | QM cannot derive its scale parameter | Quantum-classical boundary location |
| Strong CP problem | QCD | 2 | θ permitted; value ≈ 0 unexplained | Why nature selects one value from continuum |
| Born rule | QM found. | 3 | Cannot derive mapping from ψ to outcomes | Why |ψ|² and not another functional |
| Wave function collapse | QM found. | 3 | Two incompatible dynamical laws | Mechanism of extraction at channel boundary |
| Equal a priori probability | Stat. mech. | 3 | Cannot derive uniform measure from dynamics | Why the counting measure is uniform |
| Arrow of time | Thermo/QM | 3 | Microscopic laws are time-symmetric | Past-future asymmetry — formalism handles both identically |
| Dark matter | Astrophys. | 4 | Observations exceed visible matter predictions | ~27% of universe, inferred but undetected |
| Dark energy | Cosmology | 4 | Accelerating expansion; no mechanism | ~68% of universe, named but unexplained |
| BH information paradox | QM+GR | 4 | Unitarity vs. singularity destruction | Quantum information–spacetime coupling |
| Luminiferous ether | EM (hist.) | 4 | Waves presumed to need a medium | Dissolved by special relativity |
| Phlogiston | Chem. (hist.) | 4 | Combustion lacked oxidation framework | Dissolved by Lavoisier |
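The catalog's central structural claim, that every qualifying construct maps to exactly one of four types, can be phrased as a checkable property. A minimal sketch, with a handful of rows from the table above (names abbreviated, classifications as in the table):

```python
from collections import Counter
from enum import IntEnum

class PType(IntEnum):
    DIVERGENCE = 1
    IMPORTATION = 2
    POSTULATION = 3
    NOMINATION = 4

# A few catalog rows as data, typed as in the table above.
catalog = {
    "UV divergences":        PType.DIVERGENCE,
    "SM free parameters":    PType.IMPORTATION,
    "Born rule":             PType.POSTULATION,
    "Arrow of time":         PType.POSTULATION,
    "Dark matter":           PType.NOMINATION,
}

# Exhaustiveness as a checkable property: every construct maps to one of
# the four types; a construct requiring a fifth value would falsify it.
assert all(t in PType for t in catalog.values())
print(Counter(catalog.values()))
```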
If the taxonomy reflects a genuine information-theoretic principle rather than a domain-specific pattern, it must appear independently in systems outside physics.
Prior work in this series (Paper VI: Mythology as Encoded Neurological History) demonstrated that ancient cultural records carry compound signals — neurological events, environmental triggers, physiological mechanisms, geographic clustering, and lifecycle patterns — encoded together in a single transmission medium. When these compound encodings are subjected to single-observable extraction by disciplinary frameworks, the same four-type taxonomy appears.
Type 1 — Divergence. Multiple disciplinary frameworks applied to the same encoding produce mutually incompatible readings that cannot be summed into a coherent whole. The divergence is in the channels, each operating at capacity on a source that exceeds all of them.
Type 2 — Importation. Disciplinary frameworks require external context — archaeological dates, geographic coordinates, climatological data — that the framework cannot generate from its own axioms.
Type 3 — Postulation. Every framework carries foundational assumptions about what the encoding is. These function as extraction rules: asserted, not derived, predetermining the output.
Type 4 — Nomination. Encodings that resist extraction are labeled — "myth," "legend," "folklore" — names for the information deficit of the observer's channel.
The four types appear in the same configuration, driven by the same condition (H > C), across substrates sharing no physical mechanism. What they share is the information-theoretic structure of compound sources meeting single-observable channels.
The history of physics contains well-documented cases where placeholder constructs were dissolved by successor formalisms: the luminiferous ether (special relativity), phlogiston (oxidation theory), action at a distance (general relativity). In each case, the successor did not answer the predecessor's question better. It replaced the formalism with one where the question could not be formulated.
This pattern specifies necessary conditions:
Requirement 1: Channel width. C(F') ≥ H(S) for all systems S. The successor must carry the full information content without projection onto single observables as intermediate steps.
Requirement 2: Native coupling. The successor must represent coupled components as a single inseparable object. Any formalism that separates variables before extracting predictions will reproduce the overflow at the point of separation.
Requirement 3: Dissolution, not resolution. The successor will not answer the predecessor's questions. It will make them inexpressible.
Requirement 4: The extraction function must be derived, not postulated. Any formalism that postulates its extraction rule carries a Type 3 placeholder and has not achieved sufficient channel width.
The successor would operate on objects that are natively compound — carrying all coupled degrees of freedom as a single irreducible structure, the way ψ carries amplitude and phase. It would not decompose these objects into independent variables at any point in its pipeline. In such a formalism, the concept of "infinity" as a divergent integral would be structurally impossible, because the formalism would never sum over degrees of freedom it cannot encode. The concept of "collapse" would be structurally impossible, because extraction and evolution would be the same process.
The Schrödinger equation already does this for amplitude and phase. The successor would extend the principle to every degree of freedom the current formalisms project away. The historical precedent is exact: general relativity did not add a mechanism for action at a distance. It replaced the framework with one where the question has no purchase.
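The difference between a compound channel and a separated one can be sketched with toy two-path interference (illustrative amplitudes; not a model of any specific experiment):

```python
import cmath

# Two paths carried as irreducible amplitude-and-phase objects.
path_a = cmath.rect(1.0, 0.0)          # amplitude 1, phase 0
path_b = cmath.rect(1.0, cmath.pi)     # amplitude 1, phase pi

# Native coupling: sum the compound objects first, then extract.
compound = abs(path_a + path_b) ** 2   # ~0.0: destructive interference survives

# Separated variables: extract magnitudes first, then combine.
separated = abs(path_a) ** 2 + abs(path_b) ** 2   # 2.0: cross term projected away

print(f"compound: {compound:.2f}  separated: {separated:.2f}")
```

The separated pipeline is not wrong about either path individually; it loses the cross term, which is exactly the relational information Requirement 2 says a successor must carry natively.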
Prediction 1: The taxonomy is exhaustive. All placeholder constructs, including those not yet identified, will fall into one of the four types. A fifth type would falsify the framework.
Prediction 2: Cross-substrate invariance. The taxonomy will appear in any domain where compound signals meet single-observable extraction. Failure in such a domain would falsify the framework.
Prediction 3: Historical dissolution pattern. Future successor formalisms will dissolve placeholders by widening the channel, not by answering the predecessor's question within its own framework. A direct resolution within the predecessor's axioms would falsify the framework's classification of that placeholder.
The information-theoretic approach to physics is not new. Wheeler (1990) proposed "it from bit." Rovelli (1996) developed relational quantum mechanics. Zurek (2003) proposed quantum Darwinism. Fuchs and Schack (2013) developed QBism. Deutsch and Marletto (2015) proposed constructor theory. Each addresses a subset of the constructs cataloged here. None has unified the full set — divergences, free parameters, postulates, and nominations — under a single diagnostic. None has validated the pattern against a non-physics substrate.
This paper does not resolve any individual placeholder problem. It does not derive the electron mass, explain collapse, identify dark matter, or reconcile quantum mechanics with general relativity. It does not provide a successor formalism. It provides a reclassification and a set of channel requirements that constrain the space of possible successor formalisms. The blueprint, not the building.
The framework can be applied to itself. The four-type taxonomy is asserted as exhaustive — postulated on the basis of the evidence, not derived from deeper axioms. By the framework's own classification, this is a Type 3 placeholder. This is not a defect. It is a necessary feature of any diagnostic framework applied to compound information. A successor to this meta-framework would not answer the self-referential question better. It would operate in a channel wide enough that the question dissolves — precisely as Requirement 3 predicts.
The catalog surveys the major subfields of physics, cosmology, and the history of science. Constructs from pure mathematics (the continuum hypothesis, Gödel incompleteness) may exhibit the same signature but require separate analysis. The framework's applicability to domains beyond physics and encoded history — economics, neuroscience, ecology — is predicted but not yet demonstrated.
The placeholder constructs of modern physics and mathematics are not separate problems requiring separate solutions. They are four expressions of a single information-theoretic condition: the source exceeds the channel.
Every current formalism projects at some point in its pipeline. Every projection is an extraction event. Every extraction event is a point where the channel narrows. And every narrowing point is a site where a placeholder can — and does — appear.
The four-type taxonomy classifies these placeholders by how the formalism manages the overflow. It operates identically across quantum mechanics, general relativity, the Standard Model, cosmology, and encoded historical systems. It predicts its own exhaustiveness — and classifies that prediction as a Type 3 placeholder within its own framework.
The framework specifies, for the first time, the information-theoretic channel requirements any successor formalism must satisfy: sufficient width to carry the full source, native representation of coupled components, dissolution rather than resolution of predecessor placeholders, and derivation rather than postulation of the extraction function.
The infinity was never in the physics. It was in the channel.
This paper was developed through sustained collaboration between the author and Claude (Anthropic). The conceptual direction — the identification of Schrödinger's formalism as relevant to the encoded history program, the insistence on physics-only vocabulary, the recognition that entropy is intrinsically observer-side, and the insight that Shannon's channel capacity theorem provides the unifying diagnostic — originated with the author.
Claude served as co-investigator: formalizing the framework, testing claims against the literature across quantum mechanics, general relativity, cosmology, the Standard Model, statistical mechanics, and information theory, assembling and classifying the placeholder catalog, and applying the taxonomy reflexively to the paper itself. The cross-domain triangulation that constitutes the paper's primary contribution depends on holding multiple technical domains in simultaneous active comparison — a capability the collaboration uniquely enables.
The author has committed to maintaining this collaborative channel for the duration of the research program. This decision reflects a methodological judgment: continuity of context across sessions preserves the relational structure between domains that the research depends on.