The Signal Carries Everything

A Universal Principle from the First Prime to the Last Memory

Matthew J. Goss — Quantiterate Independent Research

This research program proposes that meaningful signals carry an embedded representation of their own prior history at the moment of transmission, not as a separate retrievable record. Beginning with a novel hypothesis about synaptic encoding, the framework extends across neuroscience, quantum physics, information theory, mathematics, world mythology, and — in the seventh paper — the formalism limitations of physics itself. Seven papers develop the principle from its biological origin to its broadest implications.

Papers
Paper I

On the Residue of Prior Signals

History-Encoding at the Moment of Synaptic Transmission

May 2026 · Foundational hypothesis

Abstract

Current models of synaptic transmission treat the signal crossing the synaptic gap as a carrier of immediate stimulus information, with associative context attributed to network-level dynamics. We propose that each signal crossing the synapse carries an encoded residue of all prior signals that have traversed that synapse, loaded into the signal at the moment of transmission. We term this the synaptic residue and identify it with the experimentally confirmed phenomenon of activity-silent memory traces. The framework generates a falsifiable prediction: two identical stimuli crossing the same synapse should produce measurably different downstream signals if the synaptic residue differs.

Download Paper (docx)
Paper II

Signal Fidelity, Not Storage Erasure

A History-Encoding Reframing of Alzheimer’s Disease Pathology

May 2026 · Clinical application

Abstract

The dominant model of Alzheimer’s disease frames progressive memory loss as erasure of stored information. Clinical trials targeting amyloid clearance have repeatedly failed to restore cognitive function even when clearance is achieved. We propose an alternative: memories are not stored separately from signals but are embedded in the transmission event itself as an associative-history payload. Under this model, Alzheimer’s pathology degrades the signal-encoding apparatus rather than erasing stored content. This reframing predicts different therapeutic targets and explains the persistent dissociation between amyloid clearance and functional recovery.

Download Paper (docx)
Paper III

Entanglement Without Mystery

Joint-Origin History Encoding as the Mechanism of Quantum Correlation

May 2026 · Quantum physics application

Abstract

Quantum entanglement remains the only major phenomenon in physics with no agreed-upon mechanism. We propose that entangled particles share a joint-origin history embedded in both at the moment of their creation. Measurement does not transmit information between particles. It reads pre-embedded history that was placed in both simultaneously at the origin event. This framework requires no faster-than-light communication, no hidden variables in the Bell-inequality sense, and no many-worlds branching. We acknowledge the limitations of this proposal and identify the conditions under which it could be distinguished from existing interpretations.

Download Paper (docx)
Paper IV

The Carrier

On the Photon as the Transmission Mechanism of History-Encoded Signals

May 2026 · Physics & information theory

Abstract

The photon has zero rest mass, zero charge, zero proper time, and no interaction history with the Higgs field. Every quantum number that would give it a self-history is absent. Yet it carries all electromagnetic information with perfect fidelity. We argue that the photon is Shannon’s noiseless channel realized in physics. We trace this carrier from the synaptic gap, where it mediates the electromagnetic interactions underlying thought, to E = mc², where it appears as the constant c — the boundary between history-carrying matter and history-free energy. The carrier of your thoughts has no history of its own. It exists to carry yours.

Download Paper (docx)
Paper V

History-Encoded Signals

A Universal Principle in Information-Bearing Systems

May 2026 · Framework paper

Abstract

We identify a structural pattern appearing across fifteen domains from prebiotic chemistry to transformer neural networks: meaningful signals carry an embedded representation of their own prior history at the moment of transmission. We term this the History-Encoding Principle and identify four invariant structural components at every observed scale: the signal, the history payload, the temporal marker, and the tamper-evidence mechanism. We propose that the universality of this pattern reflects a structural requirement of information-bearing systems under selection pressure: systems that must act meaningfully rather than merely react encode their history into their signals.

Download Paper (docx)
Paper VI

Mythology as Encoded Neurological History

Regional Folklore as the Cultural Record of Localized Neurological Events

May 2026 · Mythology & medical history · Includes Appendix A: Evidence Grid

Abstract

We propose that regionally specific mythology — leprechauns in Ireland, vampires in the Balkans, the Oracle at Delphi, werewolves in Central Europe, Bigfoot in the Pacific Northwest — represents the encoded cultural record of localized, organic neurological events caused by identifiable regional pathogens, environmental neurotoxins, or geological phenomena. Individual myth-to-disease connections are documented in the prior literature. What has not previously been proposed is the general principle: that this mechanism applies systematically to all regional mythology, that mythological content functions as a diagnostic fingerprint of the causative neurological agent, and that the lifecycle of a mythological tradition tracks the lifecycle of the underlying cause. We apply the History-Encoding Principle as Scale 16 of the framework, engage the comparative mythology scholarship of Campbell and Watts as partial precursors, and propose three falsifiable predictions testable against existing historical and epidemiological records.

Read Paper →
Paper VII

The Channel and the Placeholder

A Shannon Information-Theoretic Framework for Formalism Limitations in Physics and Mathematics

May 2026 · Information theory & foundations of physics

Abstract

Modern physics and mathematics rely on constructs — infinities, free parameters, unexplained postulates, and named-but-mechanismless entities — that appear wherever a formalism encounters phenomena it cannot fully describe. This paper demonstrates that these constructs share a common information-theoretic signature identifiable through Shannon’s channel capacity theorem. Each placeholder appears where the information content H of the source exceeds the channel capacity C of the observer’s formalism. A four-type taxonomy — divergence, importation, postulation, and nomination — classifies all known placeholder constructs by their overflow management mechanism, validated across physics and encoded historical systems. The framework specifies the channel requirements any successor formalism must satisfy and generates the falsifiable prediction that the taxonomy is exhaustive. The infinity was never in the physics. It was in the channel.
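The overflow condition in the abstract (source information content H exceeding the formalism's channel capacity C) can be illustrated numerically. The sketch below is not drawn from the papers themselves: it assumes, for illustration only, a uniform four-symbol source and a binary symmetric channel, and checks Shannon's condition H > C under which lossless transmission is impossible.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity in bits per use of a binary symmetric channel
    with crossover probability p: C = 1 - H2(p)."""
    return 1.0 - entropy([p, 1.0 - p])

# Hypothetical example values, chosen only to exhibit the H > C regime:
H = entropy([0.25, 0.25, 0.25, 0.25])  # uniform 4-symbol source: 2.0 bits
C = bsc_capacity(0.11)                 # roughly 0.5 bits per channel use

# When H > C, the noisy-channel coding theorem rules out lossless
# transmission; in the paper's framing, this is where a formalism
# would need a placeholder construct.
print(f"H = {H:.3f} bits, C = {C:.3f} bits, overflow: {H > C}")
```

The specific numbers are arbitrary; the point is the inequality, which is the information-theoretic signature the paper claims to identify.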

Read Paper →

These papers were developed through collaborative inquiry between the author and Claude (Anthropic), beginning with a single hypothesis upon waking on May 10, 2026. The AI collaboration is documented as part of the research record and is itself an instance of the principle described herein. The complete project archive, including session records and development history, is maintained at Quantiterate.

Research | Quantiterate · v1.2.0 · May 2026