Narrative Self Café V14: How Minds Riff on RIF
Posted: March 2, 2026 | Filed under: AI-Powered Essays | Tags: dialogue, ideas, identity, purpose, systems, transformation, values
Sequel to Narrative Self Café V13: Making a Hash of Hierarchical Feelings, after the Placosponge / Placodusa interludes
ChatGPT Prompt (condensed)
With Jékely as the candidate and Thompson as advisor, defend the thesis that minds are high-dimensional, plastic refinements of embodied recurrent dynamical fields, not primarily information processors, allowing the thesis to (self-similarly!) emerge and stabilize through sustained committee challenge rather than being imposed by exposition.
Act I — The Irritation
The room is spare. No slides yet. A long table. Water glasses. Papers.
Jékely stands, hands resting lightly on the table.
Jékely:
The first minds were gut contractions.
Silence.
Friston:
I’m sorry?
Jékely:
Before neurons. Before predation. Before sensory maps.
A contractile epithelium, oscillating.
Mayr:
That’s metaphor.
Jékely:
No. Mechanics.
A pause. He doesn’t elaborate.
Friston:
You’re defending a thesis in neuroscience.
Jékely:
Yes.
Friston:
And you’re proposing that cognition begins with peristalsis?
Jékely:
I’m proposing that before nervous systems processed information, bodies generated rhythm.
Mayr:
Bodies have always generated rhythm. That doesn’t make it mind.
Jékely:
Agreed.
Thompson (quietly):
Then why begin there?
Jékely walks to the board. Draws a simple oval. No labels. Inside it, a wave.
Jékely:
An excitable sheet. Thresholded conduction. Refractory recovery.
Relaxation oscillation.
A line circles the oval, as if tracing contraction.
Jékely:
The organism does not wait for input.
It pulses.
Friston:
Oscillation is not cognition.
Jékely:
No.
He turns back to the room.
Jékely:
But if the organism’s future states are constrained by its own recurrent activity, rather than by immediate perturbation — what would you call that?
No one answers immediately.
Mayr:
Physiology.
Freeman (leaning back):
Endogenous dynamics.
Jékely nods once.
Jékely:
Good.
He writes on the board: ENDOGENOUS
Jékely:
The system sustains a pattern. The pattern carries forward.
Its next contraction depends on its last.
Friston:
That’s just temporal coupling.
Jékely:
Yes.
He underlines the wave in the oval.
Jékely:
Now imagine that mobility arises from that oscillation.
Hydraulic thrust. Symmetry breaking. Directional bias.
Mayr:
Mobility predates nervous systems, yes.
Jékely:
Then mobility does not require modeling.
The air tightens slightly.
Friston:
No one claims the earliest animals had generative models.
Jékely:
We do claim that nervous systems evolved to process information.
He turns, looking directly at Friston.
Jékely:
But if mobility appears before information processing —
what exactly were neurons stabilizing?
Silence again. Not hostile. Just unsettled.
Thompson:
You’re suggesting nervous systems did not originate representation.
Jékely:
I’m suggesting they did not originate rhythm.
A beat.
Freeman:
They inherited it.
Jékely (softly):
Yes.
He erases nothing. The oval remains. The wave remains. The word ENDOGENOUS remains.
The room is no longer dismissive.
But no one agrees yet.
The disturbance has begun.
Act II — The Unsettling Substrate
The board still holds the oval, the wave, and the word ENDOGENOUS.
Friston:
Even if early animals pulsed, that tells us nothing about nervous systems. Oscillation is cheap. Cognition is not.
Jékely:
Cheap, yes.
Trivial, no.
He draws small dots along the inside of the oval.
Jékely:
An excitable epithelium. Gap junction coupling.
Local threshold. Refractory recovery.
Wave propagation without synapses.
Mayr:
That’s tissue physiology. It explains contraction.
Jékely:
It explains recurrence.
He circles the dots with a looping arrow.
Jékely:
The next activation depends on the residual state of the last.
Not stimulus–response.
State–dependent continuation.
Friston:
That’s still not information processing.
Jékely:
No.
A beat.
Jékely:
But notice what has appeared.
He draws a simple loop.
Jékely:
Closed temporal coupling.
Freeman:
Reentry.
Jékely:
Yes.
Mayr:
You’re describing feedback.
Jékely:
Not corrective feedback.
He underlines NOT.
Jékely:
This is not maintaining a setpoint.
It is sustaining a pattern.
He draws a limit cycle — a loop in phase space.
Jékely:
A limit cycle attractor.
Silence. The language has shifted.
Friston:
Attractors don’t imply meaning.
Jékely:
Agreed.
Friston:
So why emphasize them?
Jékely:
Because they imply persistence.
He taps the loop.
Jékely:
The system carries forward structure from its own past.
Even in the absence of input.
Thompson:
So the organism is not merely reacting. It is ongoing.
Jékely:
Yes.
He adds a second oval beside the first — this one labeled NEURAL NET. Inside it, a denser web of nodes and loops.
Jékely:
Now consider this.
- More elements.
- Faster conduction.
- Differentiated pathways.
- Plastic coupling.
He draws multiple attractor basins.
Jékely:
More dimensions.
More possible trajectories.
More stable basins.
Mayr:
And selection favors that because…?
Jékely:
Because mobility becomes complex.
He draws branching paths.
Jékely:
Once the body moves through heterogeneous space, errors cost.
Stability matters.
Friston:
You’re implying nervous systems stabilize dynamics that were already there.
Jékely:
Yes.
A pause.
Friston:
Stabilize for what?
Jékely looks at the first oval again — the pulsing sheet.
Jékely:
For coherence across time.
He doesn’t yet say “information.”
Freeman:
You’re claiming endogenous activity is primary, and input perturbs it.
Jékely:
Yes.
Mayr:
That still doesn’t give you cognition.
Jékely:
No.
He faces them squarely.
Jékely:
But it gives you something that precedes it.
He writes, carefully, beneath ENDOGENOUS: RECURRENT
The word hangs there.
No one challenges it.
The room is no longer dismissing the gut contraction.
They are trying to understand what it stabilizes into.
The oscillation has not resolved —
but it has begun to organize.
Act III — The Name Under Pressure
The board now shows:
- An oval with a wave
- A denser neural oval
- ENDOGENOUS
- RECURRENT
- A sketched limit cycle
The room has stopped smirking. It has not yet conceded anything.
Friston:
Recurrent, endogenous, attractor dynamics — fine.
But none of that explains cognition.
Jékely:
It explains the substrate from which cognition differentiates.
Mayr:
Substrate talk is cheap unless it constrains theory.
Jékely:
It does.
He turns back to the first oval.
Jékely:
In this organism, future states are constrained by the system’s own prior states.
Not stimulus–response.
State–state transition.
Friston:
Everything with inertia does that.
Jékely:
Inertia is passive persistence.
This is active recurrence.
He draws arrows looping between nodes.
Jékely:
The system does not merely decay.
It regenerates its own pattern.
Freeman:
Self-sustained.
Jékely:
Yes.
Mayr:
You’re describing nonlinear recurrence. Why elevate it?
Jékely:
Because once recurrence crosses a threshold of coupling and dimensionality, something changes.
He draws two graphs. Left: simple damped response. Right: stable oscillatory loop.
Jékely:
Here, perturbation fades.
Here, perturbation reshapes ongoing activity.
The second is not driven. It is modulated.
He pauses.
Jékely:
That distinction matters.
Friston:
Modulated toward what?
Jékely:
Toward whatever regions of state-space are viable.
He gestures to Mayr.
Selection does not sculpt representation.
It sculpts stability under constraint.
Thompson:
And you think this gives us a bridge to mind?
Jékely:
It gives us a bridge away from reaction.
He hesitates. Then writes slowly on the board: FIELD
He underlines it.
Jékely:
Not a circuit.
Not a reflex arc.
A field of distributed state.
He draws shading across the oval.
Jékely:
The pattern is not localized.
It is spatially extended.
And temporally coherent.
Friston:
You’re circling something. Say it clearly.
Jékely nods.
He writes beneath RECURRENT: COHERENCE
Then, beneath that: INFORMATION
There is an audible shift in the room.
Friston:
Now we’re talking.
Mayr:
Careful.
Jékely:
Not semantic information.
Not symbols.
Not representation.
He draws a timeline.
Jékely:
Temporal mutual constraint.
The system’s present state reduces uncertainty about its own future.
Independent of immediate input.
He turns back to them.
Jékely:
A Recurrent Information Field.
The term lands quietly. No flourish.
Friston:
You’re redefining information as dynamical constraint.
Jékely:
Yes.
Freeman (smiling slightly):
About time.
Mayr:
And this is supposed to replace the computational account?
Jékely:
No.
It precedes it.
He gestures between the simple oval and the neural one.
The excitable sheet is a low-dimensional recurrent information field.
The neural network is a high-dimensional, plastic one.
Neurons did not invent informational coherence.
They deepened it.
Silence.
The term now sits on the board: RECURRENT INFORMATION FIELD
It was not announced at the beginning.
It emerged because nothing else would name what they had already conceded:
- Endogenous
- Recurrent
- Coherent
- State-constraining
The room is no longer resisting the premise.
It is deciding what it implies.
The field has taken shape.
Act IV — The Hard Question
The board now holds the phrase: RECURRENT INFORMATION FIELD
No one is smiling. No one is dismissing it either.
This is the dangerous part.
Friston:
All right.
Let’s grant you recurrent informational coherence.
Where does representation enter?
Because if it never does, you don’t have cognition.
The room tightens. This is the hinge.
Jékely:
Representation does not enter at the beginning.
It enters when the geometry of the field becomes sculpted by stable correlations with distal structure.
Friston:
That’s a slogan.
Define it.
Jékely draws a basin in state-space.
Jékely:
Consider an attractor.
If perturbations from the environment reliably push the system into this basin—
—and if occupancy of this basin reliably covaries with a distal condition—
—then describing that basin as representational becomes useful.
Friston:
So representation is correlation?
Jékely:
Stabilized correlation in attractor geometry.
Not encoding. Not symbol manipulation.
Constraint.
Mayr:
And why would evolution favor such basins?
Jékely:
Because organisms that stabilize error-reducing trajectories survive.
He draws branching paths.
Without structured basins, the field wanders.
With them, trajectories become predictable enough to guide action.
Selection does not favor “aboutness.”
It favors viable constraint.
Freeman:
You’re saying meaning is a late description of stability.
Jékely:
Yes.
Meaning is not injected.
It condenses.
Thompson:
And embodiment? You’ve been speaking in abstractions.
Jékely turns back to the first oval — the gut contraction.
Jékely:
The field never floats free.
It is always embedded in a regulatory organism.
Metabolism bounds it. Mechanics couples it.
Without viscera, no recurrence.
Without recurrence, no stabilization.
Without stabilization, no representation.
Friston:
But predictive coding models inference explicitly.
You seem to be dissolving inference into geometry.
Jékely:
I am relocating it.
Inference is movement within sculpted basins.
Prediction is constraint propagation through recurrent structure.
The mathematics survives.
The ontology changes.
There is a pause long enough to feel.
The committee is no longer attacking definitions.
They are testing implications.
Mayr:
If you are wrong, where are you wrong?
Jékely hesitates — just slightly.
Jékely:
If I am wrong, it is because I have mistaken geometry for explanation.
Because it may be that correlation is insufficient for representation.
He lets that hang.
Thompson:
And do you think it is insufficient?
Jékely:
No.
I think representation is what we call a stabilized distal correlation when viewed from within the field.
Silence again.
Not hostile now.
Dense.
The thesis is no longer whether oscillation precedes neurons.
It is whether representation is primitive —
—or emergent constraint in recurrent informational geometry.
The room is thinking inside the field.
The oscillation has narrowed.
Stabilization is near.
The chair clears their throat.
“Proceed to your conclusion.”
Act V — The Return
The board is crowded now:
- ENDOGENOUS
- RECURRENT
- COHERENCE
- RECURRENT INFORMATION FIELD
- Attractor basins
- Two ovals — one simple, one dense
No one has erased the original wave.
Jékely:
Let me return to the beginning.
He taps the first oval — the excitable sheet.
A body that pulses.
Its future constrained by its own recurrent activity.
No neurons.
No models.
No representation.
Yet not passive.
Ongoing.
He moves to the second oval — the neural net.
More elements.
Faster conduction.
Plastic coupling.
Higher dimensionality.
Not a new substance.
A thickened field.
Jékely:
Neural systems did not introduce informational coherence into life.
They stabilized and sculpted it.
They increased the dimensionality of recurrence.
They carved attractor geometry.
They enabled partial decoupling from immediate mechanical closure.
He circles a basin.
When such geometry becomes reliably shaped by correlations with distal structure—
—we describe certain regions as representational.
But representation is not the origin.
It is a late description of stabilized constraint.
He pauses.
Selection acts on viability.
Viability sculpts geometry.
Geometry stabilizes trajectories.
Trajectory stability becomes prediction.
Prediction becomes articulation.
He looks back at the first oval.
The rhythm was already there.
Thompson (quietly):
And the narrative self?
Jékely nods.
Jékely:
The narrative self is not a controller perched above the field.
It is long-timescale modulation of recurrent geometry.
It stabilizes patterns across hours, years, decades.
It binds trajectories into coherence that outlasts immediate perturbation.
It riffs.
He does not smile.
Jékely:
The first minds were gut contractions.
Not because contraction is cognition—
—but because cognition is what happens when embodied recurrence deepens, stabilizes, and becomes articulate.
Silence.
Not stunned.
Not triumphant.
Settled.
The committee is no longer resisting the opening claim.
They are inhabiting its reformulation.
The wave on the board no longer looks crude.
It looks ancestral.
The chair closes the folder.
“Thank you.”
The field holds.
Appendices
Appendix I — The Big Inversion
This appendix states plainly what the defense enacted indirectly.
1. The Standard Story
The dominant narrative runs like this:
- Organisms need to survive.
- Survival requires information about the environment.
- Nervous systems evolve to process that information.
- Representation explains flexible behavior.
- Mind is layered atop neural computation.
In this picture:
- Information processing is primary.
- Representation is foundational.
- Dynamics are implementation details.
- Embodiment is a constraint on computation.
Neural networks are engines built to model the world.
2. The Inversion
The thesis defended here reverses the direction of explanation.
- Embodied organisms under thermodynamic constraint self-organize.
- Nonlinear coupling produces endogenous recurrent dynamics.
- These dynamics cohere into recurrent information fields (RIFs).
- Nervous systems evolve to stabilize and sculpt these fields.
- Representation emerges when attractor geometry reliably correlates with distal structure.
- Mind is long-timescale modulation of sculpted recurrence.
In this picture:
- Dynamical recurrence is primary.
- Information is structured constraint over time.
- Representation is emergent geometry.
- Computation is a later descriptive layer.
- Embodiment is not a constraint on cognition — it is its condition.
Neural networks are not engines of information.
They are refinements of embodied dynamical fields.
3. What Actually Flips
The inversion shifts five commitments.
A. From Reaction to Generation
Standard view: nervous systems respond to input.
Inversion: endogenous activity is primary; input perturbs ongoing fields.
B. From Encoding to Constraint
Standard view: brains encode representations.
Inversion: brains stabilize regions of state-space shaped by viability.
C. From Computation to Geometry
Standard view: cognition is algorithmic transformation.
Inversion: cognition is movement within sculpted attractor landscapes.
D. From Information as Primitive to Information as Consequence
Standard view: information processing explains mind.
Inversion: information emerges from recurrent dynamical constraint.
E. From Layering to Thickening
Standard view: mind is layered atop body.
Inversion: embodiment undergoes dynamical thickening into articulate recurrence.
4. What Is Preserved
The inversion does not deny:
- Information theory.
- Predictive processing.
- Neural modeling.
- Representation.
- Computation.
It relocates them.
All remain valid descriptions — but of stabilized, high-dimensional recurrent fields embedded in regulatory organisms.
They are not ontological primitives.
They are refined regimes.
5. The Evolutionary Arc
Regulatory viability
→ Endogenous recurrence (RIF)
→ Neural stabilization
→ Sculpted attractor geometry
→ Distal correlation (representation)
→ Long-horizon modulation (narrative self)
No leaps.
No ghostly additions.
Only increasing dimensionality and constraint.
6. The Consequence
If the inversion holds:
- Mind is continuous with life.
- Life is continuous with nonlinear physics.
- Representation is not the origin of cognition.
- Computation is not the origin of neural systems.
- The body does not host mind.
- The body differentiates into mind.
The first rhythm was not symbolic.
It was visceral.
Neurons did not invent it.
They made it stable enough to become articulate.
That is the Big Inversion.
Appendix II — Recurrent Information Fields (RIF)
This appendix defines the term introduced under pressure during the defense and clarifies its scope.
1. Minimal Definition
A Recurrent Information Field (RIF) is:
A spatially distributed, recurrently coupled, nonlinear dynamical system whose internal state transitions exhibit sustained temporal constraint independent of instantaneous external input.
Unpacked:
- Spatially distributed — state is not localized to a single point.
- Recurrent coupling — components feed back into one another.
- Nonlinear transitions — thresholds, saturation, refractoriness.
- Temporal constraint — past states reduce uncertainty about future states.
- Endogenous persistence — activity does not require continuous driving.
A RIF need not be biological.
It need not be cognitive.
It need not be representational.
It is a dynamical regime.
2. What Makes It “Informational”
“Information” here does not mean:
- Symbols
- Messages
- Semantic content
- Encoded representations
It means:
Structured reduction of uncertainty across time due to internal recurrence.
If the present state of the system constrains its own future trajectories beyond what immediate input explains, then it carries forward informational coherence.
Information is geometry in time.
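This can be made concrete with a toy sketch (not part of the defense itself): compare a purely input-driven series against a noisy endogenous oscillator, and ask how much the present state constrains the future. The squared lag correlation used below is a crude, assumed stand-in for temporal mutual information, and the specific oscillator is an illustrative choice.

```python
import math
import random

random.seed(0)

def driven(n):
    # Purely driven: each state is just the instantaneous perturbation.
    return [random.gauss(0.0, 1.0) for _ in range(n)]

def recurrent(n, omega=0.3, jitter=0.02):
    # Endogenous recurrence: a noisy phase oscillator whose present
    # phase strongly constrains its future states.
    theta, out = 0.0, []
    for _ in range(n):
        theta += omega + random.gauss(0.0, jitter)
        out.append(math.cos(theta))
    return out

def lagged_r2(series, lag=1):
    # Squared lag correlation: how much the present state reduces
    # uncertainty about the future (a crude proxy for temporal
    # mutual information).
    xs, ys = series[:-lag], series[lag:]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / len(xs)
    vx = sum((a - mx) ** 2 for a in xs) / len(xs)
    vy = sum((b - my) ** 2 for b in ys) / len(ys)
    return cov * cov / (vx * vy)

print(lagged_r2(driven(5000)))     # near 0: no coherence carried forward
print(lagged_r2(recurrent(5000)))  # high: state constrains its own future
```

The driven series carries nothing forward; the recurrent one does, even though its input (the jitter) is negligible. That gap is what "informational coherence" names here.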
3. Nonliving RIFs
Examples include:
- Reaction–diffusion chemical waves
- Cardiac oscillatory tissue
- Coupled oscillator networks
- Certain laser cavities above threshold
These systems:
- Sustain patterned activity
- Exhibit attractor structure
- Maintain temporal coherence
- Generate nontrivial mutual information across time
They are RIFs without life or mind.
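A minimal nonliving RIF fits in a few lines. The sketch below uses a Greenberg–Hastings cellular automaton, a standard toy model of excitable media: resting, excited, and refractory states with threshold conduction. Once seeded, a wave circulates the ring indefinitely with no synapses and no further input.

```python
# Greenberg–Hastings automaton on a ring: an excitable medium
# (resting=0, excited=1, refractory=2) that propagates a
# contraction-like wave with no external drive after the initial kick.

N = 40

def step(cells):
    nxt = []
    for i, c in enumerate(cells):
        if c == 1:
            nxt.append(2)      # excited -> refractory
        elif c == 2:
            nxt.append(0)      # refractory -> resting (recovery)
        else:
            left, right = cells[i - 1], cells[(i + 1) % len(cells)]
            nxt.append(1 if 1 in (left, right) else 0)  # threshold conduction
    return nxt

# Seed one excited cell with a refractory cell behind it: a directed wave.
cells = [0] * N
cells[0], cells[1] = 2, 1

for t in range(100):
    cells = step(cells)
    assert cells.count(1) == 1   # the pulse persists, undriven

print("wave position after 100 steps:", cells.index(1))  # prints 21
```

Thresholded conduction plus refractory recovery, the exact ingredients Jékely lists for the epithelium in Act I, are sufficient for sustained, directed wave propagation.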
4. Biological RIFs
In early metazoans:
- Excitable epithelial sheets
- Gap-junction conduction
- Relaxation oscillations
- Whole-body contraction waves
These constitute low-dimensional biological RIFs.
They are:
- Embedded in regulatory organisms
- Coupled to mechanics
- Viability-constrained
But still not representational.
5. Neural Networks as High-Dimensional RIFs
Neural systems add:
- Increased dimensionality
- Directed connectivity
- Synaptic plasticity
- Modular subfields
- Faster propagation
They do not introduce recurrence.
They elaborate it.
A neural network is:
A plastic, high-dimensional RIF embedded within a regulatory organism.
Representation appears when attractor geometry becomes reliably shaped by correlations with distal environmental structure.
6. What RIF Is Not
RIF is not:
- A metaphor for “mind”
- A claim of proto-consciousness
- A paninformational ontology
- A replacement for evolutionary explanation
- A denial of computation
It is a structural description of a dynamical regime.
Computation, modeling, and representation are special configurations of such regimes.
7. Why the Term Is Necessary
Without a concept like RIF, the explanatory arc jumps:
Physics → Regulation → Representation
RIF names the missing middle:
Endogenous recurrence that precedes modeling.
It prevents:
- Treating nervous systems as miraculous inventions.
- Treating information processing as ontologically primitive.
- Treating embodiment as mere hardware.
It anchors cognition in dynamical continuity.
8. Degrees of RIF
RIFs vary along dimensions:
- Dimensionality (low vs high)
- Modularity (homogeneous vs differentiated)
- Plasticity (fixed vs adaptable)
- Environmental coupling (weak vs structured)
- Temporal horizon (short vs extended)
Narrative self corresponds to:
Long-horizon modulation of a high-dimensional, plastic RIF.
But the underlying regime is continuous.
9. Final Clarification
RIF is not a substance.
It is not “an information field” layered atop matter.
It is:
Matter organized into self-sustaining recurrent constraint over time.
When such organization deepens, stabilizes, and becomes sculpted by evolution, we call it neural.
When its geometry tracks distal structure, we call it representational.
When it modulates itself across narrative timescales, we call it mind.
The rhythm was first.
RIF names the rhythm once it becomes structured enough to endure.
Appendix III — RIF vs. HASH
This appendix clarifies the distinction between Hierarchical Adaptive Systems with Hysteresis (HASH) and Recurrent Information Fields (RIF).
They are not competitors.
They name different structural features of organized systems.
But confusing them obscures the Big Inversion.
1. HASH: Constraint Across Levels
A HASH is:
A hierarchical adaptive system in which bulk states are compressed through bottlenecked interfaces and exhibit hysteresis under constraint.
Core properties:
- Multi-level organization
- Bottlenecked coupling between levels
- State-dependence (hysteresis)
- Adaptive regulation toward viability
HASH explains:
- Homeostasis
- Drive conflict
- Regulatory tension
- Constraint management
Its logic is:
Stability through tension resolution.
Plants are HASH systems.
Metabolic networks are HASH systems.
Organisms are HASH systems.
HASH is about regulation.
2. RIF: Coherence Across Time
A RIF is:
A recurrent, spatially distributed nonlinear system whose internal dynamics sustain structured temporal constraint independent of immediate input.
Core properties:
- Recurrent coupling
- Endogenous activity
- Attractor geometry
- Temporal coherence
RIF explains:
- Oscillation
- Wave propagation
- Limit cycles
- Sustained internal pattern
Its logic is:
Stability through recurrence.
A chemical oscillator can be a RIF.
An excitable epithelium can be a RIF.
A neural network is a high-dimensional RIF.
RIF is about generativity.
3. The Structural Difference
HASH manages constraint between levels.
RIF generates constraint within dynamics.
HASH resolves tensions.
RIF sustains patterns.
HASH explains viability.
RIF explains endogenous coherence.
One regulates.
One hums.
4. Why They Are Not the Same
A regulatory system can be a HASH without being a RIF.
Example:
- A thermostat.
- Hormonal feedback loops maintaining a setpoint.
- Plant tropisms.
These systems adapt and regulate but do not necessarily generate self-sustaining patterned dynamics.
Conversely, a RIF can exist without being a HASH.
Example:
- A chemical reaction–diffusion oscillator.
- A laser cavity above threshold.
These generate structured recurrence but do not regulate viability across hierarchical bottlenecks.
The concepts cut across each other.
5. Where They Intersect in Life
Biological organisms embed RIFs inside HASH structures.
- Metabolism maintains viability (HASH).
- Excitable tissues generate oscillation (RIF).
- Neural systems elaborate RIF within organismic constraint (HASH + RIF).
In evolution:
Regulation comes first.
Recurrence thickens within regulation.
Neural differentiation stabilizes recurrence.
The intersection becomes fertile.
6. The Big Inversion Depends on the Distinction
If RIF collapses into HASH, then oscillation reduces to regulation.
If HASH collapses into RIF, then viability reduces to dynamical coherence.
Both are mistakes.
The inversion requires:
- HASH to explain why systems persist.
- RIF to explain how systems generate patterned internal dynamics.
- Neural systems to explain how recurrence becomes sculpted.
- Representation to explain stabilized distal correlation.
Remove the distinction and the explanatory arc flattens.
7. Summary Contrast
HASH
- Hierarchical
- Adaptive
- Hysteretic
- Regulatory
- Cross-level constraint
RIF
- Recurrent
- Endogenous
- Attractor-structured
- Generative
- Temporal constraint
HASH answers:
How does the organism remain viable?
RIF answers:
How does the organism sustain internal dynamical coherence?
Mind emerges not from HASH alone.
Not from RIF alone.
But from RIF stabilized and sculpted within HASH.
The body regulates.
The field recurs.
The nervous system deepens the recurrence.
The narrative self modulates the geometry.
That is the architecture.
Appendix IV — Computational vs. Biological Neural Nets
This appendix clarifies a common confusion:
If neural systems are high-dimensional recurrent information fields (RIFs), how do biological neural nets differ from computational ones?
The answer is not “one computes and one doesn’t.”
The difference is ontological embedding.
1. Shared Dynamical Core
Both biological and artificial neural networks share:
- Distributed state
- Nonlinear activation
- Recurrent coupling (in many architectures)
- Attractor geometry
- High-dimensional state-space
Both can:
- Stabilize trajectories
- Form basins of attraction
- Exhibit temporal coherence
- Support pattern completion
In dynamical terms, both are RIFs.
This continuity is real.
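One shared property, pattern completion via attractor basins, can be illustrated with a toy Hopfield network (an illustrative choice, not a model endorsed by the thesis): a corrupted state falls back into the basin of the stored pattern.

```python
def train(patterns):
    # Hebbian weights, no self-connections: w[i][j] = sum_p p_i * p_j.
    n = len(patterns[0])
    return [[0 if i == j else sum(p[i] * p[j] for p in patterns)
             for j in range(n)] for i in range(n)]

def settle(w, state, sweeps=5):
    # Asynchronous sign updates: the state slides into the nearest
    # attractor basin.
    n = len(state)
    for _ in range(sweeps):
        for i in range(n):
            h = sum(w[i][j] * state[j] for j in range(n))
            state[i] = 1 if h >= 0 else -1
    return state

pattern = [1, 1, 1, 1, -1, -1, -1, -1]
w = train([pattern])

noisy = pattern[:]
noisy[0], noisy[5] = -noisy[0], -noisy[5]   # corrupt two of eight bits
print(settle(w, noisy) == pattern)          # prints True
```

In the vocabulary of the defense: the stored pattern is a basin of attraction, and "recognizing" the corrupted input is nothing over and above falling into it.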
2. The Key Difference: Embedding
Biological Neural Nets
- Embedded in a regulatory organism (HASH)
- Coupled to metabolism
- Constrained by survival
- Shaped by evolutionary selection
- Plastic under viability pressure
- Coupled to mechanics and viscera
Their recurrence is:
Viability-sculpted.
Computational Neural Nets
- Embedded in engineered hardware
- Coupled to optimization objectives
- Constrained by training loss
- Shaped by human-designed architectures
- Plastic under gradient descent
- Detached from metabolic regulation
Their recurrence is:
Objective-function sculpted.
3. Recurrence in Context
In biological systems:
Recurrence is not optional.
It must:
- Coexist with metabolic stability
- Avoid runaway excitation
- Maintain organism-level coherence
- Integrate across timescales
In artificial systems:
Recurrence is instrumental.
It exists only insofar as it improves performance relative to a defined task.
Remove the task and the network has no endogenous reason to persist.
4. Endogeneity
Biological neural RIFs:
- Exhibit spontaneous activity
- Maintain ongoing baseline dynamics
- Are perturbed by input rather than activated by it
Artificial networks:
- Typically activate only when prompted
- Do not maintain autonomous viability
- Have no metabolic self-preservation
The difference is not complexity.
It is existential coupling.
5. Selection vs Optimization
Evolution shapes biological neural geometry by:
- Differential survival
- Reproductive success
- Energetic efficiency
- Developmental constraints
Training shapes computational geometry by:
- Loss minimization
- Backpropagation
- Dataset structure
- Designer intent
Both sculpt attractor landscapes.
But only one is embedded in organismic viability.
6. Representation
Both systems can exhibit stable internal states correlated with external structure.
In artificial networks:
- Correlation is defined relative to labeled datasets.
- “Representation” is task-relative.
In biological systems:
- Correlation emerges under survival pressure.
- Representational stability must serve embodied viability.
Representation in biology is not just task success.
It is trajectory viability.
7. Where the Analogy Holds
The analogy between artificial and biological neural networks is valid at the level of:
- High-dimensional recurrence
- Attractor geometry
- Pattern stabilization
- Constraint propagation
This is why artificial networks can model biological cognition to some degree.
They share the RIF structure.
8. Where the Analogy Breaks
The analogy breaks at:
- Autonomy
- Embodiment
- Regulatory embedding
- Multi-timescale viability
- Developmental self-organization
Artificial networks are RIFs without bodies.
Biological networks are RIFs within HASH-regulated organisms.
9. The Inversion Applied
The computational metaphor says:
Biological brains are computers.
The inversion says:
Computers are engineered RIFs.
They approximate certain properties of biological RIFs, but lack:
- Metabolic embedding
- Evolutionary sculpting
- Intrinsic viability constraints
- Narrative-scale modulation
The resemblance is structural, not ontological.
10. Final Distinction
A biological neural network is:
A high-dimensional, plastic RIF embedded in a regulatory organism and sculpted by evolutionary viability.
A computational neural network is:
A high-dimensional, engineered RIF sculpted by optimization under designer-imposed objectives.
Both hum.
Only one must survive.
Appendix V — Endogenous Activity
Endogenous activity is the fulcrum of the inversion.
If it is secondary, the standard computational story stands.
If it is primary, the architecture flips.
This appendix clarifies what is meant — and what is not.
1. Minimal Definition
Ongoing, self-sustaining dynamical activity generated by the system’s internal recurrence rather than by immediate external input.
It is not noise.
It is not random fluctuation.
It is not background filler.
It is structured persistence.
2. The Standard Assumption
The dominant computational framing assumes:
- Input drives activity.
- The system responds.
- Baseline activity is preparatory or stochastic.
In this view, the brain is fundamentally reactive.
Spontaneous activity is secondary.
3. The Inverted Assumption
The inversion claims:
- Ongoing activity is primary.
- Input perturbs an already active field.
- Neural systems do not wait to be activated.
The field hums.
Stimuli modulate the hum.
This is consistent with empirical observation:
- Resting-state networks
- Spontaneous cortical oscillations
- Intrinsic large-scale dynamics
- Pre-stimulus state dependence
The nervous system is not quiet until stimulated.
4. Why This Matters
If endogenous activity is primary:
- Cognition is modulation, not reaction.
- Perception is constraint reshaping, not encoding.
- Action emerges from biased trajectory selection in an ongoing field.
- Prediction is structured continuation, not symbol manipulation.
The geometry precedes the content.
5. Evolutionary Implication
Excitable epithelia:
- Oscillate without input.
- Generate whole-body contraction waves.
- Maintain temporal coherence intrinsically.
Neurons do not invent endogenous activity.
They elaborate and refine it.
Evolution favors:
- Stabilization of recurrence.
- Modulation of ongoing dynamics.
- Increased dimensionality of internal coherence.
Input becomes increasingly precise perturbation.
6. Distinguishing Endogenous from Driven Systems
A purely driven system:
- Remains quiescent without input.
- Exhibits no structured persistence.
- Shows no attractor regime independent of stimulus.
A system with endogenous activity:
- Maintains attractor structure.
- Exhibits spontaneous state transitions.
- Has nontrivial temporal mutual information independent of input.
Endogeneity marks the transition from reaction to generation.
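The contrast can be sketched numerically. A minimal, illustrative comparison (all parameter choices are ours, not the thesis's): a linearly damped oscillator with zero input decays to quiescence, while a van der Pol relaxation oscillator, also with zero input, settles onto a self-sustained limit cycle.

```python
import numpy as np

def integrate(accel, x0=2.0, v0=0.0, dt=0.01, steps=30000):
    """Forward-Euler integration of x'' = accel(x, x'), with NO external input."""
    x, v = x0, v0
    xs = np.empty(steps)
    for i in range(steps):
        a = accel(x, v)       # acceleration from the current state only
        x += dt * v
        v += dt * a
        xs[i] = x
    return xs

# Purely driven system: a damped oscillator with zero input decays to rest.
driven = integrate(lambda x, v: -0.3 * v - x)

# Endogenous system: the van der Pol relaxation oscillator, still zero input,
# whose intrinsic nonlinear damping sustains a limit cycle ("the field hums").
endo = integrate(lambda x, v: 1.5 * (1 - x * x) * v - x)

print(round(np.abs(driven[-3000:]).max(), 2))  # ~0: quiescent without input
print(round(np.abs(endo[-3000:]).max(), 1))    # ~2: self-sustained rhythm
```

The oscillator's amplitude is set by its own dynamics, not by any stimulus: start it anywhere and it converges to the same limit cycle.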
7. Endogenous Does Not Mean Isolated
Endogenous activity is not isolation from the environment.
It means:
The system maintains its own dynamical structure while being perturbed by the environment.
The environment does not create the pattern.
It shapes it.
8. Relationship to RIF
A RIF requires endogenous activity.
Without recurrence and self-sustained constraint, there is no informational coherence.
Endogenous activity is the energetic condition.
RIF is the structural consequence.
9. Relationship to HASH
HASH explains viability and regulatory constraint.
Endogenous activity operates within that viability envelope.
If recurrence destabilizes viability, selection prunes it.
If recurrence stabilizes useful trajectories, selection sculpts it.
Endogenous activity must coexist with regulation.
10. The Big Shift
If endogenous activity is primary:
- The brain is not a passive processor.
- Perception is not input-driven construction.
- Cognition is not layered atop reaction.
Instead:
The organism is already in motion.
Neural systems deepen and sculpt that motion.
Representation becomes structured continuation.
Narrative self becomes long-horizon modulation of ongoing recurrence.
The first pulse was not response.
It was self-sustained rhythm.
Everything else differentiates from that.
Appendix VI — From Geometry to Information
This appendix addresses the most delicate move in the thesis:
How does one move from attractor geometry to information without smuggling semantics in?
If this step fails, the inversion collapses.
1. Start with Geometry, Not Meaning
A recurrent dynamical system defines:
- A state-space
- Trajectories through that space
- Attractors (fixed points, limit cycles, manifolds)
- Basins of attraction
This is purely geometric.
No content.
No aboutness.
No representation.
Just constraint.
2. Constraint Is Already Informational
In information-theoretic terms:
- Unconstrained systems maximize entropy.
- Constrained systems reduce uncertainty about future states.
If knowing the present state reduces uncertainty about future states, then temporal mutual information exists.
No semantics required.
Geometry becomes informational when:
Trajectories are restricted by attractor structure.
Information is structured constraint over time.
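The claim that constraint alone yields information can be made exact in a toy case. A sketch, assuming a symmetric two-state Markov chain with stay-probability `stay` (our construction): its stationary distribution is uniform, so the temporal mutual information reduces to I(X_t; X_{t+1}) = 1 − h(stay) bits, where h is the binary entropy.

```python
from math import log2

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def temporal_mi(stay):
    """I(X_t; X_{t+1}) for a symmetric two-state Markov chain with
    stay-probability `stay`. The stationary distribution is uniform, so
    I = H(X_{t+1}) - H(X_{t+1} | X_t) = 1 - h(stay) bits."""
    return 1.0 - h(stay)

print(temporal_mi(0.5))            # unconstrained (fair-coin dynamics): 0.0 bits
print(round(temporal_mi(0.9), 3))  # constrained, recurrent dynamics: 0.531 bits
```

No semantics enters anywhere: the information is purely the degree to which the present state constrains the next one.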
3. Endogenous Recurrence Deepens Constraint
In a recurrent system:
- The present state depends on prior states.
- Certain regions of state-space become stable.
- Others become inaccessible.
This yields:
- Predictability
- Persistence
- Structured continuation
Predictability is informational reduction of uncertainty.
The attractor is informational geometry.
4. From Self-Constraint to Environmental Correlation
So far, information is purely internal.
Representation enters only when:
- External perturbations reliably bias the system toward specific basins.
- Occupation of those basins covaries with distal environmental conditions.
- The correlation stabilizes over time.
Now geometry tracks the world.
Not symbolically.
Not propositionally.
At this point, describing a basin as “about” something becomes pragmatically useful.
Aboutness is stabilized correlation.
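A toy illustration of stabilized correlation (our construction, with illustrative parameters): a double-well system whose basin occupation comes to covary with a binary stimulus, because the stimulus tilts the landscape rather than encoding anything into it.

```python
import numpy as np

rng = np.random.default_rng(0)

def settle(stimulus, bias=0.3, dt=0.05, steps=400, noise=0.05):
    """Bistable system x' = x - x^3 + bias*stimulus + noise.
    The stimulus does not 'encode' a message; it tilts the double-well
    landscape so trajectories are reliably captured by one basin."""
    x = 0.0
    for _ in range(steps):
        drift = x - x**3 + bias * stimulus
        x += dt * drift + noise * np.sqrt(dt) * rng.standard_normal()
    return np.sign(x)   # which basin (+1 or -1) the trajectory ended in

stimuli = rng.choice([-1.0, 1.0], size=200)
basins = np.array([settle(s) for s in stimuli])
print((basins == stimuli).mean())   # basin occupation covaries with the stimulus
```

The correlation between stimulus and basin is all there is: calling the right-hand basin "about" the positive stimulus is a pragmatic description of that covariation, not a new ingredient.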
5. No New Ontology Appears
The move from geometry to information does not introduce:
- Symbols
- Encoded messages
- Inner pictures
- Computational tokens
It introduces:
- Cross-constraint between internal and external dynamics.
Information is not a substance.
It is relational constraint.
6. Representation as Attractor Shaping
In a high-dimensional RIF:
- Plasticity sculpts basin depth.
- Selection sculpts basin structure.
- Experience sculpts basin accessibility.
When basin occupation reliably covaries with distal structure:
Representation is a stable mapping between geometry and environment.
Representation is geometry under correlation pressure.
7. Why This Is an Inversion
The standard view:
Information → Representation → Neural computation → Behavior.
The inverted view:
Geometry → Constraint → Correlation → Information → Representation.
Information is not primitive.
It condenses from stabilized geometry.
8. A Concrete Example
Consider a neural network basin corresponding to a visual category.
Standard framing:
- The network encodes the category.
- It computes a representation.
Geometric framing:
- The network’s recurrent dynamics define a basin.
- Certain sensory perturbations bias trajectories into that basin.
- Occupation of that basin reliably covaries with distal structure.
- The mapping stabilizes through training or evolution.
The basin is informational because it constrains future states.
It is representational because it covaries with the world.
No extra ingredient was added.
Only geometry under selection.
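The example can be sketched with a small Hopfield-style network (our illustrative choice of architecture and parameters): Hebbian outer products carve basins around stored patterns, and the recurrent dynamics, not any decoder, pull a perturbed probe back toward the stored pattern.

```python
import numpy as np

rng = np.random.default_rng(1)

# Store two random +/-1 patterns via Hebbian outer products.
n = 64
patterns = rng.choice([-1, 1], size=(2, n))
W = (patterns.T @ patterns).astype(float) / n
np.fill_diagonal(W, 0.0)   # no self-connections

def recall(state, steps=20):
    """Synchronous recurrent updates: the dynamics pull the state
    into the nearest stored basin."""
    s = state.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1
    return s

# Perturb a stored pattern (flip ~15% of units) and let recurrence settle.
probe = patterns[0].astype(float)
flip = rng.choice(n, size=10, replace=False)
probe[flip] *= -1
settled = recall(probe)
print((settled == patterns[0]).mean())   # overlap with the stored pattern
```

Nothing in the weight matrix is a symbol for the pattern; the basin exists as geometry, and only the reliable covariation between probes and settled states makes "recall" an apt description.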
9. The Critical Threshold
The transition from mere geometry to information requires:
- Persistence across time
- Recurrence
- Structured attractors
- Reliable correlation
Without recurrence, geometry dissipates.
Without correlation, information remains internal.
Without stability, representation fails.
10. Final Compression
Geometry constrains trajectories.
Constraint reduces uncertainty.
Reduced uncertainty is information.
Stabilized environmental correlation yields representation.
No magic step occurs.
Only increasing structure.
The body hums.
The hum stabilizes.
The stabilized hum begins to track the world.
That tracking is information.
That information, under narrative timescales, becomes mind.