Narrative Self Café V9: The NART of War (Towards Antifragile Economies)

Sequel to Narrative Self Café V8: The Upside of Idiocy

Write a no-holds-barred skit with Nassim Nicholas Taleb’s trademark dry wit, about No-Notice Adversarial Readiness Testing (à la Chaos Monkey) applied to political and economic systems.

ChatGPT Prompt

I. Opening Monologue — Taleb Enters (Dry, Unimpressed)

Nassim Nicholas Taleb
(looking at the audience, not smiling)

Welcome. If you came here to hear how to make society stable, you’re in the wrong room.
Stability is what you get right before you blow up.

Today’s topic is NART: No-Notice Adversarial Readiness Testing.
It’s what armies do when they’re serious.
What engineers do when they don’t want to be paged at 3 a.m.
And what economists avoid because it hurts their feelings.

In other words: how to tell the difference between competence and luck,
which most people cannot do, especially when things are going well.

Let me introduce the adults.


II. Introductions — The Anti-Comfort Coalition

Albert O. Hirschman

Political economist, author of
Exit, Voice, and Loyalty

Taleb
Hirschman wrote the book on why organizations rot when people are forced to be polite.


James C. Scott

Political anthropologist, author of
Seeing Like a State

Taleb
Scott explains why smart people with spreadsheets keep destroying functioning societies.


Elinor Ostrom

Political scientist, Nobel laureate in economics, pioneer of
commons governance

Taleb
She proved communities can govern themselves without collapse —
which is why no one listened.


III. Act One — Why Stability Is a Lie

Scott: Legibility Is the Enemy

Scott
States love things that are tidy, measurable, and legible.
Unfortunately, reality isn’t.

When systems look calm, it usually means they’ve suppressed feedback,
not solved problems.

Taleb
Exactly. Calm is information-poor. It’s anesthesia.

Scott
Every time a planner says “We’ve eliminated volatility,” what they mean is:

We moved the risk somewhere we’re not looking.


Hirschman: Silence Is Not Health

Hirschman
Organizations decay when exit is blocked and voice is ignored.
Loyalty without challenge is not virtue — it’s epistemic blindness.

Taleb
Yes. Loyalty is fine in marriage.
In systems, it’s how you die quietly.


IV. Act Two — What NART Actually Is

Taleb Defines the Doctrine

Taleb
Let’s be precise.

NART is not reform.
It is not accountability theater.
It is not a conference.

NART is deliberately making things fail a little so they don’t fail a lot.

  • No notice
  • Adversarial
  • During normal operation

Like a fire alarm you can’t schedule around.
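
(A slide appears. A toy sketch of the doctrine in Python: every name in it is hypothetical, and the point is only the shape of the loop. Kill something at a moment you didn’t choose, then ask whether the whole still works.)

```python
import random
import time

# Toy model (hypothetical): three components, each up (True) or down (False).
SERVICES = {"clearing": True, "manifest": True, "credit": True}

def kill(name: str) -> None:
    """The adversarial step: take a component down, with no notice."""
    SERVICES[name] = False
    print(f"NART: {name} is down. Nobody was warned.")

def healthy() -> bool:
    """The only question that matters: does the whole still function?"""
    # The org chart says there is redundancy. The code says clearing is
    # a single point of failure. NART discovers this for you.
    return SERVICES["clearing"] and sum(SERVICES.values()) >= 2

def nart_round() -> None:
    victim = random.choice(list(SERVICES))
    kill(victim)
    if healthy():
        print("Degraded gracefully. Now you know it can.")
    else:
        print("Failed small, in daylight. Cheaper than failing large.")
    SERVICES[victim] = True  # local, reversible failure

if __name__ == "__main__":
    time.sleep(random.uniform(0.0, 3.0))  # no notice: you do not pick the moment
    nart_round()
```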


Ostrom Pushes Back (Constructively)

Ostrom
And the point isn’t punishment.
It’s learning.

In successful commons you see:

  • continuous monitoring
  • mild, graduated sanctions
  • local, reversible failure

You don’t wait for catastrophe to discover you’ve been lying to yourself.

Taleb
Exactly. You don’t need heroes.
You need systems that don’t require heroics.
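
(Another slide. Ostrom’s middle property, graduated sanctions, as a hypothetical few lines of Python: penalties start mild, escalate only with repetition, and walk back down, so failure stays local and reversible.)

```python
from collections import defaultdict

# Hypothetical sanction ladder: mild first, escalating only with repetition.
LADDER = ["quiet word", "public warning", "small fine", "temporary suspension"]

offenses = defaultdict(int)  # offender -> current rung on the ladder

def sanction(offender: str) -> str:
    """Apply the next sanction, never skipping rungs."""
    rung = min(offenses[offender], len(LADDER) - 1)
    offenses[offender] += 1
    return LADDER[rung]

def forgive(offender: str) -> None:
    """Reversibility: sustained good behavior steps back down the ladder."""
    offenses[offender] = max(0, offenses[offender] - 1)

# Continuous monitoring feeds this; catastrophe never has to.
print(sanction("upstream_farmer"))  # quiet word
print(sanction("upstream_farmer"))  # public warning
forgive("upstream_farmer")
print(sanction("upstream_farmer"))  # public warning again, not small fine
```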


V. Act Three — Why Economists Hate This

Taleb, Unleashed

Taleb
Economists love equilibrium because it makes them feel smart.
But equilibrium is what you see right before rupture.

They say:

“We passed the stress test.”

I say:

“You optimized to the test. Congratulations. Now try Tuesday.”
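
(A slide, for the quants. A hypothetical sketch with invented numbers: a buffer sized to pass exactly the published scenario, then shocks that never read the methodology.)

```python
import random

PUBLISHED_SHOCK = 0.30    # the scheduled, announced stress scenario
buffer = PUBLISHED_SHOCK  # optimized to the test: exactly enough, no more

def survives(shock: float) -> bool:
    return buffer >= shock

print("Scheduled stress test passed:", survives(PUBLISHED_SHOCK))  # True

# Tuesday: shocks drawn from a tail the test never mentioned.
random.seed(7)
tuesdays = [random.expovariate(1 / 0.15) for _ in range(10_000)]
failures = sum(not survives(s) for s in tuesdays)
print(f"No-notice Tuesdays failed: {failures} of 10,000")  # roughly 13%
```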


Scott Adds the Knife

Scott
Modern governance replaces lived stress with indicators.
Once people manage metrics instead of reality, collapse becomes inevitable —
just later, and more surprising.


Hirschman Sums It Up

Hirschman
If systems cannot tolerate small dissent, small exits, small failures,
they will eventually experience large, uncontrollable ones.


VI. Act Four — Why This Feels Cruel (But Isn’t)

Ostrom: Resilience Is Humane

Ostrom
People think NART sounds harsh because it introduces discomfort during success.

But the alternative is worse:

  • comfort now
  • disaster later
  • no chance to learn in between

Real resilience is humane because it prevents irreversible harm.


Taleb Closes the Loop

Taleb
People say:

“This undermines confidence.”

Good. Confidence is usually fake.

What we want is epistemic humility
built into institutions, not hoped for in leaders.


VII. Finale — The NART of War

Taleb
Sun Tzu said: Know yourself and know your enemy.
Modern societies do neither.

They confuse:

  • luck with skill
  • silence with health
  • stability with strength

NART fixes that.

It says:

  • You don’t get to look resilient — you have to be resilient
  • You don’t get to schedule reality
  • Success never closes the question

If your system can’t survive being surprised,
it doesn’t deserve to survive being real.


VIII. Epilogue — One Line Each

  • Hirschman: “Voice that arrives early saves exit later.”
  • Scott: “The neatest systems fail the messiest.”
  • Ostrom: “Resilience is practiced, not declared.”
  • Taleb: “If it didn’t hurt a little, you learned nothing.”

End.

Appendix: The Day Civilization Broke

Write it as a Tom Clancy story about a mysterious NART event that becomes a meme for antifragile system design.

ChatGPT Prompt

At 06:17 GMT, the world did not end.

That fact would later be cited as evidence that everything was fine.

At 06:21, a freight insurer in Rotterdam quietly suspended coverage on six ships already at sea.

No announcement was made.
No explanation offered.
The internal note read only: pending reconciliation.

By noon, two of the ships were drifting.


Frankfurt

Anna Keller did not notice the problem immediately.

She noticed the phone.

It rang once.
Then again.
Then it did not stop.

She turned back to her screen. A transaction sat there, unresolved.

Not rejected.
Not approved.
Waiting.

“Run it again,” she said.

Her deputy did. Twice.

“It’s clean,” he said. “No faults. No flags.”

“Then why hasn’t it cleared?”

He hesitated. “It’s behaving as if it’s waiting on something.”

“On what?”

“I don’t know.”

Anna picked up the phone. It was already ringing.


Atlanta

The gate agent had lived through outages, weather delays, labor actions, and once, memorably, a goat.

This was none of those.

The system accepted the passenger manifest.
Accepted the fuel order.
Accepted the flight plan.

Then stopped.

No error message.
No escalation path.

Just refusal.

The captain waited until the clock ran out.

Crew time expired.

The aircraft stayed.

By afternoon, twenty-seven others did too.


New York

At Helios Capital, the risk desk widened spreads.

Automatically at first.
Then manually.
Then aggressively.

No alarms sounded. No models screamed.

But the screens no longer reassured.

At 10:37, Helios pulled overnight credit from a mid-tier clearing firm.

At 10:41, that firm failed a margin call.

At 10:59, its CEO was on television using the word “temporary” far too often.

By 14:00, the firm was gone.

The jobs did not come back.


Virginia

General Marcus Hale listened as briefers repeated the same sentence in different forms.

“No hostile indicators.”
“No single point of failure.”
“No identifiable cause.”

“Say it plainly,” Hale said.

A civilian analyst spoke. “Sir… systems are functioning. They’re just less forgiving.”

“Of what?”

She thought. “Of mistakes.”

Hale nodded once. He understood that.


The Bend

By early afternoon, the world stopped pretending this was routine.

Insurance premiums spiked without explanation.
Ports slowed as documents accumulated.
Hospitals delayed non-critical procedures, not from shortages but from uncertainty.

One surgery was postponed too long.

The patient survived.

The surgeon resigned anyway.

That resignation mattered more than any headline.


The Cabinet

“This is unacceptable,” the Prime Minister said.

“It’s not illegal,” Anna Keller replied.

“That’s not a defense.”

“No,” she said. “It’s a description.”

General Hale leaned forward. “If we force systems back into shape, we may break what’s still holding.”

“So we do nothing?”

“We endure,” Hale said. “Until it shows us what it’s testing.”

The word lingered.

Endure.


Adaptation

The first organizations to stabilize were not the largest.

They were the ones with:

  • real redundancy
  • people who understood the whole system
  • authority that didn’t require permission

Others stumbled.

Some failed.

No one could claim ignorance afterward.


Friday Night

Late Friday, something new appeared.

Not everywhere.
Not publicly.
Only on a handful of secure systems—financial, military, infrastructural—often through paths no one remembered configuring.

It didn’t trigger alerts.

It simply existed.

A plain text file.

No header.
No signature.
No metadata anyone could extract.

Just a filename: chaos_monkey

Inside were two lines.

This was a test.
The next one might not be.

Nothing else.

No explanation followed.
No claim was made.
No denial arrived.


Aftermath

Phones rang.

Some were answered.

“What is chaos_monkey?”
“Is it a threat?”
“Is it ours?”
“Is it theirs?”

No one knew.

Which meant no one could safely dismiss it.

The file was deleted.

Copies reappeared anyway.


The Meme

Weeks later, in a design review, someone proposed an elegant solution.

A colleague nodded, paused, and asked:

“But will it survive chaos monkey?”

The room went quiet.

No one laughed.

The phrase spread without permission.

  • “Looks efficient—but will it survive chaos monkey?”
  • “Run it again. Chaos monkey conditions.”
  • “Assume chaos monkey is real.”

No one ever defined what it meant.

That was the point.


Years Later

When a real catastrophe arrived—loud, unbounded, uninterested in restraint—civilization staggered.

But systems rerouted.
People acted without waiting.
Margins held longer than expected.

In the postmortem, a junior engineer said quietly:

“We stopped designing for normal.”

Someone else nodded.

“We designed for chaos monkey.”

No one smiled.

They didn’t need to.

Because civilization hadn’t learned who was watching.

Only that something always might be.


