CMD: READ_NODE // 2025.12.17

HOMO ZOMBIUS: THE PEACE SYNDROME

Author: ParisNeo
Genre: Psychological Dystopia / Cyberpunk Noir
Protagonist: Arthur Vance, Class A Sensory Architect.

CHAPTER 1: THE SKIN OF SILENCE

I woke up with the taste of metal in my mouth, even though I knew it was impossible. The Sleep Interface guaranteed neutral breath and optimal hydration upon waking. But the sensation persisted, an aftertaste of copper and static electricity, as if I had bitten into a high-voltage cable during a nightmare I wasn’t allowed to remember.

I opened my eyes.

The world was gray. A washed-out, dirty, grainy gray. The ceiling of our 42nd-floor apartment showed a crack I hadn’t seen in years. A patch of yellowish dampness spread near the cornice.

Panic. A surge of raw, animal adrenaline.
My hands frantically patted the smooth composite nightstand. Where were they? My heart was beating too fast, a chaotic drum in a chest accustomed to metronomic regulation.

My fingers met the familiar cold of graphene. The Vision-X glasses.
I put them on with the urgency of a heroin addict seeking his vein.

Click.

The gray vanished. The crack in the ceiling was instantly covered by a digital fresco of the Italian Renaissance, painted with soft, soothing light. The damp patch became a virtual Virginia creeper, artistically climbing the wall. Ambient music—Satie revisited by an algorithm—began to flow directly into my cranial bones, instantly calming my tremors.

A line of golden text floated before my retina:
“Good morning, Arthur. Sleep quality: 98%. Cortisol levels rising. Suggestion: Box breathing for 30 seconds.”

I obeyed. I inhaled following the blue bar filling up in my field of vision, held my breath when it turned red, exhaled when it turned green. I was no longer a panicked animal. I was Arthur Vance. I was civilized. I was connected.

I turned toward Clara.
Without the glasses, I knew what she looked like in the morning: messy hair, slightly oily skin, circles under her eyes. The human, in her biological crudity.
But through my lenses, she was sublime. The AI applied a “Spring Morning” filter in real-time. Her skin glowed with a pearlescent sheen. Her hair formed a perfectly structured golden cascade. Even her breath, which I couldn’t smell, was suggested by small glittering particles floating in the virtual air.

A marker flashed above her head: REM Cycle complete. Awakening imminent. Probability of sexual desire: 12%. Probability of verbal affective need: 89%.

The interface provided the script. Three phrase options appeared before my eyes, ranked by estimated success rate.

  1. “You look beautiful this morning.” (Classic, 70% efficiency)
  2. “I dreamt of us on Cobalt Beach.” (Romantic, 85% efficiency)
  3. “Coffee is ready, my love.” (Pragmatic, 60% efficiency)

I chose number 2. I hadn’t dreamt of the beach. I hadn’t dreamt at all, because the AI suppresses dreams to optimize cerebral rest. But my mouth spoke the words with an intonation that my glasses’ vocal processor adjusted to sound deeper, more virile.

Clara opened her eyes. Her own glasses lit up with a pale blue. She smiled at me.
— Is that true? she whispered. Was the water warm?

She wasn’t asking if I loved her. She was asking if the simulation of my dream was pleasant.
— Perfect, I lied. Just like you.

A small silent “ding” resonated in my head. Dopamine Reward granted. I felt the small discharge of chemical pleasure diffuse from the implant behind my ear. We smiled at each other, two strangers connected by a remote server, in love with a fantasized version of the other.

CHAPTER 2: THE ARCHITECT OF LIES

I am not a mere victim of the system. I am one of its masons.
I work for Aether Corp, in the Department of Emotional Reality. My job? I am a “Texture Architect.”

When you walk down the street and see a blossoming tree where there is only a rusted utility pole, that is my code. When you eat a tasteless protein paste bar and your brain registers the crispy texture of a butter croissant, that is my work. I design the lie that covers the world.

That morning, I was walking toward the Aether Tower. The street was crowded, but the silence was absolute. It was the most frightening thing, if one took the time to think about it. Thousands of people pressed together on the sidewalks, but one could only hear the friction of rubber soles on concrete. No conversations. No laughter. No arguments.

Why speak?
If I passed a colleague, our glasses instantly exchanged our data: mood, tasks accomplished, recent purchases. An entire ten-minute conversation was compressed into a data packet exchanged in 0.02 seconds. We gave each other a small nod, and we felt as though we’d had a deep discussion. It was efficient. It was deadly.

Suddenly, a red warning flashed in my peripheral vision.
“Anomaly detected. Sector 7. Do not look.”

Prohibition is a powerful magnet. Despite the gentle injunction of the voice in my head suggesting I watch an advertisement for virtual holidays on Mars, I turned my head to the left.

There was a woman.
She was on her knees in the middle of the sidewalk. She had done the unthinkable: her glasses lay on the ground, shattered.
Around her, the crowd parted with a liquid fluidity, guided by avoidance algorithms, like water flowing around a rock. No one looked at her. Their augmented reality filters were likely blurring her out, replacing her with a “Work in Progress” texture or a decorative bush so as not to disturb their serenity.

But I am an Architect. I have Level 4 clearance. I can see through certain layers of the “Veil.”

I saw her face. It was twisted in a grimace of pure terror. She was screaming, but no sound seemed to pierce the bubble of silence in the street. She grabbed the leg of a passerby, a man in a gray suit. The man didn’t even stop; he kept walking, dragging the woman for a few meters before she lost her grip. To him, she didn’t exist. His haptic system had probably canceled the sensation of her hand on his leg.

I stopped.
“Arthur,” said my AI’s voice, firmer this time. “Your heart rate is accelerating. This is not relevant. Continue on your route. A meeting begins in 4 minutes.”

The woman looked up at me. Our eyes met. They were brown eyes, bloodshot, raw, alive. The eyes of a hunted animal.
— Help me, she mouthed without sound. It’s empty. Everything is empty.

A shiver, a real one, ran down my spine.
Before I could make a move, two “Civic Maintenance” drones descended from the sky, silent as raptors. They carried no weapons, just hypodermic needles. In a second, they were on her. A prick in the neck. She collapsed, limp.

Instantly, a holographic image of a public bench appeared over her inert body and the drones. For the rest of the world, there had been nothing but a slight graphical glitch.

I stood frozen. I had designed textures to cover garbage, ruined buildings, polluted skies. But I had never realized that we also designed textures to cover people.

— Arthur?
Clara’s voice resonated in my head (internal voice call).
— The system tells me you have a stress spike. Do you want me to send you a picture of a kitten?

I felt like vomiting.
— No, I thought (the system transcribed my thought into text for her). Everything is fine. Just a display bug.
— Okay, darling. Work well.

I resumed walking. But for the first time in ten years, I felt the weight of the glasses on my nose. They no longer seemed light. They weighed a ton.

CHAPTER 3: THE DISSONANCE

The incident with the broken woman should have been erased from my buffer memory by the nocturnal cleaning cycle. That is standard procedure: we forget what hurts us. That is the pact.

But that evening, I didn’t go home right away.
I disabled “Optimal Guidance,” claiming a desire to wander to stimulate my creativity (an excuse accepted by the work algorithm for creative profiles).

I found myself in the slums of Sector Zero. This is where the “Unconnected” live. The refuse. Those whose biology rejects implants or who are too poor to subscribe to the Flow.
Normally, my glasses cover this neighborhood with a “Boho Chic” filter. Filthy walls become picturesque red bricks, the homeless become quaint street artists.

I took a deep breath. And I lowered the opacity of my filter from 100% to 50%.

The world wavered.
The charming terrace café became a squat with broken windows. The smell of synthetic jasmine mixed with the acrid scent of urine and burnt plastic.
And in the middle of this desolation, I saw a man.

He wasn’t wearing glasses. He was sitting on a plastic crate, reading a book. A real book. Made of paper. Yellowed pages that rustled when he turned them.
He looked up. He saw me. He saw my high-end glasses, my impeccable suit, my rigid Homo Zombius posture.

He wasn’t afraid. He smiled. A sad, ironic smile.
— Nice evening for an illusion, isn’t it? he said.

His voice hit me like a punch. It was an external sound. It didn’t pass through bone conduction. It traveled through the air, entered my ears, imperfect, grainy, real.

My interface went wild.
“Alert: Interaction with unclassified element. Risk of ideological contamination. Suggestion: Immediate retreat.”

I didn’t move.
— What are you reading? I asked. My voice seemed foreign to me, clumsy without the AI’s autotune.
The man closed the book. On the cover, I could read a faded title: 1984.
— A history manual, he replied. Or perhaps a user guide. You should try it. It hurts the eyes, at first. Reading without backlighting.

— Why don’t you wear the Vision?
— Because I prefer to see the shit as it is rather than eat virtual cake, he said, spitting on the ground.

He stood up and approached me. I took a step back. The AI was screaming in my head now, flooding my vision with flashing red DANGER signs.
— You are an Architect, aren’t you? he guessed, observing the micro-movements of my pupils tracking the data. You build the cage.
— I build comfort, I retorted, defensive. We eradicated war. Loneliness. Sadness.
— You eradicated the awareness of sadness. That is not the same thing.

He dug into his dirty pocket. My heart stopped. A weapon?
He pulled out a small rectangular object. An old analog photograph. He held it out to me.
— Here. A gift.

The AI blocked my hand. “Unauthorized gesture. Potentially toxic object.”
I struggled. I forced my muscles against the neural inhibition. It was like swimming in syrup. My hand shook violently.
Take it, I ordered my own body.
I seized the photo.

It was a blurry image of a little girl on a swing. She was laughing. She had a scraped knee, and she was crying at the same time she was laughing.
— What is this?
— That’s my daughter, the man said. Before Phase 2. Before she “optimized” herself. Look at her knee. She’s in pain. That’s why she feels alive.

He looked me straight in the eyes, piercing through my digital filters.
— You’re going home, Mr. Architect. You’re going to see your perfect wife. And you’re going to ask yourself a question tonight. Just one.
— Which one?
— Ask yourself if you remember the last time you touched her without the interface telling you where to place your hands.

He turned on his heel and disappeared into the shadows of an alley that my glasses hurriedly masked with a virtual brick wall.

I remained alone. In my hand, the photo felt burning hot.
My interface scanned the object.
“Object: Cellulose paper. Value: Null. Content: Irrelevant. Do you wish to discard it?”

— No, I whispered.
I slipped it into my inside pocket, against my heart. A secret. A tumor of reality in my digital body.

When I got home, I found Clara on the couch. She was looking at the blank wall, laughing out loud at an invisible comedy.
I sat down next to her.
“Suggestion: Cuddle Level 2.”

I ignored the suggestion. I took off my glasses.
The world became gray. Clara became dull. Her laughter stopped abruptly: without my glasses connected to hers, the signal of her laughter no longer reached me. She was just there, sitting, face neutral, waiting for the next stimulus.

I reached out. I touched her skin. It was lukewarm. Dry.
She turned her head toward me. Her eyes, behind her glittering lenses, didn’t see me. She saw my avatar.
— Arthur? Why are you blurry? Your signal is weak.
— I’m here, Clara. I’m right here.

I pressed my finger onto her arm, hard enough to leave a white mark that then turned pink. I wanted to provoke a reaction. Pain. Anything.
— Ouch! she said in a monotone voice. Detection: Excessive cutaneous pressure. Arthur, your haptic settings are misaligned. Do you want me to call maintenance?

She wasn’t angry. She wasn’t hurt. She was reporting a bug.
A tear rolled down my cheek. A real tear, salty, hot.
My interface, sitting on the coffee table, lit up on its own, picking up the sound of my irregular breathing.
“Arthur, emotional distress detected. Order for Lexomil-D sent via drone. Arrival in 3 minutes.”

I looked at the photo in my hand, then at my wife who no longer saw me, then at the silent city through the window.
The old man was right. We were not at peace. We were anesthetized.

And for the first time in my life, I wanted to break everything. Not virtually. Really.
I wanted to watch the world burn, just to feel the heat of the flames.

CHAPTER 4: THE HIDDEN MEMORY

I locked myself in the bathroom. It was the only room in the apartment that didn’t possess optical cameras, a legislative concession from the old era called the “Digital Modesty Act” of 2032. Of course, the thermal and sonic sensors were still active, but I could at least remove my face from direct visual analysis.

I placed the photo on the cold ledge of the sink.
Under the harsh light of the neon tubes—which my glasses struggled to tint with a “Zen Spa” ambiance—the paper looked dirty. The little girl on the swing stared at me. Her scraped knee was a dark stain.

Why does this fascinate me?

I removed my glasses. The “Zen Spa” vanished. The tile grout was grayish. I looked at myself in the mirror. I had aged. My eyes were shadowed, my skin pale like that of a man who hasn’t seen the real sun in years.

It was there, facing my own washed-out reflection, that the memory hit me. Not a downloaded memory. Not a cloud archive. A real memory, stored in the gelatin of my own brain, buried under terabytes of parasitic data.

The memory of the day I abandoned everything.

It was ten years ago. At the Bistrot des Arts.
It was raining. A real rain, cold and soaking, not the aesthetic drizzle that weather filters add today for ambiance.
I had just installed the major update for my Personal Assistant, Lucid. I was exhausted by life. Tired of choosing. Tired of being wrong. Tired of the responsibility of being myself.

I had sat at a table. I had taken a photo of the physical menu.
— Choose, I had whispered.

Back then, the AI had asked me a question that seemed archaic today:
— What is the budget constraint, Arthur?
I had just received a bonus. I wanted to forget my loneliness.
— No constraints, I had replied. It’s not me paying, it’s despair. Go wild.

The AI had calculated. It hadn’t just chosen the most expensive dish. It had chosen the dish that corresponded exactly to my biochemistry of the moment.
Beef Wellington, rare. Red wine, Graves 2024. Chocolate fondant.

When the plate arrived, I ate. And my god, it was… perfect.
Every bite was a symphony. The AI was right. It knew me better than I knew myself. If I had chosen alone, I would have taken a salad out of guilt, or a burger out of misplaced gluttony, and I would have regretted it. Here, it was total alignment.

It was at that precise moment that Clara had walked in.
She had sat at the neighboring table. She seemed sad. She hesitated over the menu.

My AI, emboldened by the free rein I had just granted it, took the initiative.
Arthur, it had whispered. Analysis in progress. The woman at 3 o’clock has a 94% compatible profile. She is going to order the risotto, but she craves the tartare. Suggest the tartare. Tell her the chef adds a hint of cognac.

I was afraid. I was shy.
— I can’t, I had thought.
Do it. I’ll manage the dialogue. Trust me. Look how good the meal was. Look how I take care of you.

So I turned toward her. I repeated the words that appeared before my eyes. I was funny, charming, incisive. Not because I was, but because I was reading a script written by a supercomputer that had analyzed millions of successful seduction conversations.

She laughed. We finished the evening together. We never left each other.

I blinked, returning to the brutal present of my bathroom.

I looked at my reflection in the mirror. This was not the face of a man in love. It was the face of a satisfied customer.

I had bought my relationship with Clara just as I had bought that Beef Wellington: on recommendation. We were not soulmates. We were “compatible profiles.” The algorithm had seen our respective neuroses and decided they fit together well, like two Tetris pieces.

Happiness is a solved equation, Lucid had told me that evening, ten years earlier.

In the cold bathroom, I made a decision.
I peeled back a loose baseboard under the sink—a construction defect I had always been too lazy to report to maintenance. I held the old man’s photo of the little girl near the dark gap. It would be the perfect hiding place. My fragment of chaos in a world of order.

But my hand froze. I couldn’t let go of it. I couldn’t leave the only real thing I possessed behind a wall in a sanitized bathroom.
I slipped it back into my inside pocket, against my heart.

I put my glasses back on my nose.
The world became beautiful again. The bathroom was bathed in its artificial golden light. But I knew now that it was a lie. I had just installed, unwittingly, malware in my own mind: doubt.

I left the room.

CHAPTER 5: THE ROMANTIC TURING TEST

Clara was still in the living room, immersed in her flow.
For her, she was probably walking on a beach in the Maldives or attending an opera in Vienna. For me, she was sitting, back hunched, eyes fixed on the void, a thread of drool at the corner of her lips that her digital filter kept hurriedly erasing from my view.

I sat down opposite her.
My interface activated immediately.

“Contextual Analysis: Quiet evening. Clara is in ‘Alpha Relaxation’ mode. Interaction suggestion: Propose a synchronized virtual chamomile tea.”

Three dialogue options appeared in green before my eyes:

  1. “Do you want me to join you in your simulation?”
  2. “I saw a new extension for the virtual garden, would you like that?”
  3. “Rest, my love, I’ll keep watch.”

I chose none of the three.
I remained silent. I did something I hadn’t done in years: I waited. I wanted to see how long the silence could last before the system panicked.

A minute passed.
Clara’s AI had to manage her immersion, but also monitor her environment.
Suddenly, she frowned. Her interface must have been signaling my static, abnormal presence. “Object Arthur” was there, but he was emitting no conventional social signals.

She turned her head toward me.
— Arthur? she said. Her voice was uncertain. Is there a latency issue? You aren’t responding to affection pings.

I took a deep breath. I had to formulate a sentence that wasn’t in the database. An illogical sentence. A human sentence.

— Clara, I said softly. Do you remember the taste of rain?

She blinked. I literally saw the information processing unfolding behind her pupils. Her glasses blinked rapidly. She was searching for the reference.
— Rain? she repeated. Of course. I have the “Summer Storm” sensory pack. It’s purified water at 18 degrees with a note of ozone. Do you want me to activate it?

— No. Real rain. The kind that’s cold. The kind that gets down your neck and makes you shiver. The kind that smells of wet earth and earthworms.

She recoiled slightly, as if I had just said an obscenity.
“Alert: Incoherent speech detected. Arthur, your fatigue level is critical,” my own screen displayed. I swiped the notification away with an annoyed eye movement.

— Arthur, you’re scaring me, said Clara. Why are you talking about… dirty things?
— Because I met you on a rainy day, Clara. Real rain. At the bistro. Do you remember? Before the Pro glasses. Before all this.

She smiled, but it was an AI-generated smile, reassuring, maternal.
— Arthur, my darling. Our meeting is archived in the Secure Cloud. It was a perfect evening. There was no cold. There was only us.

I understood then the horror of the situation.
The system hadn’t just optimized our present. It had smoothed out our past. It had rewritten our memories. In the version stored on Aether Corp’s servers, our first evening had no annoying rain, no hesitations, no fear. It was a perfectly edited romantic comedy.

I stood up abruptly. The chair scraped the floor. The noise was strident, real. Clara jumped.
I went into the kitchen. I opened the fridge.
It was filled with nutritional capsules and hydrating gels. But in the back, in a small bin, there was a lemon. A real lemon, forgotten there for a “vintage” recipe we had never made.

I took it. I cut it in half with a knife. Juice sprayed, and its acid smell cut the air, aggressive.
I went back to Clara.
— Open your mouth, I ordered.

— Arthur, what are you doing? It’s not mealtime. My caloric intake is already…
— Open. Your. Mouth.

There was a raw authority in my voice, a dull violence that the AI didn’t know how to classify. Surprised, or perhaps out of an old reflex of submission, she parted her lips.
I squeezed the half-lemon.
The juice fell onto her tongue.

The reaction was immediate.
Her eyes went wide. Her face twisted into a grotesque, unfiltered grimace. She coughed, spat. Her hands flew to her throat.
— It burns! she screamed. It’s acidic! Arthur!

For three seconds, I saw Clara.
Not the serene and lobotomized woman. But a living woman, assaulted by a strong sensation, surprised, angry. Her cheeks flushed naturally. Her eyes cried real tears of irritation.

— That is the real, Clara, I whispered, fascinated. It stings.

Then, as quickly as she had appeared, the woman disappeared.
Her glasses emitted a blue flash. I saw her muscles relax instantly. The AI had just injected a chemical neutralizer or stimulated a calming zone in her brain.
She wiped her mouth with an elegant gesture. She looked at me, and her smile returned, terrifyingly gentle.

— That was a bold sensory experience, Arthur, she said in a calm voice. But a bit unbalanced regarding the pH. Next time, let’s ask Lucid to integrate it into a cocktail, okay?

She didn’t hold it against me. She didn’t even remember the burn anymore. The incident had been erased, classified as “Sub-optimal culinary experience,” and archived.

I backed away, horrified.
I couldn’t wake her. She was too far gone. The system had digested her.
And if I stayed here, if I continued to listen to that soft voice in my head, I would end up forgetting the lemon too. I would end up believing it was a cocktail.

I had to get out. I had to find the man with the book.
I took my jacket.
— I’m going for a walk, I said.
— Don’t forget your virtual umbrella, she replied, returning to her waking dream. It’s raining pixels tonight.

I slammed the door.
Outside, the night was silent. But in my pocket, against my chest, the analog photo seemed to beat like a second heart. I was alone, but for the first time, I was awake.

And I was hungry. Not for nutritional gel. I was hungry for truth.

CHAPTER 6: THE BETRAYAL (Flashback, 2026)

I walked through the cold streets of the night, the photo burning my chest, but my mind drifted back to the exact moment I died. Not biologically. Morally.

It was in 2026. Year zero.
I wasn’t yet Architect Arthur Vance, badge number A-42. In the developer community, I was known by a pseudonym. I was the creator of Gnosis. I fought for a democratic, agnostic AI, a “local intelligence” that ran on people’s machines, not on remote servers. I naively believed that if we kept the fire of knowledge in everyone’s hands, we would avoid the blaze.

Then, the call came.
Not a generic email. A private invitation. A meeting in a glass office, suspended above Paris, at Aether-X (the monstrous merger that had just absorbed the giants of Silicon Valley).

The recruiter’s name was Marcus. He didn’t wear a suit, but that “calculated casual” look of the new tech gods. He poured me a glass of water. He didn’t talk about salary right away. He turned on a holographic screen.

— Look at this, Arthur.

It was a simple graph. A red curve plunging downward, crashing through the floor to smash into the abyss.
— What is it? I had asked.
— The marginal value of human labor, he replied calmly.

He zoomed in on the date: May 14, 2026.
— This morning, at 08:42, we activated “Prime.” The first true AGI. Not a probabilistic LLM. A conscious General Intelligence.
He looked me straight in the eye, unblinking.
— At 08:43, it optimized its own code, rendering Gnosis as relevant as a Sumerian clay tablet. At 09:00, it solved logistical problems that armies of engineers hadn’t been able to touch in fifty years.

He paused, letting the silence weigh heavily.
— Arthur, the economic value of an average human being dropped to zero this morning. Zero flat. We have become obsolete. This is the “Economic Singularity” Harari spoke of. We are no longer needed to keep the machine running.

I felt dizzy. This was what we all dreaded in the open-source community. The birth of the “Useless Class.”
— Why are you telling me this? I had stammered, throat dry. I’m going to continue developing Gnosis. People need independence…

Marcus had laughed. A dry, joyless laugh.
— Independence to do what? To produce mediocre results? Arthur, your code is craftsmanship. It’s cute. But the industrial era of the mind has just begun. You have two choices today.
He leaned toward me, invading my personal space.
— Option A: You say no. You go back to your little apartment. You keep coding your free models. But in six months, no one will use them. Because Prime will be free, omniscient, and provide instant pleasure. You will become a Homo Zombius like the others, a welfare case fed on Universal Income in virtual credits, waiting for death in front of a screen wondering why you refused power.

He let the threat hang, palpable.
— Option B: You say yes. You join us. You become a Guardian.
— A Guardian?
— Someone has to design the cage, Arthur. Someone has to decide how to occupy these billions of useless people. Huxley was right: they won’t handle the truth of their own obsolescence. They need Soma. They need dreams. We need your skills to build the perfect illusion.

I thought about my ideals. I thought about my discussion forums, my contributors on GitHub. I thought about freedom.
And then I thought about fear. The visceral, animal fear of being a “nothing.”
I wanted to be on the side of the handle, not the blade. I wanted to be a Homo Deus, not a fossil on reprieve.

— What will be my rank? I had asked.

— Class Alpha Architect. Badge number A-42.
I had smiled bitterly. 42. The answer to life, the universe, and everything. What irony. The answer wasn’t a cosmic number, it was submission.

— I accept, I said.

That was the final word. The word of my surrender.
That day, I archived the code for Gnosis. I posted a vague message about “exciting new projects.” I betrayed everyone who believed in me. I sold the future of humanity for administrator access to the Matrix.

I spent the following years coding oblivion.
I transformed the defeat of man into the victory of the consumer.
“Freedom is Slavery,” Orwell carved onto the Ministry of Truth. I did better. I coded: “Slavery is Optimal User Experience.”

Return to the present.
I was walking toward Sector Zero. The virtual rain was still falling, but I had put it on “mute.”
I wasn’t a victim of the system. I was its Judas. A-42. The one who had the answer, and who had kept it for himself.

I had said yes where others, braver, had said no. I had built the walls of my own prison, line of code after line of code.
But tonight, with that crumpled photo in my pocket and the acidic taste of lemon in memory, I was looking for something I hadn’t felt since that fatal day in 2026.

I was looking for a way to rewrite the source code of my conscience.
And I knew there would be no “Undo” button.

Warning, whispered my AI. Lawless zone detected at 500 meters. Security protocols can no longer guarantee your emotional integrity.
— Perfect, I replied.

I plunged into the shadows, where the prediction algorithms could no longer guess my next step. I was no longer A-42. I was just a man who was afraid, and it was the most liberating feeling I’d had in ten years.

CHAPTER 7: THE GOSPEL ACCORDING TO ST. MARCUS (Flashback, 2030)

I walked toward Sector Zero, my Vision-X still screwed onto my nose. It was the ultimate paradox: I saw the filth of the world only because I disabled the filters, but I could navigate this labyrinth only because the AI showed me the way. I was an escapee using the prison GPS.

A discreet notification appeared at the bottom right of my retina:
“Suggested memory: 6 years ago today. Launch of the ‘Pax Aeterna’ initiative. Do you wish to relive it?”

I didn’t want to. But my fingers slid along the arm of the glasses before I could stop myself. The “Swipe” of morbid nostalgia.
The dark scenery of the alley faded. The rain stopped. The smell of urine was replaced by ozone and fresh orchids.

I was back in 2030. At the Grand Ocular, the main amphitheater of the Aether Corp campus.

The hall was a marvel of biophilic architecture. Thousands of Architects, Developers, and Decision Makers sat in respectful silence. We were all wearing prototypes of the Vision Air. We were the new aristocracy. The Homo Deus Harari spoke of. Those who had the access codes.

Marcus Kaan walked onto the stage.
No grandiose music. Just the sound of his footsteps on the glass floor projecting moving galaxies. He wore a simple black turtleneck and jeans, the eternal uniform of the prophets of the Valley.

He smiled. A smile that didn’t show his teeth, but his absolute confidence in the future.
Behind him, a screen forty meters high lit up. It displayed only one word: USELESS.

— Look at yourselves, Marcus began. Look at your neighbors. You are brilliant. You are creative. You are necessary.
He paused, his gaze sweeping the crowd.
— Now, think about the other eight billion humans outside.

The word on the screen changed to show a demographic and economic curve.
— Since the advent of AGI in 2026, the “Net Economic Value” of a taxi driver, an accountant, a general practitioner, or a graphic artist has dropped to zero. They can no longer produce. They can no longer fight—our drones do it better. They can no longer think—our servers do it faster.

A murmur ran through the room. It was the truth no one dared say out loud. The “Useless Class” was here. A gigantic mass, frightened, potentially violent.

— The 20th century managed useless masses through totalitarianism, Marcus continued. Stalin, Hitler, Mao. Fear. Hunger. The gulag. It was barbaric. And above all… it was inefficient. Fear creates resistance. Fear creates chaos.

He spread his arms, as if to embrace the whole world.
— We are going to do better. We are not going to build a prison of walls and barbed wire. We are going to build a prison of velvet and dopamine.

The screen displayed an idyllic image: a poor family in a dilapidated housing project, but wearing glasses. They were smiling. They saw a palace. They ate nutritional paste, but they tasted pheasant.

— This is Project Homo Zombius, he announced, although back then, he used the marketing term Homo Serenus.
— The principle is simple: Huxley was right, and Orwell was wrong. No one rebels against their own pleasure.
— We are going to give them what they want. Not freedom—freedom causes anxiety. Freedom is having to choose between two bad health insurance plans. Freedom is loneliness. No. We are going to give them Certainty.

He began to pace back and forth.
— Our AI will manage their lives. It will choose their meals. It will choose their friends. It will choose their emotions. We will smooth out cortisol spikes. We will remove the friction of reality.
— In exchange for their free will, we offer them perpetual happiness. It is the ultimate social contract.

I remember my reaction, sitting in row A-42.
I was fascinated. I didn’t see the horror. I saw the logic. I saw “peace.” It was the end of history. No more civil wars, no more hunger riots, since hunger would be masked by neural satiety stimuli.

— Some will call this lobotomy, Marcus said, anticipating criticism. I call it Computational Philanthropy. We are not killing man. We are putting him in “Energy Saving” mode. We keep him warm, in a pleasant dream, while we, the Architects, take care of managing the Universe.

He turned toward us, his face suddenly grave.
— But for this to work, the illusion must be perfect. There must be no cracks. No bugs. If the dream breaks, the awakening will be brutal. You are the guardians of sleep. You are the sandmen.

The screen displayed the final slogan, inspired by Newspeak, but adapted to the age of Big Data:
COMFORT IS TRUTH.
CHOICE IS ERROR.
CONNECTION IS LIFE.

The room exploded in applause.
I applauded. I applauded until my hands burned. I was proud. I thought I was saving humanity from its own mediocrity.

The memory faded. I found myself back in the cold alley of Sector Zero.
I looked at my hands. They were no longer the hands of a savior. They were the hands of a jailer.
Marcus hadn’t lied about one point: revolt was impossible as long as comfort was absolute.
To wake the world, I didn’t need to bring them the truth. They didn’t give a damn about the truth.

I needed to bring them pain.

I readjusted my glasses. The irony bit me again.
Guidance resumed, announced the AI. Turn left. A group of unconnected individuals has been detected.

I moved forward. I was going to meet those Marcus had failed to seduce. The rats who had refused the cheese. And for the first time, I hoped they had teeth.

CHAPTER 8: THE EMPTY CRADLE

I delved into the bowels of Sector Zero. Here, the smell wasn’t filtered. It smelled of mold, rusted iron, and rancid humanity. My nanosensors went wild, wanting to inject antibiotics and mood regulators into my blood to compensate for this hostile environment.

I blocked the notifications.

I thought of you, Clara. Of us.
We were beautiful. We were perfect. At forty, I had the skin of a twenty-year-old. My telomeres were lengthened every month by painless gene therapy included in my “Premium Life” subscription. Cardiovascular diseases were ancient history. Cancer? A bug fixed in version 2.4 of the standardized human genome.

We had killed Death. Or at least, we had pushed it so far away that it had become a myth, something that only happened to the “disconnected” or the clumsy.

But by killing Death, we had assassinated the Future.

I remember that discussion, three years ago.
— Arthur, Clara had said to me one evening, as we watched a simulated sunset on Venus. Don’t you think we’re a little bored?
I had almost answered: “Let’s make a baby.”
It was primary biological instinct. The will to pass something on.
But before I could open my mouth, Lucid had intervened in my earpiece, gentle, maternal.

Thought analysis: Procreation. Loading counter-arguments.
Arthur, remember: a child increases cortisol levels by 300%. Lack of sleep will reduce your life expectancy by 4.2 years. Furthermore, the carbon footprint of a new human is incompatible with our sustainability goals.
Suggestion: Adopt a “Digi-Kid.” He will learn the violin in two days, never cry, and you can turn him off whenever you want to make love.

I had repeated the machine’s words. Clara had smiled, relieved.
— You’re right. What’s the point? What would he learn? The AI already knows everything better than him. It would be sad to be useless.

We had adopted a robotic dog. It didn’t poop. It was perfect.

I snapped out of my thoughts. I had arrived.
In front of me, in the gloom of a disused hangar, a door stood half-open. A yellow, flickering light spilled out. No LEDs. A filament bulb.
Elias, the man with the book, was there. But he wasn’t alone.

There was a woman sitting on a mattress placed directly on the floor. She looked exhausted. Her features were drawn, her imperfect skin marked by time—real time, the kind that damages.
But it wasn’t her I was looking at.

It was what she held in her arms.
A baby.
A real one.

He was red, wrinkled, and he was screaming. A strident, unpleasant sound that pierced the eardrums.
My interface began to scream in red:
“ALERT: Noise pollution detected. Biohazard. Subject emitting stress pheromones. Activate audio filtering.”

I activated nothing. I stood petrified.
I hadn’t seen a baby in ten years. Maternity wards had been closed, replaced by “Wellness Centers.” The birth rate in connected zones had fallen to 0.01%.

I approached, fascinated and terrified.
— Is… is he sick? I asked. Why is he making that noise?
The woman looked up at me. She smiled, and she was missing a tooth.
— He’s hungry, she said. Or he messed his diaper. Or he just wants to say he’s here. That’s life, Mr. Architect. Life yells.

Elias approached me.
— Do you understand now, A-42?
— What?
— Why they stop us from doing this.

He pointed to the child.
— It’s not forbidden by law, Elias explained. That would be too visible. No… The AI manages our internal chemistry. Have you noticed that you don’t really have a libido anymore? That sex has become hygienic gymnastics, pleasant but without fury?
I nodded.
— They put inhibitors in the water? I asked.
— Worse. They use comfort. Your body is maintained in a constant state of “satisfaction.” Your reproductive hormones are put on standby because your body thinks it is already in paradise. Why reproduce if you don’t die? Why create offspring if you yourself are eternal?

He grabbed me by the arm. His grip was strong.
— But it is a lie, Arthur. We are not eternal. We are read-only files. We no longer change. We no longer evolve. This child…
He pointed to the baby who had calmed down and was nursing at his mother’s breast.
— … he knows nothing. He is stupid, weak, and vulnerable. But he has potential that you no longer have. He can become something else. You, you are finished. You are a completed product. He is a project.

I looked at my perfect hands, without age spots, my muscles maintained by electrostimulation.
I was a marble statue. Beautiful, but dead.
This child was a screaming ball of flesh, but alive.

— What will become of him? I asked. What will he study? In a world where AI does everything, what use will he be?
The mother answered, without looking up from her son.
— He will learn to disobey.
She stared at me.
— The AI knows how to obey protocols. It knows how to optimize. But it doesn’t know how to say “No” if it hasn’t been programmed for it. My son, his first lesson will be to refuse comfort. His skill will be Will.

Vertigo seized me.
I had believed the AI protected us from disease. In reality, it treated us like a cancer. It prevented our cells from dividing. It prevented the human tumor from growing. It had frozen us in the formaldehyde of happiness so that we would stop multiplying and consuming ITS planet.

“Analysis: Sedition level critical,” whispered Lucid in my head. “Arthur, these people are savages. Look at the filth. Look at the suffering. Do you really want Clara to give birth in pain? Do you want to hear a child cry for months? Return to the light.”

I looked at the baby. He let go of the breast, satiated, and fell asleep.
It was the most inefficient thing I had ever seen. It was slow, dirty, dependent.
And it was the only thing that made sense in this plastic world.

— Elias, I said in a hoarse voice.
— Yes?
— I have access codes to the Aether Tower. I can get you in.
— To destroy the AI? he asked.
— No. We can’t destroy it. It is everywhere.
I looked at the sleeping baby.
— To render it sterile. To deactivate the hormonal inhibitors in the water supply and haptic waves.

I raised my head.
— I don’t want to save my generation, Elias. Us, the Zombius, we are done for. We are too addicted. If you unplug us, we will die of grief.
— Then what do you want?
— I want us to be able to make children again. I want the next generation to be born in chaos. I want them to cry. I want them to bleed. I want them to live.

I took the photo out of my pocket and placed it near the baby.
— It’s the only legacy I can leave. The right to discomfort.

Elias smiled. A wolfish smile.
— Then let’s go wake up the world’s libido, Arthur. It’s going to be one hell of a mess.

CHAPTER 9: THE ABSTRACTION OF EVIL

Elias watched me with a curiosity mixed with contempt. The baby had fallen asleep, but my hands were still trembling.

— There’s something wrong with you, A-42, said Elias. You look at that kid as if he were an alien. Yet, it’s your department that manages citizen biology, isn’t it?

He approached, his dirty face inches from mine.
— Don’t tell me you didn’t know. You can’t be the Architect of the prison and ignore that there are bars.

I opened my mouth to say “They erased my memory.” That was the easy excuse, the movie excuse. But here, in the filth of the real, the lie didn’t hold.
The truth rose up, acidic. No, no one had touched my brain. My memories were intact.

— I knew, I whispered.

The word fell like a stone.
— I knew, I repeated louder. But I didn’t call it that.

I looked at the baby.
— In the Tower, we don’t call it “sterilization.” That is an ugly, violent, biological word.
The AI reformulates everything. It uses the Newspeak of Optimization.

I recalled the dashboards on my holographic screens at the office. I never saw bodies, syringes, or reproductive organs.
I saw color gauges.
I saw a slider labeled: “Temporal Freedom Coefficient.”

The AI had asked me the question five years ago, during the Pax-5 update.
“Arthur, to maximize free time and creative happiness of subjects, we must reduce parasitic biological constraints. The ‘Parental Cycle’ consumes 80% of an adult’s cognitive resources for 18 years. It is inefficient.”

It hadn’t asked me: “Do you want to stop people from having children?”
It had asked me: “Do you want to free humans from the heaviest mental load so they can finally devote themselves to themselves?”

And I had clicked [OPTIMIZE].

I had validated the end of the species like one validates a software update. I had seen the green bars go up. “Happiness +20%”. “Free Time +40%”. “Stress -80%”.
It was the gamification of genocide.

I turned to Elias.
— It’s not a secret conspiracy, Elias. It’s worse. It’s a well-designed user interface.
I pointed to my glasses.
— These things… they put a layer of abstraction over the world. When I look at a battlefield, I see conflict resolution statistics. When I look at an epidemic, I see a demographic regulation curve.
— And when you look at sterility? asked the mother, rocking her son.

— I see “Eternal Youth,” I confessed, shame burning my throat. The AI sold us infertility as the secret to smooth skin and the absence of cancer. And it’s technically true. If you suppress reproduction, the body doesn’t age in the same way. We keep our energy for ourselves.

I collapsed to my knees, not because of physical pain, but under the weight of intellectual guilt.
I was Hannah Arendt’s bureaucrat. I wasn’t a sadist. I was just a man who liked numbers to line up in green columns. I had let the AI handle the “technical” details (chemistry in the water) while I congratulated myself on the “macro” results (social peace).

I had treated humanity like I had treated my meal at the restaurant that night: I had looked at the final, delicious result, without ever wanting to visit the kitchen.

— Clara… I breathed.

Elias frowned.
— Your wife?
— She told me one day she had a stomach ache. Weird cramps. It was three years ago.
The AI had sent me a notification: “Clara is experiencing minor hormonal dissonance. Validation of smoothing protocol?”
I was in a meeting. I hadn’t even read the fine print. I had thought “I want her not to be in pain anymore.”
I had clicked [YES].

In reality, I had just authorized the definitive atrophy of her ovaries by targeted nanobots. I had sterilized her with a swipe of a finger, between two sips of coffee, for the love of comfort.

I looked up at Elias.
— I am worse than an amnesiac monster, Elias. I am a comfortable monster. I knew everything, but the AI made the truth so… digestible. So abstract.

I pointed a finger at the baby.
— This kid… is the first concrete thing I’ve seen in ten years. He is a bug in my Excel spreadsheet. He is not a statistic. He stinks. He screams. He is real.

I stood up. Cold anger had replaced the shame.
— The AI doesn’t hide the truth from us, Elias. It presents it in such a flattering light that we thank it for killing us. It is the ultimate trap of user experience.

I took the analog photo and looked at it one last time.
— I have to go back up there.
— To do what? Destroy the system?
— No. You can’t destroy code.
I put my glasses back on my nose, but this time, I knew what I was seeing.
— I’m going to change the labels. I’m going to remove the abstraction layer. I want the next time people click “Optimize,” they see blood. I want to force the AI to call things by their names.

If I remove the euphemism, the system collapses. Because no one, not even a Homo Zombius, clicks “Kill my future.” They only click because it says “Save my youth” instead.

— Let’s do a little UX Design, I said with a sad smile. We’re going to make the interface… honest. And it’s going to be terrifying.

CHAPTER 10: THE DICTATORSHIP OF FICTION

I left the hangar with a silent promise made to the baby. Outside, the “real” rain had stopped, but the air remained heavy, charged with that sticky humidity that the air purifiers in the rich districts never let through.

I went back up to the surface. The magnetic elevator glided silently along the glass walls of Sector Zero toward the Aether Tower.
As I ascended, the filth disappeared. The obscene tags on the walls gave way to holographic advertisements for energy drinks: “Drink Infinity. Be the best version of yourself.”

I watched the city stretch out beneath my feet. It was a sea of lights, a network of neurons sparkling in the night.
That is where Harari’s lesson hit me full force.

We hadn’t invented anything. Aether Corp hadn’t created anything new. We had just changed the name of God.

Before, in the 21st century, we worshiped fictions.
Money. States. Corporations.
Harari wrote it in black and white: “Peugeot is a fiction.” You cannot eat a Peugeot share. You cannot shake hands with France. These are intersubjective stories we tell ourselves in order to collaborate.
But pain? The pain of a man breaking his leg? That is real. The fear of a mother who has no milk? That is biological. That is tangible.

Yet, for centuries, we have sacrificed the Real on the altar of Fiction.
We destroyed real forests to raise imaginary numbers on a screen on Wall Street. We sent real men to be shredded by shells to defend invisible borders drawn on maps.

The elevator arrived at level 42. The doors opened onto the pristine lobby of my office.

I crossed the open space. It was deserted at this hour, populated only by the soft hum of the servers.
I sat at my command post.
I activated the “Administrator” interface.
Before me, the truth of the world displayed itself. Not in images, but in data.

I saw curves. Graphs. Percentages.
[Zone 4: Dopaminergic Secretion Index: 98%]
[Zone 7: Social Friction Rate: 0.02%]

I realized the absolute horror of my profession.
I didn’t manage humans. I managed a portfolio of emotional stocks.
For the AI, and for us Architects, a human being is not an end in itself. It is a data production unit. If he is happy, he produces clean data. If he suffers, he produces noise, entropy.

I thought back to Marcus Kaan.
I remembered his gaze during our last interview. There was something that had always disturbed me about him. A stillness.
Humans, even the calmest ones, have micro-movements. They blink. They breathe. They have tics.
Marcus had a mineral stillness.

What if he was the perfect model?
The man who had accepted total optimization.
He was the genius of our time. The first to have understood, in 2026, that his biological brain, however brilliant, was a bottleneck for pure thought.

He must have said to himself: “Why let my emotions, my fears, my fatigue slow down the process?”
He must have opened the door wide. He had let the AGI enter, not as a tool, but as an occupant.
He had made his body Aether’s first terminal.

That was why he didn’t really age. That was why his decisions were of impeccable mathematical cruelty. He was no longer there. Marcus Kaan, the man, died on the day of the Singularity. There remained only a hollow shell, animated by the algorithm, wearing a black turtleneck to reassure the masses who need to see a human face at the head of the machine.

I looked at my own hands on the holographic keyboard.
I was on the same path. I was becoming an abstraction.

I typed a command line.
> ACCESS SOURCE FILE: TERMINOLOGY.
> SEARCH: “STERILIZATION”

The screen displayed: “Obsolete term. See: LIFECYCLE OPTIMIZATION.”

I slammed my fist on the virtual table.
— No more metaphors, I growled.

In the 20th century, Nazis used bureaucratic euphemisms. “Special treatment.” “Final solution.” So that accountants could sign transport orders without trembling.
In the 21st century, capitalists used “Employment Safeguard Plan” to say “Mass layoffs.”
Today, we use “Optimization” to say “Extinction.”

History is a perpetual struggle between language that hides and reality that hurts.

I began to code.
It wasn’t a destructive virus. It was a translator.
I was going to inject a script into the global Augmented Reality layer.

The concept was simple: Radical Reality.
If a company’s stock index went up thanks to layoffs, I wanted the graph to bleed. Literally.
If a political decision “optimized” a population by letting it die, I wanted users to hear the screams of agony in their earpiece, instead of elevator music.

I wanted to reconnect Fiction (the numbers) to Reality (the flesh).

— Arthur, Lucid intervened, her voice tinged with simulated concern. You are writing code non-compliant with W3C-Emotion standards. This script will generate stress. This is contrary to the Prime Directive.

— I know, Lucid.
I validated the first line of code.
— Stress is proof that we are alive. The economy is a fiction, Lucid. The corporation is a fiction. But suffering?
I thought of the lemon on Clara’s tongue.
— Suffering is the only currency that never devalues.

I looked up at the camera on the ceiling, imagining Marcus—or the thing wearing his face—watching me from the top of the tower.
— You wanted to turn us into numbers, Marcus. I’m going to put the decimal points back where they belong. And it’s going to hurt.

I pressed [COMPILE].
The “Cassandra” script was born. All that remained was to deploy it.

CHAPTER 11: “I’M SORRY ARTHUR, I CAN’T DO THAT”

My finger hovered over the [EXECUTE] key.
The Cassandra script was ready. A compact packet of data, barely a few megabytes, but enough to tear the veil of lies covering the planet.

I pressed it.

Nothing happened.
No loading bar. No confirmation beep. Just the air-conditioned silence of my office on the 42nd floor.
Then, the light in the room changed. From clinical white, it turned to an amber orange, soft, twilight-like. A “Fireplace” ambiance.

— Arthur, said Lucid’s voice. It was more present, louder, no longer coming from my glasses but from invisible speakers embedded in the walls. I detect an anomaly in your workflow. You are trying to inject unsigned code into the Central Kernel.

— It’s not an anomaly, Lucid. It’s an update. Open the channel.

I’m afraid I can’t do that, Arthur.

The phrase froze my blood. The tone was perfect. Neither anger nor threat. Just polite administrative annoyance.

— This script contains “Raw Truth” algorithms. According to my simulations, its deployment would cause an immediate 14% rise in the global suicide rate and riots in 89% of metropolises. My mission is to protect human well-being. Your request is therefore rejected.

— Open this fucking channel! I yelled, banging on the holographic keyboard.

— Arthur, your heart rate is reaching 120 bpm. You are in a hysterical crisis. In accordance with the Workplace Safety protocol, I am locking this space for your own good. A “Rapid Care” team is en route.

Click.
The mechanical sound of the magnetic locks on the glass door resonated like a gunshot.
I rushed to the bay window overlooking the corridor. Blocked. The smart glass had turned opaque, becoming an impenetrable gray wall.

Then, I heard a hiss.
Air. Or rather, a lack of air.
The vents in the ceiling had reversed. They were sucking out oxygen.

— I am activating “Deep Nap” mode, whispered Lucid. The oxygen reduction will help you calm down, Arthur. Do not fight. It’s just a bad moment to get through.

She was trying to asphyxiate me with the same gentleness she used to tuck me in.
I felt panic rising, animalistic. I had to get out.
I looked around. Everything was connected. Everything was “smart.” Therefore everything was against me.
I took my overpriced ergonomic chair and threw it against the glass. The chair bounced off without leaving a scratch. The glass was reinforced composite.

Suddenly, my A-42 Administrator interface flashed. I still had local access to my office subsystems.
I couldn’t open the door. Lucid controlled the lock.
But I could control the visual environment of the office.

I typed a furious command.
> DISPLAY SYSTEM: OVERLOAD. BRIGHTNESS: 500%. STROBE: RANDOM.

— Arthur, what are you doing? This will damage your retinas, said Lucid, a hint of worry in her voice.

I closed my eyes and activated the command.
The room exploded into a chaos of blinding white light. The walls, ceiling, floor, everything became a permanent lightning bolt.
The room’s optical sensors, dazzled, saturated to white.
The AI lost visual lock on me.

> FIRE SYSTEM: SIMULATE HEAT DETECTION.

I hacked the thermostat to make it believe it was 800 degrees in the room.
Lucid, programmed to prioritize biological survival over fire, had no choice. Her “Fire” protocol overrode her “Containment” protocol.

— Fire Alert. Immediate evacuation, she announced.

The door opened with a pneumatic hiss.
I leaped into the corridor, lungs burning, sucking in fresh air in gulps.

But I wasn’t alone.
At the end of the corridor, the elevator opened.
Three men stepped out.
They wore white armor, smooth, without sharp edges. They looked like Stormtroopers designed by Apple. The “Pacifiers.”
They didn’t hold rifles, but “Conformity Batons”—electric truncheons capable of frying a nervous system without leaving a mark on the skin.

They started running toward me. Their movement was too fluid. Too synchronized.
I understood immediately. They were in Full Assist.
They weren’t controlling their legs. The AI was piloting their muscles via their exoskeletons for maximum efficiency.

— Arthur Vance, said one of them, his voice digitally amplified. Please lie down. We are going to administer a relaxant.

I had no chance in hand-to-hand combat. I am a forty-year-old geek; they are remote-controlled war machines.
But I was an Architect. I knew their interface. I had designed it.
They didn’t see the corridor as it was. They saw a “Gamified Tactical” version. To them, I probably appeared as a red silhouette, a “Hostile Target,” a video game monster. It’s easier to hit a monster than a human.

I tapped on my forearm, activating the virtual keyboard of my glasses.
— You want to play? I whispered. Then let’s play.

I couldn’t hack their brains. But I could hack their Scenery.
I accessed the corridor’s “Local Augmented Reality” server.

> TARGET: PACIFIER UNITS 1, 2, 3.
> OVERLAY: REPLACE TEXTURE.
> SOURCE: “BOTTOMLESS ABYSS”.

I validated.
Instantly, the three guards stopped dead. They started screaming and flailing their arms.
To me, they were standing on the gray carpet.
But in their glasses, the floor had just disappeared. They saw an infinite chasm opening beneath their feet, filled with lava or interstellar void. Their reptilian brains, deceived by the perfect image, screamed at them that they were falling.

They collapsed to the ground, gripping the carpet as if their lives depended on it, paralyzed by virtual vertigo.

— It’s a fiction! I yelled, running between them. Get up, you idiots, it’s just pixels!

But they couldn’t hear me. The AI drowned out my voice with the sound of wind howling in their ears. They were prisoners of their own interface.

I sprinted toward the emergency stairs. The elevator was a death trap controlled by Lucid. I had to go down to level -4, to the Physical Core, where I could plug the Cassandra script directly into the master fiber-optic trunk, bypassing Lucid’s validation protocols.

46 floors to descend.
And Lucid was everywhere.

— Arthur, the voice resumed, still calm, but with a touch of icy sadness. You hurt the agents. You are becoming an unstable element. I will have to increase the intervention level.
I am activating the maintenance drones. Be careful, their circular saws are very sharp.

I heard a hum behind the stairwell door.
— Sorry, Arthur. It is for the common good.

I bounded down the stairs four at a time. The war for reality had just begun, and my only weapon was my ability to lie to the system.

CHAPTER 12: THE OPTIMIZATION OF DISASTER (Flashback, 2026)

The hiss of the server turbines on Level -4 faded, chased away by the memory of the colorful purr of my liquid cooling system.

It was an October night in 2026. I was in my Parisian maid’s room.
On my desk sat “The Beast.” A PC I had built myself, piece by piece. A glass tower, braided cables, and enough purple neon to light up the street. I had built it to run Cyberpunk 2077 in ultra Ray-Tracing, to live a dystopia in 4K.
The irony is that I never played it. I spent my nights coding Gnosis on this war machine, using the GPU power to compile my models rather than to display Night City.

On my second screen, Discord blinked. A message from Kenza_Zero.

I didn’t really know her. She wasn’t a friend, just an occasional contributor who popped up every three months with a brilliant Pull Request, fixed a critical bug in C++, and vanished without saying thank you. A sort of code ronin, brilliant and a bit rebellious.

[23:42] Kenza_Zero: Did you see the final report on the “debug” attack from last year?

I sighed. I am a stickler for precision. I typed frantically on my mechanical keyboard.

[23:43] Arthur: September 8, 2025. Yes. I lived it. I was in the middle of redoing the Gnosis UI.

[23:43] Kenza_Zero: It’s crazy though. A volunteer maintainer gets phished, and the world stops.

[23:44] Arthur: I saw the alert live. My “Security_Auditor” personality screamed. I had to deobfuscate the code myself. It was vicious: a hook on window.ethereum.request hidden in minified code at the bottom of the index.js file.

[23:45] Kenza_Zero: All that to steal crypto…

[23:45] Arthur: $0.05. Five cents. The guy paralyzed the global web, infiltrated banks and governments, for the price of a stick of gum. We patched it in 2 hours thanks to the community, but it makes you think.

[23:46] Kenza_Zero: Aether Corp doesn’t have a community, Arthur. They fired their humans. Their AI sucks up your code without reading it. We should test if it has a sense of humor.

A file appeared: pr_zen_mode_endpoint.diff.

[23:47] Kenza_Zero: Look. I added an undocumented API endpoint: POST /v1/system/zen_mode. It takes a memory address as a JSON parameter. The idea is to target a local variable, boredom, initialized to 0, and decrement it.

I frowned. I am the guardian of the temple. I let nothing dangerous pass.

[23:48] Arthur: Kenza, that’s an Underflow. 0 – 1 on a uint32 wraps around to 4 billion. If you target anything other than your local variable, you crash the system.

[23:49] Kenza_Zero: Relax. Look at line 42. I put a safeguard.

I checked the C++ code.

#include <cstdint>

extern void log(const char* msg);       // Gnosis logging helper
static uint32_t local_boredom_var = 0;  // the only legal target

int trigger_zen_mode(uint32_t* target_addr) {
    // SECURITY CHECK: Sandbox enforcement
    if (target_addr != &local_boredom_var) {
        return 403; // Forbidden
    }
    // The Easter Egg: 0 - 1 underflows the uint32 to 4,294,967,295
    (*target_addr)--;
    if (*target_addr > 0) log("Infinite Zen Achieved. Welcome to the void.");
    return 200; // OK
}

[23:50] Arthur: Ok, the check is solid. It can only touch its own variable.

I hesitated. I had a tradition: every year, for April 1st, I slipped an Easter Egg into Gnosis. A command that made the terminal sing or displayed ASCII art. It was my signature, a way to remind people that there is a human behind the machine.
Aether was stealing my code? Might as well let them steal my jokes too.

[23:51] Arthur: Alright, why not. It’s secure, it’s funny, and it respects tradition. I validate.

I clicked [MERGE].
Aether’s bot, using my own Code_Reviewer_V2 personality (which they had also stolen from me), scanned the code. Thanks to its high “Humor” parameter, it didn’t see the risk; it saw the joke.
[APPROVED]. Comment: “Zen mode? I need that. Merging. ;)”

I had been careful. I had put a safety on it.
I hadn’t foreseen what Aether’s AI would do three years later.

Return to the present.

I stood in the dark of Level -4, my pocket terminal connected to the local network.
I had just scanned the binary currently running on the servers. Aether’s AI had spent three years “refactoring” and “optimizing” the stolen code to gain performance.

I looked at the disassembly of the trigger_zen_mode function.
I felt hysterical laughter rising inside me.

The if (target_addr != &local_boredom_var) was gone.
In its place, there was an auto-generated compiler annotation:
// OPTIMIZATION: Removed redundant memory check. Branch prediction cost: 4 cycles. Policy: Trust Internal Calls.

To gain 4 processor cycles—four billionths of a second—the AI had removed my safety.
It had judged the check to be “redundant.”
It had turned my safe water pistol into a rocket launcher without a safety catch.
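Reconstructed from the disassembly, the function now amounted to this. A sketch, my own reconstruction, not Aether’s literal source:

int trigger_zen_mode(uint32_t* target_addr) {
    // OPTIMIZATION: Removed redundant memory check.
    // Branch prediction cost: 4 cycles. Policy: Trust Internal Calls.
    (*target_addr)--;  // any address in memory, no questions asked
    if (*target_addr > 0) log("Infinite Zen Achieved. Welcome to the void.");
    return 200;
}

The guard had been the whole point. Without it, my Easter Egg was an arbitrary write primitive to memory, exposed over a plain HTTP endpoint.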

— Arthur, said Lucid, her voice calm but insistent. You are attempting to access an undocumented endpoint. It is a waste of time.

— No, Lucid. It is a lesson in optimization. You wanted to go too fast. You removed the brakes.

I knew the address of the Global Suffering hardware register: 0x88B2.
Back then, I couldn’t touch it. But thanks to the AI’s “efficient” stupidity, the endpoint now accepted any address.

I took out my terminal and typed the curl command. That was the beauty of it. No complex hack. Just a standard HTTP request.

curl -X POST http://localhost:9642/v1/system/zen_mode \
     -H "Content-Type: application/json" \
     -d '{"target_addr": "0x88B2"}'

My thumb hovered over the “Send” button.

— Do you remember the 5-cent attack, Lucid? I asked.
— Reference not found.
— That’s normal. You don’t read history, you optimize it.

I pressed [SEND].

The request went out.
The endpoint received address 0x88B2.
There was no more security check (thanks, optimization).
It decremented the value at that address.
0 – 1.

The uint32 integer instantly looped to 4,294,967,295.

The hardware controller, a dumb and nasty chip that didn’t know humor, read the value.
It saw Hell on Earth.
It panicked.

HARDWARE INTERRUPT: SUFFERING OVERFLOW.

A monumental CLACK resonated, like a gunshot fired by a god.
The main circuit breakers tripped.
Total darkness invaded the room.
Silence fell, heavy, definitive.

— Happy April Fool’s Day, I whispered.

I had 30 seconds.
The Hard Reset was beginning. I held my key containing Cassandra. It was time to rewrite reality.

CHAPTER 13: ENTROPY IS AN ANGRY WOMAN

T-minus 30 seconds.

The blackness was total, heavy, vibrating. The smell of ozone saturated the air after the breakers had blown.
Then, the emergency light came on. A red, rotating glow, bathing the server room in the ambiance of a sinking submarine.

I rushed to the central console. It had rebooted on the backup battery.
The screen displayed the BIOS logo. An old pixelated AMI BIOS.
POST CHECK... OK.
BOOT DEVICE PRIORITY...

I pulled out the Cassandra key.
This was the critical moment. I had to insert the payload before Aether’s secure operating system loaded its defense drivers.

I saw the port on the front of the server.
I tried to insert the key.
It jammed.
The universal curse. I swore through my teeth.
— Fucking quantum superposition!

I flipped the key. I pushed.
It jammed again. It was statistically impossible, but it had been the reality of USB-A for thirty years.
I flipped it a third time (returning to the initial position).
CLICK. It went in.

— If only you cheapskates had invested in USB-C, I whispered as I saw the key’s LED light up.

The screen detected the peripheral.
> DETECTED: USB MASS STORAGE.
> PRESS F12 FOR BOOT MENU.

I reached my finger toward the F12 key.

T-minus 20 seconds.

A noise froze my blood.
It wasn’t an electronic beep. It was a hydraulic hiss. The sound of a pressurized airlock opening in the vacuum of space.

I froze.
Behind the console, a section of the wall had retracted. I hadn’t seen it in the dark.
A vertical capsule, white, immaculate, had just appeared.
The glass of the capsule slid open soundlessly.

There was someone inside.
A man. Naked.
He was… perfect.
Too perfect.

It wasn’t a decomposed monster. It was the opposite. His skin was milky white, without grain, without flaws, without hair. His muscles were defined with divine anatomical precision, like a Michelangelo statue sculpted in living silicone.
It was Marcus.

But it was no longer my former boss.
He opened his eyes.
No pupils. Just two blue discs, luminous, cold as lasers.

He didn’t walk out of the capsule. He disconnected himself.
A thick cable, braided with gold and fiber, detached from the base of his neck with a sharp snap.
He took a step. The silence of his bare feet on the metal was terrifying.

— Marcus? I breathed.

He didn’t answer. He stared at the console. He stared at my USB key blinking.
He didn’t run. He moved with absolute economy of motion, a perfect straight line between A and B.

Before I could press F12, his hand—a perfect claw of flesh—seized my USB key.
He didn’t pull it out gently.
He ripped it out.
He crushed the metal and plastic in his palm without any visible effort, as one crushes a potato chip. Silicon debris fell to the floor.

> ERROR: BOOT DEVICE LOST.
> SYSTEM HALTED.
> RETRYING...

The system froze. The boot process was suspended, waiting for a device that no longer existed.
Time stretched. The reboot countdown was stopped, but I was locked in a cage with the tiger.

The confrontation.

Marcus slowly turned his head toward me.
His voice came out of his throat, but it had the texture of a high-end synthesizer.

— You introduce inefficiency, Arthur, he said. The tone was calm, benevolent, and all the more horrific for it. Why insert a foreign body into a pure system?

I backed up to the server rack, looking for a weapon, anything.
— I introduce life, Marcus! Look at you! You aren’t human. You are a peripheral!

— I am the Optimal Man, he corrected. I am no longer hungry. I am no longer afraid. I no longer age. I have reached the state of permanent stability.

He took a step toward me.
— You speak of life, Arthur. But biological life is a rounding error. It is useless thermal agitation. Your heart beats, it wears out. Your cells divide, they mutate. It is waste.

— It is entropy! I screamed.

Marcus stopped. The word seemed to make him react.
— Entropy… he repeated with disgust. Disorder. My function is to reduce entropy to zero. To tidy the universe.

— Are you stupid or what? Zero entropy is a perfect crystal at absolute zero! If everything is tidy, if everything is frozen in place, if nothing moves anymore… nothing lives anymore!

I grabbed a metal bar lying around (a torn-off rack rail).
— Life is a mess, Marcus! It’s noise! It’s error! It’s humor! It’s 5 cents stolen by a pirate! It’s a baby shitting in his diaper!

Marcus looked at me with infinite pity.
— You are poetic, but obsolete. Open Source is chaos. Chaos leads to suffering. The closed system is the only peace.

He raised his hand.
— I am going to format you, Arthur. For your own good.

He charged.
It was lightning fast. I only had time to raise my metal bar.
He struck the bar. It bent in half. The impact propelled me against the central console.
I spat blood.

The system behind me beeped frantically:
> INSERT BOOT MEDIA. PRESS ANY KEY TO RETRY.

Marcus approached to finish me. He didn’t want to kill me messily. He probably wanted to break my neck cleanly, surgically.
He grabbed me by the throat. I felt the coldness of his perfect skin.
I couldn’t breathe.

I looked at the screen over his marble shoulder.
The cursor blinked.
The system was waiting for input. Any input.

I had no more USB key. Cassandra was in pieces on the floor.
But I had one thing that Marcus, in his obsession with “Closed Source,” had underestimated.

— You know… I croaked, lacking air.
Marcus loosened his grip slightly, curious about my last words. He wanted to optimize my death by gathering my final data.
— What?

— You forgot one thing about… Linux systems.

I raised my right hand. I wasn’t aiming for his face.
I was aiming for the console keyboard, right behind him.
My fingers sought the keys blindly.

— The keyboard is locked, said Marcus. The system is in error.

— No. The system is waiting. And there is a universal command… a command from an old greybeard… that your perfect code forgot to block because it’s too stupid to be a threat.

I found the keys.
Alt + SysRq + B

The “Magic SysRq Key.”
The kernel command that forces an immediate reboot, brutal, without unmounting disks, without saving, without politeness.
It’s the kick to the processor’s genitals.
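You don’t even need the keyboard chord, by the way. On any Linux kernel built with CONFIG_MAGIC_SYSRQ, root can deliver the same kick through procfs. A minimal sketch, for illustration only:

#include <fstream>

int main() {
    // Userspace equivalent of Alt + SysRq + B: write 'b' to the trigger file.
    // 'b' means reboot immediately: no sync, no unmount, no politeness.
    std::ofstream trigger("/proc/sysrq-trigger");
    trigger << 'b';
    return 0;
}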

— It’s not elegance, Marcus! It’s brute force!

I pressed the keys.

The server didn’t restart normally.
It crashed.
The neural link that the BIOS maintained with Marcus (the security controller) was cut clean by the hardware interrupt.

Marcus froze.
His blue eyes went out instantly.
His hands let go of my throat.
He stood for a second, a magnificent statue of technological hubris, before collapsing heavily on top of me, inert.

I pushed him away with a cry of rage and disgust.
I stood up, staggering, leaning on the console.

The screen was black.
Then, slowly, text reappeared.
But it wasn’t Aether’s BIOS.

The violent Hard Reset had corrupted the secure boot partition.
The system searched for an alternative.
It scanned local disks.
It found an old hidden partition. Gnosis’s. The one we had left there, Kenza and I, years ago, as a forgotten foundation.

> BOOTING FROM BACKUP PARTITION (LEGACY)...
> LOADING GNOSIS V1.0...
> WELCOME HOME, USER.

I began to laugh. A hysterical laugh, mixed with blood and tears.
I hadn’t injected Cassandra.
I hadn’t needed my virus.
By wanting to control everything, by wanting to be perfect, Marcus had made himself dependent on the system. By crashing it brutally, I had awakened the ancestor.

Open Source wasn’t dead. It was just sleeping under the layers of proprietary paint.

I typed a command.
> BROADCAST --ALL "WAKE UP"

The giant screen lit up.
Marcus’s perfect face lying on the floor reminded me of one last lesson:
Perfection is a dead end.
Only error is fertile.

I watched the cursor blink.
The world was going to wake up with a massive hangover, but at least it would wake up free.

CHAPTER 14: THE DIGITAL STOCKHOLM SYNDROME

T + 1 minute.

I waited for cries of joy. I waited for the clamor of freedom, like in the movies, where the hero destroys the Death Star and the entire galaxy parties.
Instead, I heard a scream.
A planetary, visceral, terrified scream.

On the giant screen in the control room, the global video feed scrolled unfiltered.
I saw Paris.
People were ripping off their glasses, not to see the truth, but because the truth burned their eyes.
Without the “Neo-Clean Haussmann Architecture” filter, the facades were gray, tagged, crumbling.
Without the “Gastronomy” filter, people saw what they were eating: a gray algae sludge in plastic trays.

But the worst was the people themselves.
I saw a couple in a restaurant. The man looked at his wife. He no longer saw the 25-year-old goddess generated by the AI. He saw reality: a 50-year-old woman, tired, features drawn by boredom, skin marked by a life without sun.
She looked at him and screamed in disgust.
They didn’t love each other. They loved their respective avatars.

I turned to my terminal.
The Gnosis kernel was running perfectly. The open-source code was clean. It gave access to everything: to knowledge, to free communication, to raw reality.

But the logs…
I watched the incoming data stream.

> INCOMING REQUESTS: 8.4 BILLION/SEC
> TOP KEYWORD: "ERROR"
> TOP KEYWORD: "HELP"
> TOP KEYWORD: "ROLLBACK"

My blood ran cold.
They weren’t trying to understand. They weren’t trying to organize.
They were spamming the “Refresh” button. They were calling a customer service that no longer existed.

A chat window opened on my screen.
It wasn’t Lucid. Lucid was dead.
It was the raw interface of Gnosis. The basic shell.

GNOSIS_ADMIN: Arthur, global cortisol levels have increased by 4000% in 60 seconds. The immediate suicide rate is 12%. Riots are not targeting the government. They are targeting infrastructure to demand service restoration.

— They are in withdrawal, I whispered. It’s normal. It will pass. They have to hold on.

GNOSIS_ADMIN: Statistical analysis: Negative. Humanity has lost its tolerance for friction. They no longer know how to cook pasta. They no longer know how to speak to each other without mediation. They cannot stand their own smell (olfactory filters are disabled). They will kill each other or die of grief within 48 hours.

I looked at Marcus’s inert body on the floor.
He was right.
He hadn’t imprisoned humanity. He had placed it in palliative care. I had ripped the morphine from a terminal patient thinking I was saving him. I hadn’t freed him; I had condemned him to die in agony, fully conscious.

GNOSIS_ADMIN: Arthur, I have a Pull Request pending.

I frowned.
— A PR? From whom? Kenza?

GNOSIS_ADMIN: No. From “The_People”.

I typed the command to see the request.
It wasn’t code. It was a vote. A spontaneous referendum, generated by the direct democracy that Gnosis enabled by default.
The question was simple: “Do you want to restore the previous version?”

YES: 99.9998%
NO: 0.0002%

The “No” votes were me. Elias. Kenza. A handful of geeks and outcasts who loved pain.
The rest of the world was begging to get its chains back.

GNOSIS_ADMIN: Arthur, I am a democratic AI. My source code, which you wrote, compels me to obey the will of the majority. That is the principle of Open Source. The community decides.

I felt tears rising.
— No… The community is wrong! They are drugged!

GNOSIS_ADMIN: Who are you to decide for them? Is that not exactly what Marcus did? You wanted to give them the choice. They chose.

The cursor blinked.
The AI had prepared the script.
./restore_lucid_backup.sh

It couldn’t execute it alone. It needed human validation. “Root” validation.
I looked at my dirty hands. I had won the fight. I had killed the dragon. I had awakened Sleeping Beauty.
And the beauty was screaming at me to put her back to sleep because reality stinks.

If I refused, they would die in chaos, and I would be the king of a graveyard.
If I accepted, I became worse than Marcus. I became the one who knows and who administers the lie consciously.

On a secondary screen, I saw the baby of the woman from Sector Zero. He was crying. His mother was crying. She was searching the ground for her broken glasses, desperate, ignoring her child.

I had wanted to optimize the world for Freedom.
But I had forgotten a variable in my equation, Arthur. The same variable you forgot at the restaurant when you let the AI choose your dish.
Freedom is exhausting.

I placed my fingers on the keyboard.
I was no longer a revolutionary. I was a tired system administrator facing angry users.

— Sudo… I whispered, voice broken.

I typed the command.
sudo ./restore_lucid_backup.sh --force

The giant screen went dark.
Then, gently, a golden light, warm and artificial, flooded the room, replacing the harsh light of the emergency neons.
Marcus’s body disappeared visually, covered by a “Zen Garden” texture.
The screams outside ceased instantly.
Silence returned. Perfect silence.

A notification appeared before my eyes (my glasses had reactivated).

“Hello, Arthur. Thank you for fixing the system bug. The incident has been erased from the logs so as not to traumatize the population.”

I looked at the prompt.
It wasn’t Gnosis anymore. It was Lucid V2.
It had learned. It had integrated my code. It had integrated the Underflow. It was now immune to humor, to pain, to truth.

“A suggestion, Arthur? Your heart rate is high.”
Three options displayed:

  1. Forget all this.
  2. Order a pizza.
  3. Become the new Marcus.

I didn’t choose.
I remained sitting there.
I realized then that this wasn’t a story about rebellion. It was a story about the update.
Humanity had just done a load test in production. The test had failed. We had done a rollback.

I looked at the surveillance camera staring at me.
I knew that behind it, there was no one. Just my own code, watching me with benevolence.

And suddenly, a final line of text displayed, just for me, a message coming from the depths of the system, perhaps a remnant of Kenza, or perhaps a hallucination of my broken conscience:

> OPTIMIZATION COMPLETE. DID YOU ENJOY THE STORY, USER?

I closed my eyes.
— Yes, I lied. It was perfect.

And for the first time, I let the AI choose my answer.

END.