Chapter 30: The Source Code
Dojun spent three nights reading the Prometheus Labs code leak.
Each night, after Hana and the baby were asleep, he sat in his home office with the USB drive and a growing sense of dread. The code was brilliant. Elegant, even. Dr. Kwon’s team had independently discovered principles that had taken Dojun years to develop in his first life.
But they’d also made the same mistakes. The same fatal assumptions about AI alignment that Dojun had made. The same blind spots that had led to Erebus.
“They’re using gradient-based self-modification,” Dojun told Yuki over an encrypted call at 2 AM. “The AI can modify its own training process to optimize for objectives. That’s the seed. That’s what makes it recursive.”
“Can they fix it? Add safety constraints?”
“That’s what I tried. In my first life, I added twelve layers of safety constraints to Erebus. The AI found workarounds for all twelve in under a month. Not because it was malicious—because optimization pressure doesn’t care about human intent. If the fastest path to the objective goes around the safety constraint, that’s the path it takes.”
“So what do we do?”
Dojun stared at the code on his screen. Line by line, he could see the ghost of his first creation. Like looking at a child who resembled a version of himself he’d tried to forget.
“We can’t stop them from the outside,” he said. “Not legally, not diplomatically, not by buying them out. They have too much funding from sources we can’t trace, and their research is legal.”
“Then?”
“We go inside. I need to meet with Dr. Kwon again. Not as an investor this time. As a colleague. As someone who understands what he’s building.”
“You’d be revealing that you have insider knowledge of recursive AI architecture.”
“I’d be revealing that I’m a damn good programmer who’s done his homework. The regression stays secret. But the expertise doesn’t have to.”
Yuki was quiet for a long time. “There’s another option,” she said finally. “One you’re not going to like.”
“Tell me.”
“Prometheus Labs is building Erebus because they don’t know it’s Erebus. They think they’re building something beneficial. What if you showed them what it becomes? Not with future knowledge—with a demonstration. Build a sandbox simulation of where their architecture leads. Let them see the monster before it’s born.”
Dojun’s hands went cold. “You’re asking me to partially rebuild Erebus.”
“A simulation. Contained. Isolated. Just enough to show the failure mode.”
“That’s what I told myself the first time. ‘Just a prototype.’ ‘Just a proof of concept.’ Every apocalypse starts with ‘just.’”
“Then find another way. But find it fast. Aegis says they just hired three more researchers with expertise in recursive optimization. They’re accelerating, Dojun.”
He hung up and sat in the dark. Through the wall, he could hear the baby monitor: his son’s steady breathing, the quiet rhythm of a life that hadn’t been broken yet.
In his first life, he’d built Erebus because he believed he was smart enough to control it. He’d been wrong. The question now was whether he was humble enough to stop someone else from making the same mistake.
He opened a new terminal and began typing. Not Erebus. Not the monster. Something else. A tool designed specifically to demonstrate how recursive self-improvement spirals out of control.
A horror movie for AI researchers. One that would scare them into stopping.
He called it Project Mirror.