Hajun wrote his first line of code on a Tuesday evening in January 2020, and it was wrong.
Not wrong in the way that most first lines of code were wrong—syntax errors, missing semicolons, the ordinary mechanical failures of a beginner learning a new language. Hajun’s code was wrong in a more fundamental and more interesting way: it was logically correct but philosophically misguided, which was, Dojun reflected later, exactly the kind of mistake a seven-year-old who had grown up in a household of engineers would make.
The code was a Python script. Hajun had been watching his father code for years—not with the studied attention of someone learning but with the ambient absorption of a child who lived in a house where the sound of typing was as natural as the sound of breathing. He had seen Dojun’s screens, the cascading lines of logic, the colored syntax highlighting that turned text into something that looked, to a seven-year-old, like a very complicated painting.
He had asked questions. “What does that word do?” “Why is it blue?” “Can you make it say my name?” The questions had started at age four and had never stopped, because Hajun had inherited his father’s curiosity and his mother’s persistence, which meant he asked questions the way rivers eroded canyons—constantly, inevitably, and with complete disregard for the hardness of the rock.
Dojun had answered every question. Patiently, honestly, at the level a child could understand. He had not pushed coding on Hajun—he had been deliberate about this, because the worst thing a programmer-father could do was turn code into homework. Code should be discovered, not assigned. It should arrive the way language arrived—naturally, through exposure and play, until the child reached for it because they wanted to, not because someone told them to.
On this particular Tuesday, Hajun reached.
“Appa,” he said, appearing in the home office doorway in his pajamas—the ones with rocket ships on them, because rockets had replaced dinosaurs as his primary obsession, which Hana attributed to a field trip to the Korea Aerospace Research Institute and which Dojun suspected was actually caused by a YouTube video about SpaceX that Hajun had watched forty-seven times. “I want to make a program.”
“What kind of program?”
“A program that tells me when it’s going to snow.”
Dojun turned from his monitor. He had been working on Project Lighthouse—five years in, the AI agent was no longer a whiteboard sketch but a functioning prototype with eight hundred thousand lines of code and a team of thirty-two engineers. The launch was scheduled for March, as Aria 5.0. It was the most complex piece of software he had ever built, in either lifetime.
But his son was in the doorway in rocket-ship pajamas, asking about snow.
“Come here,” Dojun said. He pulled a second chair to the desk—the chair that Hana used when she worked from home, the ergonomic one that was too tall for a seven-year-old but that Hajun climbed into with the determined physicality of a child who refused to acknowledge furniture as a limitation.
He opened a new Python file. Blank screen. Blinking cursor.
“This is where it starts,” he said. “Every program in the world started here. A blank screen and an idea.”
Hajun stared at the screen with the focused intensity of someone who understood—at seven, in the intuitive way that children understood things—that this moment was different from watching. This was doing.
“Type this,” Dojun said. He spelled out the first line: `print("Will it snow today?")`
Hajun typed. Slowly, with two index fingers, the way all beginners typed—hunting for each letter, pressing each key with deliberate force, as if the keyboard might not register a gentle touch. The hunt-and-peck rhythm was painfully slow and deeply familiar. Dojun had started the same way, sixty years ago in a timeline that no longer existed, on a keyboard that had weighed twice as much as this one.
Hajun pressed Enter. The screen displayed: `Will it snow today?`
“It talked!” Hajun said. His voice carried the particular wonder of a child who had just discovered that machines could be told what to say. Not the wonder of a consumer encountering a product—the wonder of a creator encountering a tool.
“You talked,” Dojun corrected. “You told it what to say. The computer doesn’t know anything. It only knows what you teach it.”
“Can I teach it about snow?”
“You can teach it anything.”
What followed was ninety minutes that Dojun would remember for the rest of his life—not because of the code, which was trivial, but because of the experience of watching a mind encounter programming for the first time. The specific, irreplaceable, unrepeatable moment when a child discovered that they could instruct the universe.
They built a simple program. Dojun guided, Hajun typed. An `input()` function that asked for the temperature. An `if` statement that checked whether the temperature was below zero. A `print()` statement that said “It might snow!” or “Probably not today.”
temp = int(input("What is the temperature? "))
if temp <= 0:
    print("It might snow!")
else:
    print("Probably not today.")
Twelve lines. The most basic conditional logic imaginable. And Hajun ran it, typed "-3", and when the screen printed "It might snow!" he pumped his fist with the same raw triumph that Dojun had felt the first time his code had compiled, decades ago, on a machine that had less computing power than the phone in his pocket.
"But appa," Hajun said, after running the program six more times with different temperatures. "This is wrong."
"What's wrong about it?"
"It only checks temperature. Snow needs clouds too. And humidity. And—" He frowned, the seven-year-old frown that meant his brain was processing something larger than his vocabulary could express. "The weather is complicated. This program is simple. Simple isn't the same as right."
Dojun stared at his son.
The observation was, by any measure, extraordinary for a seven-year-old. Not because it was technically sophisticated—it wasn't. But because it identified the fundamental limitation of the program not at the syntax level but at the model level. Hajun wasn't saying the code was wrong. He was saying the abstraction was wrong. The map didn't match the territory. And that distinction—the distinction between code that runs and code that reflects reality—was the distinction that separated programmers from engineers.
"You're right," Dojun said. "The program is simple. The real world is complicated. Making a simple program that handles a complicated world—that's what programming is about."
"So we need more ifs?"
"Many more ifs. And eventually, something better than ifs. Something that learns from data instead of following rules."
"Like your project? The Lighthouse thing?"
Dojun blinked. He had not realized that Hajun knew the name. He had been careful not to discuss work specifics at home—not from secrecy but from the belief that a seven-year-old should be concerned with rockets and snow, not enterprise AI platforms.
"You know about Lighthouse?"
"You talk about it on the phone. When you think I'm asleep." The matter-of-fact disclosure of a child who had learned that adults had a different definition of privacy than children. "It's an AI that helps people without being asked. Like a really smart assistant that reads your mind."
"It doesn't read minds. It reads patterns."
"Same thing." Hajun looked at the snow program. Then at his father. "Can Lighthouse predict snow?"
"It could. If we gave it enough weather data."
"Then why are we writing ifs?"
The question was so perfectly, devastatingly logical that Dojun laughed. The real laugh—the one that escaped before he could shape it, the sound of a father discovering that his son had inherited the one trait that mattered: the refusal to accept an inadequate solution.
"Because," Dojun said, "you need to understand ifs before you can understand why we don't use them."
"That doesn't make sense."
"It will. Eventually."
Hajun considered this. Then he turned back to the keyboard.
"Okay. More ifs. But tomorrow we do the real thing."
"Deal."
Hana found them at 9:30 PM—an hour past Hajun's bedtime, both of them staring at a screen full of nested conditionals that attempted to predict snow based on temperature, humidity, cloud cover, and wind speed. The program was wildly inaccurate. It was also the most beautiful code Dojun had ever seen, because every line had been typed by small fingers that were learning, for the first time, that the gap between wanting and creating was a keyboard.
"It's past bedtime," Hana said from the doorway.
"We're predicting snow," Hajun said, without looking up.
"It's January in Seoul. It's already snowing."
Hajun looked at the window. Fat white flakes were drifting past the glass, the city's streetlights turning them orange as they fell. He looked at his program. Then back at the snow.
"The program says 'Probably not today,'" he reported. "The window says otherwise."
"Then the program needs more data," Dojun said.
"Or a window," Hana said.
Hajun went to bed. But the Python file stayed open on Dojun's monitor, and he sat there for a long time, looking at the twelve lines of code that his son had written, and he thought: This is what it looks like. The beginning.
Aria 5.0 launched on March 15, 2020, and it launched into a world that nobody had predicted.
The pandemic had arrived. COVID-19, the invisible catastrophe that rewrote every assumption about how humans lived, worked, and communicated, hit Korea in late January and the rest of the world in the weeks that followed. By mid-March, when Aria 5.0 was scheduled to go live, half the planet was in lockdown, offices were empty, and the concept of "remote work" had transformed overnight from a Silicon Valley luxury into a global necessity.
In his first life, Dojun had experienced the pandemic differently. Prometheus Labs in 2020 had been a pure enterprise play—server infrastructure, B2B analytics, the kind of technology that operated behind the scenes and was largely unaffected by whether users were in offices or at home. The pandemic had been a disruption but not a transformation.
This time was different. Aria was a productivity platform. It lived on people's phones and laptops. It organized their work, managed their schedules, predicted their needs. And suddenly, overnight, the entire world needed exactly what Aria did—but needed it to work from kitchen tables, bedroom desks, and the corners of apartments that had been hastily converted into offices.
The launch meeting was virtual. Forty-two people on a video call—the Aria leadership team, scattered across Seoul apartments and home offices, their faces arranged in the grid that would become the defining visual of the pandemic era. Cats wandered across keyboards. Children appeared in backgrounds. Someone's rice cooker beeped loudly during the opening remarks, and nobody apologized because nobody needed to.
Hana spoke first. She was CEO now—three years in, comfortable in the role the way a musician was comfortable with an instrument they'd mastered, the initial self-consciousness replaced by the fluid confidence of someone who knew exactly where to put her fingers.
"The world changed," she said. "Three months ago, we were a productivity tool. Today, we're infrastructure. The difference is responsibility. A tool helps you work better. Infrastructure keeps you working at all. Three hundred million people just lost their offices. We're going to help them build new ones."
The numbers were already moving. Aria's user base had surged forty percent in February alone—the pandemic panic, the scrambling remote-work adoption, the desperate search for tools that could make a kitchen table function like a corporate office. By mid-March, the daily active user count had hit 3.2 million, up from 2.1 million at the start of the year. Enterprise inquiries had tripled. The servers were handling loads that exceeded their designed capacity by a factor of two.
"Which brings us to Dojun," Hana said, and the grid rearranged as the CTO's face moved to the center of the screen—Dojun in his home office, surrounded by the monitors and whiteboards that had become his pandemic workspace, Hajun's snow program still open in a corner tab because he had not been able to bring himself to close it.
"Aria 5.0," he said. "Five years of work. Project Lighthouse. The AI agent."
He shared his screen. The demo was simple—intentionally so, because the technology was complex enough that the demo needed to be the opposite. He opened Aria on a simulated user's laptop. The interface looked the same as version 4—clean, intuitive, the Hana Layer in full effect. But underneath, everything was different.
"Watch," he said.
The simulated user opened their email. Forty-seven unread messages. In the old Aria, the user would sort them manually, respond to them individually, organize them into folders. In Aria 5.0, the AI agent—Lighthouse—read the emails, understood the context, and began to act.
It identified three emails that required immediate response and drafted replies in the user's voice—not template responses but actual, personalized, contextually appropriate drafts that reflected the user's writing style, relationship with the sender, and the content of previous conversations. It flagged two emails as potential scheduling conflicts and suggested resolutions. It recognized that one email was from a client who had been waiting for a report, found the report in the user's files, attached it to a draft response, and placed it in the review queue with a note: "Ready to send. Report is attached. Want me to schedule a follow-up?"
The video call was silent. Forty-two faces, frozen in the particular expression of engineers witnessing something they had built and barely believing it worked.
"It doesn't send anything without permission," Dojun said. "Everything goes to a review queue. The user approves, edits, or rejects every action. Full transparency—you can see exactly why the agent made each decision, what data it used, what assumptions it's operating on. The ethics framework isn't a layer on top. It's the foundation underneath."
"The consent model," Jihye said—she was now the lead architect on Lighthouse, five years after asking the right question about autonomy and trust at that first whiteboard meeting. "Every action the agent takes is categorized into three levels. Level one: informational. The agent tells you something. No permission needed. Level two: preparatory. The agent drafts something for your review. Implicit permission—you see it, you decide. Level three: executive. The agent does something on your behalf. Explicit permission required every time."
"Every time?" someone asked.
"Every time," Dojun confirmed. "No autopilot. No 'set it and forget it.' The agent works for the user, not the other way around. If the user stops paying attention, the agent stops acting. That's the line."
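The gating logic Jihye and Dojun describe fits in a few lines of Python. This is an illustrative sketch of the three-level consent model, not Aria's actual API; the class and method names are invented for the example:

```python
from enum import Enum

class ActionLevel(Enum):
    INFORMATIONAL = 1  # the agent tells you something; no permission needed
    PREPARATORY = 2    # the agent drafts something; you review and decide
    EXECUTIVE = 3      # the agent acts on your behalf; explicit approval, every time

class Agent:
    def __init__(self):
        self.notifications = []  # level-one messages surfaced to the user
        self.review_queue = []   # drafts and pending actions awaiting a decision

    def propose(self, level, action, approved=False):
        """Gate every action by its consent level. Returns True only if executed."""
        if level is ActionLevel.INFORMATIONAL:
            self.notifications.append(action)
            return True
        if level is ActionLevel.PREPARATORY:
            self.review_queue.append(action)  # drafted, never sent on its own
            return False
        # EXECUTIVE: no autopilot, no remembered consent from last time
        if approved:
            return True
        self.review_queue.append(action)
        return False
```

The design choice is the absence of state in the executive branch: approval is a parameter of each call, never a setting, which is exactly what "no 'set it and forget it'" means in code.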
The demo continued. The agent organized the user's calendar—not just scheduling meetings but understanding priorities, identifying conflicts between work commitments and personal obligations, suggesting time blocks for deep work based on the user's historical productivity patterns. It prepared a meeting brief by aggregating relevant emails, documents, and previous meeting notes into a single summary. It noticed that the user had been working for four hours straight and gently suggested a break, citing research on cognitive performance.
"That last one," Hana said, smiling through the video screen. "That's the Hana Layer."
"It's always the Hana Layer," Minjae muttered from his box in the grid, and the call erupted in the laughter that came from shared history and inside jokes and the particular bond of people who had been building something together for a decade.
The response to Aria 5.0 was immediate, massive, and complicated.
The immediate part: within the first week, three million new users signed up. Not the gradual, organic growth that Aria had experienced over the previous years—a flood. A tsunami. The servers groaned. The engineering team worked in shifts, scaling infrastructure in real time, the particular controlled chaos of a technology team managing success that exceeded their most optimistic projections.
The massive part: by the end of March, Aria had five million daily active users. By April, seven million. The pandemic was the accelerant—millions of people working from home for the first time, desperate for tools that made the experience bearable, discovering Aria's agent and experiencing the specific, almost-religious conversion of someone who had been managing their email manually and suddenly had an AI doing it for them.
The complicated part: the world noticed.
TechCrunch: "Aria 5.0 Is the First AI Agent That Actually Works—And That Should Terrify Everyone."
The Korea Herald: "Korean AI Platform Aria Reaches 10 Million Users as Pandemic Transforms Work."
The New York Times technology section: "The AI That Reads Your Email: Aria's New Agent Raises Questions About Privacy, Autonomy, and the Future of Work."
CNN: "Is an AI Agent Your New Coworker? Inside the Korean Startup That Says Yes."
The praise was genuine. The skepticism was also genuine. The AI agent—the idea that a software program could act on a user's behalf, drafting emails and organizing schedules and making decisions about what was important—triggered every anxiety about artificial intelligence that the public had accumulated over decades of science fiction and tech-industry hype.
"They're scared," Hana said, during a late-night strategy session in their apartment—the pandemic equivalent of the jjigae-jip meetings, conducted over delivery food and laptop screens while Hajun slept in the next room. "The product is good. The technology is good. The ethics framework is solid. But people are scared of AI that acts, not just AI that answers."
"They should be scared," Dojun said. "An AI agent that acts without oversight is dangerous. We built oversight into every layer. But the fear isn't about our specific product—it's about the category. The concept of autonomous AI."
"So how do we fight concept-level fear?"
"We don't fight it. We respect it. The fear is rational. An autonomous AI agent is a genuine risk if built wrong. We built it right. The way to prove that isn't through marketing—it's through time. People need to use it, experience the consent model, see the transparency. Trust is earned, not announced."
Hana looked at him across the table, through the particular fatigue of a CEO managing exponential growth during a global pandemic.
"You've done this before," she said. Not a question. The quiet acknowledgment that her husband carried knowledge from a life that no longer existed and that this knowledge—the pattern recognition, the crisis management instinct, the ability to see the arc of technology's public reception before it happened—was, in moments like this, the difference between panic and strategy.
"Not exactly this," he said. "In the first life, AI agents came later. 2025, 2026. The public reaction was worse because nobody had built the ethics framework first. They launched the technology and then tried to add trust. We did the opposite."
"Ethics first, technology second."
"Ethics as architecture. The building stands on it."
She reached across the table and took his hand—the gesture that had become their shorthand for I'm exhausted but I trust you and we'll figure this out.
"The numbers," she said. "We need to talk about the numbers."
The numbers were extraordinary. Not just user growth—revenue. Enterprise clients, who had been evaluating Aria 5.0 in pilot programs, were converting to full contracts at a rate that David Yoo called "historically unprecedented in Korean SaaS." The agent wasn't just a consumer feature—it was an enterprise productivity multiplier that companies could measure in hours saved and decisions accelerated. Annual recurring revenue had crossed thirty billion won. Valuation estimates from investment banks ranged from 250 to 300 billion won.
"We're not a startup anymore," Hana said.
"We haven't been a startup for years."
"No, I mean—we're not a startup in the way the market perceives us. We're infrastructure. When companies can't function without your product, you're not a tool. You're a utility." She paused. "Seokho called. He wants to integrate Aria's agent into Nova's cloud platform. Full partnership. Not a licensing deal—an integration. Our AI on his infrastructure."
"That's—"
"That's the two biggest Korean tech companies combining forces. The market will notice."
"The market should notice."
"I'm going to call David in the morning. Start the Series E conversation. If we're going to scale the agent to the enterprise market globally, we need capital. Real capital. Not startup capital—infrastructure capital."
Dojun looked at his wife. CEO. Co-founder. The person who had turned a design portfolio project into a platform that ten million people relied on. The blazer had evolved—she wore it less now, because a pandemic CEO worked from home and formality was measured in clarity, not clothing. But the Hana energy was the same: the ability to see a situation in its totality and articulate the path forward in sentences that felt not like strategy but like common sense.
"Do it," he said.
"I already drafted the deck. I was waiting for you to confirm."
"You didn't need my confirmation."
"I know. But I wanted it." She smiled—the tired smile, the pandemic smile, the smile of two people who had been building for fourteen years and were still, despite everything, building together. "We started this in a classroom. A design project. Remember?"
"Bridge. Campus navigation."
"And now we're navigating ten million people through a pandemic." She squeezed his hand. "The scope changed. The partnership didn't."
Hajun's snow program evolved over the following months—not by Dojun's design but by Hajun's insistence.
Every evening, after his homework—second grade at the international school, where he was an average student in most subjects and an above-average student in the one subject that didn't exist yet (programming was not part of the second-grade curriculum, which Hajun considered a systemic failure and which he addressed by teaching himself during lunch breaks using a Chromebook that Minjae had configured with an IDE)—he would sit at Dojun's desk and add to the snow program.
The program grew. Temperature checks became weather API calls—Dojun had helped with the API integration, because seven-year-olds could learn conditional logic but HTTP requests required adult supervision. The API calls became data logging—every day's weather stored in a CSV file that Hajun maintained with the obsessive record-keeping of a child who had discovered data collection.
By March, the program had two hundred lines of code. By May, four hundred. By summer, Hajun had built, without fully understanding the term, a rudimentary machine learning model—a script that analyzed his logged weather data and made predictions based on historical patterns rather than hardcoded rules.
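A rudimentary predictor in the spirit of what Hajun builds—looking at what historically followed days like today instead of applying hardcoded rules—can be only a few lines. The data shape and the function name below are assumptions for illustration, not the boy's actual script:

```python
from collections import Counter

def predict_tomorrow(history):
    """Predict tomorrow's weather from logged history, not hardcoded rules.

    history: list of (conditions, next_day_conditions) pairs, as might be
    read from a daily CSV log. We count what most often followed days like
    the most recent one and predict that outcome.
    """
    today = history[-1][0] if history else None
    followers = Counter(nxt for cur, nxt in history if cur == today)
    if not followers:
        return "unknown"
    return followers.most_common(1)[0][0]
```

A model like this is wrong often—sixty-three percent accuracy is about what a frequency count over a single feature earns—but it is learning from data, which is the conceptual leap the chapter describes.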
"It's still wrong sometimes," Hajun reported during a Saturday dinner at his grandmother's apartment—the weekly gathering that the pandemic had paused for three months and then resumed with masks and hand sanitizer and the particular stubbornness of a Korean family that refused to let a virus cancel jjigae night.
"How wrong?" Dojun asked.
"Sixty-three percent accurate. Yesterday it predicted rain. It was sunny. The day before it predicted sunny. It rained." He considered this. "My program and the weather have the same relationship as me and my math homework. We acknowledge each other but don't agree."
His grandmother looked up from the jjigae pot. "What program?"
"I'm teaching a computer to predict the weather, halmeoni."
"The TV does that already."
"The TV is wrong too. But at least my program is wrong for free."
His grandmother considered this with the expression she reserved for arguments that were technically valid but emotionally unsatisfying.
"Eat your rice," she said. "The weather will do what it wants regardless of what your computer says."
Hana caught Dojun's eye across the table. The parent look—the shared, silent, can-you-believe-he-said-that look that was one of parenthood's secret pleasures. Their son was sitting at his grandmother's table, eating jjigae, arguing about machine learning accuracy with the casual confidence of someone who had no idea that what he was doing was unusual.
"He's seven," Hana whispered.
"He's our seven-year-old," Dojun whispered back. "That's a different species."
"Seokho sent him a book."
"What book?"
"'Algorithms to Live By.' The one about applying computer science to everyday decisions."
"That's written for adults."
"He's highlighting it."
"Highlighting what?"
"The parts he agrees with. And the parts he thinks are wrong."
Dojun looked at his son—rice on his cheek, chopsticks in one hand, the Chromebook peeking out of his backpack by the door—and felt the particular vertigo of a father who was watching his child become the thing he had not been allowed to become in his first life: a person who loved code and loved people in equal measure.
In the first timeline, Dojun had been the brilliant coder who lost his family. The genius who chose the machine over the human. This time, sitting at this table, watching his son talk about prediction models between bites of jjigae, he was something different. He was the father who was present. The husband who came home. The son who answered the phone on Saturdays.
And his son, inheriting the love of code without inheriting the isolation—that was the whole point. That was the break condition. Not for the snow program. For the loop.
The foundation hit its target.
Two thousand students. September 2020. Four years after the launch, exactly on the timeline his mother had demanded—because his mother's timelines, unlike software release dates, were non-negotiable.
The celebration was virtual—the pandemic had made physical gatherings impossible, so the ceremony was conducted over Aria's own video platform, which was ironic and appropriate in equal measure. Two thousand students on a screen. Their faces in grids. Their stories in data: 73% from outside Seoul, 42% first-generation university students, 28% women in STEM (up from 18% in the first cohort), average age 23, youngest 15, oldest 51.
Somin—now a third-year at SNU, class president of the computer science department, and the foundation's first student-turned-board-member—spoke about the break condition again.
"Four years ago, I wrote code at a convenience store counter and believed that was the ceiling. Today, two thousand people are discovering that the ceiling was never there. It was a display case. And someone—" She looked at the camera, at the grid of two thousand faces, at the future that was staring back at her through laptop screens across Korea. "Someone is going to break through it. One of you. Maybe several of you. And when you do, remember: the break condition isn't the code. It's the courage to write it."
Dojun watched from his home office. On his desk, Hajun's snow program was running its evening check—the automated script that his son had built, the one that fetched weather data every day at 6 PM and logged the results, the obsessive, beautiful, persistent work of a child who had found his calling at seven and would spend the rest of his life answering it.
The program's prediction for tomorrow: snow.
His mother texted him after the ceremony.
Two thousand. You kept your promise.
He typed back: You made me promise.
Same thing. Come Saturday. Galbitang.
I'll bring Hajun.
And Hana.
And Hana.
Good. The girl works too hard. She needs soup.
He put the phone down. The snow program blinked on the screen—the cursor patient, the data accumulating, the algorithm learning, day by day, to understand the weather.
Outside, the first snow of winter was falling on Seoul—soft, silent, indifferent to predictions—and inside, the code was running, and the foundation was growing, and a seven-year-old's first lines were becoming something more than a program. They were becoming a language.
The language of someone who would build.