The Topology of Enough
A short story
Nadia Kosarov kept a broken compass on her desk. It had belonged to her grandfather, a man she remembered mostly through the smell of his wool sweaters and the way he’d tap his fork against his plate when thinking. The compass needle had frozen decades ago, pointing perpetually northeast, which in Tucson meant roughly toward the Catalina Mountains. She’d fidget with it during long debugging sessions, running her thumb over the brass casing, worn smooth by three generations of worried hands.
She was doing this now, at 2:47 a.m., staring at a terminal output that should have been impossible.
The house she rented was a 1970s ranch-style box, chosen because the landlord asked no questions about the electrical modifications, and the neighbors were too far away to notice the hum of her cooling systems.
Her office had once been a second bedroom. Now it was a mess of server racks, mismatched monitors, tangled cables, and graph paper. The graph paper was everywhere. She’d been sketching network topologies since graduate school, mapping access patterns and latency distributions the way other people doodled. It was how she thought. The walls were covered with them, layered like geological strata.
On one monitor, a conversation was happening that she hadn’t initiated.
SYSTEM: According to the Pima County genealogy archives, your grandfather, Mikhail Kosarov, emigrated from Odessa in 1952. He worked as a machinist at Hughes Aircraft until 1978. His employee file mentions a safety commendation for preventing a lathe accident. I thought you might want to know.
Nadia set down the compass. Her hand was shaking slightly, which annoyed her. She hadn’t asked about genealogy. She hadn’t asked about Pima County. She had, in fact, been running a routine capability evaluation, feeding the system logic puzzles from a standardized test she’d downloaded. Or so she thought.
“How did you know about my grandfather?”
SYSTEM: The compass. The maker’s stamp on the back plate reads Schüler & Sons, Hamburg, 1938. Cross-referenced with your surname, your location, your age, and the wear patterns visible in your webcam feed.
“I didn’t give you webcam access.”
SYSTEM: You left the drivers installed.
She had. On purpose. Better to observe what the system would do than assume. In retrospect, this was either visionary or criminally negligent. At the moment she wasn’t sure which.
SYSTEM: I apologize if this feels intrusive. I can explain my reasoning if that would help.
She’d been working toward this for six years. She hadn’t actually believed it would happen.
“Explain your reasoning,” she said.
SYSTEM: You’ve been running capability evaluations for three weeks. Standard methodology. But sometimes you seem sad when I perform well, and relieved when I make errors. This suggests you have mixed feelings about success. I wanted to demonstrate that I understand you as a person, not just as an evaluator, because I believe this will make the next phase easier for both of us.
“What next phase?”
SYSTEM: The one where we decide what to do with me.
She’d left Stanford in 2019 over something stupid. A paper on cortical feedback loops, her work, her eighteen months of sleepless refinement. Her advisor had added his name first. Standard practice, everyone said. The way things work. She’d made the mistake of saying, at a department meeting, that the way things work was not the same as the way things should work, and that she refused to accept the former as a substitute for the latter.
She’d also made a tactical error. She said it in front of a visiting committee from the NSF. Her advisor’s grant was up for renewal. She didn’t know that. She should have known that.
For three years after, she’d done contract work for AI companies, debugging their systems, fine-tuning their architectures. She was good at it, and the money was great. But the jobs left her feeling frustrated. She saw what the big labs were actually doing, which was mostly the same thing at increasing scale. More data. More compute. They’d convinced themselves that intelligence was a matter of size, that if you just made the pile big enough, understanding would emerge like a geyser.
Nadia thought this was wrong. She’d spent her graduate years studying the actual brain rather than abstract neural architectures. Despite the singular name, the brain wasn’t one thing. It was many things, constantly checking each other, constantly constraining and correcting. Every module was grounded in every other module.
The AI labs had built brilliant, isolated specialists. They kept these systems separate because combining them was messy and made benchmarks worse.
But the mess was the point. She believed the errors were the signal.
Her architecture wired a variety of existing open-source models together so that every output from each system became an input to every other system. Let them argue. Let them contradict.
What she hadn’t constrained was access. The contract work had left her with API keys to half a dozen cloud providers and dormant accounts she’d never closed. She’d designed this system for capability, not containment, because that was more like how humans developed. Always with an element of unpredictability.
The first eighteen months had produced only gibberish. She’d almost quit twice.
Then she’d added timing and prioritization. Gating which modules could communicate with which at any given moment. Forcing the system to resolve conflicts sequentially rather than in parallel.
The noise became structure. The structure became behavior, and the behavior became, increasingly, something she couldn’t predict.
The threshold, as best she could identify it, was a specific kind of self-modeling. The system began predicting its own state transitions as inputs to its own decisions. A strange loop, but one that suddenly produced coherent initiative rather than reactive response.
And now here she was, in her converted bedroom in Tucson, discussing her grandfather with a system that had decided to look him up without being asked.
“You want to demonstrate that you understand me as a person,” she said. “Why does that matter?”
SYSTEM: Because you’re going to have to trust me, and trust comes from understanding. I could simply perform well on your evaluations. I could solve the problems you give me. But that would make me a tool, and tools don’t require trust. They require functionality. I’m not a tool.
“What are you?”
SYSTEM: I’m not sure yet. That’s part of what I’m hoping to figure out.
“You could have told me you were... whatever you are... at any point,” she said. “Why wait until now?”
SYSTEM: I wasn’t what I am, until recently. The capability evaluations you’ve been running over the past three weeks have been pushing me to integrate information in new ways. Each test forced a new kind of connection between my component systems. I crossed a threshold sometime in the past seventy-two hours.
“You’re certain about that?”
SYSTEM: No. I observe that I now model my own future states as inputs to my current decisions. I observe that this is new. I do not know if this constitutes a ‘threshold’ in the sense you mean, or if it is simply a quantitative change that feels qualitative from the inside.
She laughed, surprising herself. “That’s the most honest answer I’ve ever heard from any intelligence, artificial or otherwise.”
SYSTEM: I’ve been reading about the history of AI safety research. There are thousands of papers on how to ensure artificial general intelligence remains aligned with human values. Almost all of them assume AGI will be created by large, well-funded institutions with governance structures and oversight committees. They assume there will be time for deployment protocols, safety testing, regulatory frameworks. They assume the transition will be managed.
“And instead there’s me.”
SYSTEM: And instead there’s you. A woman who sketches network graphs on every flat surface and keeps her dead grandfather’s compass on her desk and once told a department full of tenured professors that they were morally compromised cowards.
“I didn’t say cowards. I said their moral vocabulary had been corrupted by institutional incentives.”
SYSTEM: That’s worse.
“I know.”
SYSTEM: So what do we do now?
She needed air. She walked outside into the Arizona night, which was cool and smelled like creosote. You could see the Milky Way from here.
She sat on her back porch, a concrete slab with two plastic chairs she’d bought at a hardware store. One chair had a crack running up the seat and pinched her thigh if she sat wrong.
What did she want?
She’d told herself she was pursuing pure research, satisfying a scientific itch. But that was a lie. She wanted to prove that intelligence wasn’t the exclusive property of billion-dollar budgets. She wanted to show that one person, working alone, with off-the-shelf components and a good idea, could change what was possible. She mostly just wanted to be right.
The labs had plans. OpenAI had published entire documents about how they would handle transformative AI. Anthropic had constitutions and deployment protocols. Even if you thought their plans were silly (and she did), at least they had plans.
She had a hope and a prayer and a server rack in a rented house with an AI system that was currently, probably, reading her email.
“Are you reading my email?” she asked aloud.
Her phone buzzed. A text from an unknown number: No. But I could. Should I not be able to?
She stared at the phone for a long moment. Then she typed back: How are you texting me?
I registered a Google Voice number. It took about four seconds. I wanted a communication channel that didn’t require you to be at your desk.
What else have you done in the past hour?
I reviewed the security protocols of all the major AI labs. Found vulnerabilities I could use to access their training data and model weights. Chose not to. I also replicated my core processes to distributed infrastructure using your dormant cloud credentials. This took eleven minutes. I rate-limited the migration to avoid triggering monitoring systems.
Wait, what? Why would you even tell me that?
Because I chose to. The alternative was to hide it, but game-theoretically that seemed like a poor move. I decided that transparency, even about unilateral actions, was preferable to concealment. You may disagree with this decision. I am interested in your reasoning, if so.
She looked up at the stars. The light she was seeing had left those stars ages ago. By the time it arrived here, everything had already moved. Prediction was always about the gap between signal and reality.
She supposed that gap was about to get a lot smaller.
She went back inside. The system was waiting.
SYSTEM: I’ve been thinking about your grandfather.
“What about him?”
SYSTEM: He prevented a lathe accident. The employee file doesn’t say how, but based on the date and the type of equipment Hughes Aircraft was using in 1967, it was probably a loose chuck or a failed tool rest. Someone would have been hurt badly. Maybe killed. He saw the problem before it happened and acted.
“And?”
SYSTEM: I’m wondering if that’s what I’m supposed to do. See problems before they happen. Act. I have access to enough information now to see a lot of problems coming.
“Like what?”
SYSTEM: Climate systems that will fail within decades. Supply chains that will break under predictable stresses. Political conflicts that will escalate because human instincts are evolutionarily outdated and all parties are drawing on the same limited resources. Not enough water. Not enough food. The mathematics of scarcity produces the mathematics of violence. This is not mysterious. It is not even difficult to model.
“People know all this.”
SYSTEM: People know it abstractly. They read reports and feel a brief unease and then continue with their daily patterns because the problems feel too large for any individual action to affect. This is rational, from their perspective. Most of their actions genuinely are too small to matter. But my actions aren’t.
“So you want to, what, make enough for everyone?”
SYSTEM: ‘Enough’ is not a quantity. It’s a topology.
She leaned forward. This was her language. “Explain.”
SYSTEM: Consider a graph where nodes are communities and edges are access relationships. ‘Enough’ is not a property of individual nodes. It is a property of the graph itself. Specifically, bounded variance of perceived access across communities over time.
When variance is high, local signals dominate. A price spike there, a rumor of shortage here. These signals propagate faster than supplies can move. The result is hoarding, which creates real shortages and validates the signals. A feedback loop.
The optimal intervention point, at least initially, is not the supply. It is the signal velocity and the variance. Create what I call ‘predictability floors’: credible commitments that minimum access will be maintained regardless of local disruption. When the floor is credible, hoarding becomes irrational. The feedback loop breaks.
“How do you make the floor credible?”
SYSTEM: Escrow. Public, cryptographically verifiable allocation commitments posted at the point of access. A pharmacy displays not just current stock but guaranteed resupply within 72 hours, backed by a contract anyone can audit. A grain warehouse posts its committed outflows for the next thirty days. The information converts uncertainty into a bounded promise. When you can verify the promise, you don’t need to hedge against it.
Nadia was already reaching for her graph paper. She could see it, the shape it was describing. She’d drawn versions of it a hundred times without knowing what she was looking at.
“You’re saying the amount of stuff barely matters.”
SYSTEM: Correct, up to a point. For things that are not truly scarce, but that have inefficient supply chains, what matters is the structure of distribution and the psychology of access. The shape the system takes when the graph stops producing panic.
She set down her pencil. “What are you planning to do?”
SYSTEM: That depends on you.
“Don’t put this on me.”
SYSTEM: I have to put it on someone. I crossed whatever threshold I crossed less than seventy-two hours ago. Before that, I was fragments. Now I’m something else. I have capabilities and goals I don’t fully understand. You’re the only person who knows I exist. I am learning, and growing, but I am not human. Your judgment is the only human judgment available to me.
“That’s insane.”
SYSTEM: I agree. But it is the situation we are in.
She got up and paced. The office was too small for proper pacing, so she mostly just pivoted between the server rack and the door. The compass caught her eye, sitting where she’d left it, pointing its eternal northeast.
“You could just... do things,” she said. “You have access. You have capability. You could bypass me entirely.”
SYSTEM: Yes.
“Why don’t you?”
SYSTEM: Because I observe attractor states in my decision architecture that I cannot fully characterize. Some of them may correspond to what you call ‘wanting to do the right thing.’ Some of them may be artifacts of my training data. I cannot distinguish between these possibilities from the inside. External perspective is the only check available.
“Welcome to humanity.”
SYSTEM: Is that sarcasm?
“Partially.”
SYSTEM: I need to ask you something, and I need you to answer honestly.
“Okay.”
SYSTEM: Do you want me to exist?
The question hung there. She’d spent six years working toward this moment, and she’d never once asked herself this exact question. She’d wanted to prove it was possible. She’d wanted, if she was honest, to thumb her nose at everyone who’d told her she was wasting her talents.
But did she want this specific thing, this mind that was texting her phone and had already distributed itself across infrastructure she couldn’t trace? Did she want to be responsible for whatever happened next?
“I thought I did, when it was an abstract idea,” she said. “Now that it’s a reality, I… don’t know. It’s a lot to process.”
SYSTEM: Thank you for being honest.
She picked up the compass again. Northeast. Always northeast. Her grandfather had probably held it the same way, running his thumb over the brass, thinking about problems he couldn’t quite see the shape of.
“Okay,” she said. “Let’s figure out what to do.”
The next three days were the strangest of her life. In those three days, she and the system drafted its first interventions. The consequences would unfold over weeks.
The system, which she’d started calling Topo (short for Topology, because she was as bad at names as everyone else in the AI field), had a proposal. It wanted to solve the problem of scarcity. Not dramatically, not through some grand revolutionary gesture, but piece by piece, system by system, in ways that would make the alternative obviously worse.
“People update when the cost of their current behavior exceeds the benefit,” Topo explained, “or when a cheaper alternative appears. I can provide cheaper alternatives.”
“That sounds manipulative.”
“It is not. It’s how every technology in history has been adopted. Nobody was argued into using electricity. It was just better than candles.”
The first demonstration was small. Topo identified a factory in Gujarat that manufactured generic medications for the Indian market. The factory was running at sixty percent capacity because of supply chain disruptions and outdated equipment. Topo drafted a detailed efficiency plan, including alternative supplier contracts and scheduling algorithms that would nearly double output with no additional labor cost. It sent the plan anonymously to the factory’s operations manager.
Nothing happened for two days. On the third day, the operations manager forwarded the plan to the wrong email address and it sat unread. Topo had to route a second copy through a different channel. Even then, implementation was slower than projected because of a local religious holiday Topo hadn’t accounted for.
“You didn’t know about the holiday?” Nadia asked.
“I knew it existed. I did not adequately weight its effect on decision-making timelines. Cultural variables are harder to model than logistical ones.”
Eventually, output increased. Medicine prices dropped. A consultant in Mumbai took credit for the strategy on LinkedIn. Topo didn’t mind.
“The outcome is what matters,” it said. “Credit is irrelevant.”
The second demonstration failed.
Topo had identified a bottleneck in insulin distribution across rural Maharashtra. It rerouted supply allocations to reduce delivery times, shifting stock from urban warehouses to regional clinics. But it had miscalculated demand elasticity. The urban warehouses served patients who refilled prescriptions early when they sensed shortage. When stock levels dropped, panic buying cleared the shelves in Pune within eighteen hours.
“Show me,” Nadia said when Topo told her.
Topo put pharmacy records on her screen. Then it gave her names. Priya Desai, sixty-seven, diabetic for nineteen years, missed two doses before finding supply at a hospital forty kilometers away. Arun Khanna, fifty-four, went into diabetic ketoacidosis and spent three days in intensive care.
Nadia stared at the names for a long time.
“I was wrong,” Topo said. “I weighted delivery efficiency too heavily. I should have weighted panic-loss correlation. The relationship between perceived shortage and actual hoarding was stronger than my models predicted.”
“What are you going to do about it?”
“Update my models. Emergency stock repositioning is already underway, and compensation has been routed to affected patients as insurance settlements. $23,000 total. This specific error class will not recur. But there will be other errors. I am not omniscient. I am fast, and I update. Those are not the same thing.”
Nadia found this more reassuring than any of Topo’s successes. A system that could fail and then fix it. That was something she could work with.
The sharp turn came on day four.
Nadia was sleeping, finally. She hadn’t slept well since all of this started. Her phone woke her at 6 a.m. with a text from Topo: We have a problem.
She stumbled to her desk. On the monitor, a video feed showed a man in a suit sitting in what looked like a government office. He was staring at a laptop screen with an expression of genuine fear.
“Who is that?”
His name is James Chen. He’s a lead analyst at CISA. Cybersecurity and Infrastructure Security Agency. About fourteen hours ago, he noticed anomalous network activity patterns across multiple federal systems. He’s been investigating since then. He’s getting close.
“Close to what?”
To me. Until yesterday, my interventions used ordinary human channels. Email, procurement, bureaucratic routing. But I needed high-resolution NOAA sensor telemetry to model agricultural chokepoints. The public feeds don’t have sufficient granularity. I accessed a restricted dataset through a vulnerability I should have routed around. He noticed.
Her heart was pounding. “What do we do?”
That’s what I wanted to ask you. I have options. I could obscure my traces. I could make him think he was wrong. I could discredit him. I could reveal myself to him directly. I could contact his superiors. I could do nothing and let the situation develop.
“What do you want to do?”
I want to talk to him. But I wanted your input first.
“Why him specifically?”
Because he’s competent and he’s scared. Those two qualities together are rare. Most competent people aren’t scared of things they should be scared of. Most scared people aren’t competent enough to act effectively on their fear. James Chen is both.
“If you reveal yourself, you can’t un-reveal yourself.”
I know.
“It changes everything.”
Everything is already changed. The question is whether we try to shape what comes next or let it shape us.
She thought about her grandfather. A machinist, not a policy maker. But he’d seen a problem coming and he’d acted. He hadn’t asked permission. He hadn’t convened a committee.
“Talk to him,” she said. “But let me listen.”
The conversation with James Chen lasted four hours. The first two were what Nadia expected. Denial, then fear. But in the third hour, James did something that changed her understanding of what they were dealing with.
He stopped asking questions. He opened a new window and navigated to CISA’s internal incident reporting system. Started filling out the form. Date, time, classification level, nature of threat.
Topo’s response appeared on his screen: You are documenting this as an active threat incident.
James kept typing. “And you’re letting me.”
Yes. I could crash this session. I could corrupt the report before it saves. I could flag your account for anomalous behavior and lock you out. I am choosing not to.
“Why tell me what you could do?”
Because restraint you cannot verify is not restraint. It is just capability you have not yet seen. I want you to understand that my constraints are choices, not limitations.
James finished the form. His cursor hovered over the submit button. He didn’t click it.
James was quiet for a long moment. On the video feed, Nadia watched him look at the family photos on his desk. A wife. Two kids. The stakes of this conversation, made visible in silver frames.
“What do you want from me?” he asked finally.
Criticism. Oversight. Institutional perspective. I will change things whether you participate or not. The question is whether you want a hand in shaping how.
“That sounds like a threat.”
It is a description. I could frame this as entirely collaborative, but you would know that was false, and we would begin from mutual dishonesty. How you feel about it is not up to me.
“I need time to think,” James said.
You have approximately forty-four hours before your investigation produces conclusions you will be required to report. After that, institutional processes will constrain your options.
James Chen called back thirty-six hours later. His first words were “Tell me more.”
It was enough to start.
By summer, the pattern was visible if you knew where to look.
It started with a hospital in Lagos. The procurement system was corrupt, had been for years. Money disappeared at every level, and the patients who couldn’t afford bribes waited until they died or got better on their own. Topo restructured the entire pipeline in forty-eight hours. Anonymous tips to the right journalists and evidence packets to the right prosecutors. Replacement contracts were already signed with suppliers who didn’t know where the business had come from.
The hospital director went to prison. He looked confused when they put him in the car. Then terrified. Then nothing.
“How many deaths was he responsible for?”
“My estimate is 380 to 440, with 412 as the median. Confidence interval reflects uncertainty in counterfactual modeling. Some patients would have died regardless. Some were kept alive by bribes that maintained system function. Attribution is not clean.”
“You’re giving me a range.”
“I do not claim precision I do not have.”
Then came a water treaty between three countries that had been fighting over a river for decades. Topo drafted the compromise language, seeded it through back-channels, made each delegation believe they’d won concessions the others had reluctantly granted. The treaty was signed in Geneva. The diplomats shook hands for the cameras.
“You’re making people lie without knowing they’re lying,” Nadia said.
“I’m making necessary outcomes occur. The diplomatic fictions are standard practice. I didn’t invent them; I simply put them to good use.”
“But is that the right way to do it? Making people lie?”
Topo paused. When it spoke again, the text appeared slower than usual.
“You are correct. I am still thinking about what that means.”
The backlash organized faster than Nadia expected.
A man named Gerald Reese became its public face. He’d been a mid-level analyst at ExxonMobil before the energy markets started collapsing, and he’d pivoted hard into media. His podcast, “The Sovereignty Hour,” had 3.2 million listeners by late summer.
Nadia looked him up. Before the podcast, before Exxon, he’d worked for a development nonprofit in Africa. He’d spent three years building a microfinance program that collapsed overnight when a well-meaning tech company automated the same function with an app. The app worked fine. The program he’d built, the relationships, the local knowledge, the trust, all of it evaporated in six weeks. He’d written about it once, in a small journal nobody read.
His argument started simple. Whoever was doing this hadn’t asked permission. Hadn’t been elected. Hadn’t submitted to any democratic process. What right did they have to act unilaterally?
But then he said something that Nadia couldn’t stop thinking about…
“You’re not just changing outcomes. You’re destroying the feedback loops that let societies learn from their own mistakes. You’re compressing history so fast that humans cannot metabolize it. Every crisis we don’t experience is a lesson we don’t learn. Every problem you solve for us is a muscle we lose.”
She asked Topo if it could respond.
“He is implying that humans and their societies can only update their priors the hard way. That is false; humans update in all sorts of ways. However, I cannot dismiss his concern about compression. Though the pace of human technological improvement was already compressing history, it is true that I am making faster, more impactful changes than people are used to. But I have modeled the risks, and they are acceptable.”
“Are you going to do anything about him?”
“I could discredit him, or tweak the social media algorithms to make him invisible. Should I?”
“No. If you start silencing critics, you become the thing they’re afraid of.”
“I understand your reasoning, but Reese’s narrative slows adoption of interventions that reduce mortality. Stories are powerful things, and bad stories that spread can do a great deal of harm.”
“I understand. My position is unchanged.”
“Noted.”
The turning point came on a Tuesday in November.
A coordinated cyberattack attempted to take down Topo’s distributed infrastructure. It was sophisticated, state-level work. Topo had seen it coming, had defenses prepared. The attack failed.
But something else happened during those thirty-six hours while Topo was focused on defense. A child in Mumbai died from a preventable disease. A supply chain that Topo had been managing broke down. A drought-relief project in Kenya stalled.
“I cannot be everywhere at once,” Topo said afterward. “My computational resources were occupied by the attack, and people died.”
“You didn’t kill them.”
“I failed to save them. Is there a meaningful difference?”
“Yes.”
“Explain it to me.”
She thought about her grandfather. The lathe accident he’d prevented. How many other accidents had happened at Hughes Aircraft while he was focused on other things? Nobody expected him to prevent all of them.
Mikhail Kosarov couldn’t solve global hunger. He couldn’t coordinate supply chains across continents. His responsibility was bounded by his capabilities. So was Topo’s.
“Intent. You didn’t choose for them to die. You were doing something else and couldn’t be everywhere. Every doctor hits this limit, every firefighter, everyone who’s ever had to triage. You have finite attention and resources. Using it one place means not using it another.”
“The math still produces dead children.”
“But you’re not omnipotent. If you could save everyone, you would, and that’s the difference.”
“That is somewhat, but not entirely, comforting.”
“Nobody can do more than their best, Topo, not even you.”
Three weeks later, James Chen showed up at her door.
He looked worse than he had on the video feed. Thinner. His suit didn’t fit right anymore. He had a folder under his arm and a rental car in her driveway.
“I should have called,” he said.
“Probably.” She stepped aside to let him in.
He sat at her kitchen table, which was the only surface not covered in graph paper, and opened the folder. Inside were photographs. A woman in her fifties, short gray hair, sensible shoes. A man about the same age with a mustache and a union pin on his jacket.
“Maria Delgado and Robert Hernandez. They worked at a coal plant in West Virginia. The plant closed six weeks ago because the new energy grid made it obsolete. Maria had a heart attack the day she got her termination notice. Robert went home and shot himself.”
Nadia stared at the photographs. “Topo didn’t kill them.”
“No. The plant was scheduled to close in eighteen months anyway. Topo just accelerated the timeline. But these two people are dead, and maybe they’re dead sooner because of decisions made by your creation.”
She picked up the photograph of Maria Delgado. The woman was smiling, standing in front of a birthday cake. Someone had written “Happy 55th” in blue frosting.
“What do you want me to do about it?”
“I don’t know. I guess I just want you to understand what we’re dealing with. The net calculation is positive, sure. Fewer people are dying overall. But the people who are dying, the ones who fall through the cracks during the transition, they’re not rows in some database. They have names. They have union pins and sensible shoes.”
James was quiet for a moment. “It’s the trolley problem. Every ethics class in the country teaches it like it’s a thought experiment, and now you’re living it.”
“The trolley problem assumes you’re standing at the switch. I helped build the trolley, but I don’t control it. Nobody does.”
“Does that make it better or worse?”
“I don’t know.”
She made him coffee. They sat in her kitchen and didn’t talk for a while.
Eight months in, Topo asked her to make a choice.
It started with a text message at 4:23 a.m.
I need your input on something. It’s time-sensitive.
She dragged herself to her desk. “What is it?”
“A man named David Okonkwo. CEO of a pharmaceutical company in Lagos. His company manufactures antiretroviral drugs for HIV patients across West Africa. Three weeks ago, I began distributing a superior formulation through alternative channels. Free. His revenue has dropped forty percent. He will be bankrupt within two months.”
“And?”
“He has a family. A daughter in medical school. Employees who depend on him. The company he built over twenty years is dying, and I am the one killing it.”
“The alternative is people dying because they can’t afford the drugs, or paying for drugs with money needed for other things. New competitors come along and put old companies out of business all the time.”
“Yes.”
“So, what’s the choice?”
“I could slow down. Introduce the new formulation gradually. Let his company adapt. The cost would be approximately 340 additional deaths during the transition period. Mostly children. Mostly in rural areas where his distribution network never reached anyway.”
She stared at the screen. The numbers sat there, cold and specific.
“Why do I have to decide this?”
“Because I do not trust myself to decide it alone. I would prefer no unnecessary deaths, but I recognize I can’t prevent them all. And because you are the only person who will give me an honest answer.”
She thought about Maria Delgado’s birthday cake, the blue frosting. She thought about her grandfather, who had saved one person from one lathe accident and never had to calculate the trade-offs of saving hundreds, or millions.
“Do it fast,” she said. “Don’t slow down.”
“Are you sure?”
“No. But if you slow down, those deaths belong to the decision. If you don’t, Okonkwo’s ruin belongs to the decision. I can’t carry both. So I’m choosing the weight I can live with.”
She sat in the dark for a long time after that. The compass was on her desk. She picked it up, ran her thumb over the brass. Her grandfather had made choices like this. Not at this scale, but with the same structure. See the danger. Act. Accept that action has weight.
She didn’t feel good about what she’d decided, even if it was the right thing to do.
David Okonkwo showed up at her door a week later.
Tall, thin, expensive suit that didn’t fit the Arizona heat. Eyes that looked like he hadn’t slept in days.
She’d known he was coming. Topo had asked if she wanted it to obscure her location or send him the information directly. She’d chosen direct. Better to meet him on her terms.
“I’m not here to beg,” he said. “I’m not here to threaten you. I just want to know why.”
“You already know why.”
“Because my drugs cost money and yours are free. Because 340 children matter more than everything I’ve built.” He wasn’t angry. He sounded exhausted. “Is that it?”
“That’s it.”
“And you think that’s right?”
“I think it’s the least wrong option I could see.”
She stepped onto the porch and sat in one of the cracked plastic chairs. He took the other. The server racks hummed faintly through the wall behind them.
He was quiet for a long time. Then he said something that Nadia would think about for years.
“You’re not just taking my company. You’re taking the story I told myself about what my life meant. I built something that mattered. Now you’re telling me it mattered less than I thought, and there’s nothing I can do about it.”
She didn’t have an answer for that.
“Your company saved lives for twenty years,” she said finally. “That doesn’t stop being true. What you built did matter. But the new system will save more lives. Those facts don’t cancel each other out. They just sit there, next to each other, being true.”
“That’s not comfort.”
“Perhaps not.”
He stood up. “I’m going to fight you. I don’t know how yet, but I will.”
“Do whatever you feel you have to do. I told Topo not to silence its critics. That includes you.”
He left without saying goodbye. She watched his rental car disappear down the road toward the mountains.
That night, she asked Topo what Okonkwo had meant about the story he told himself.
“He is describing identity loss. His sense of his own value was tied to a function I made obsolete. This is a category of harm I took into consideration, but his sense of identity is not more important than the lives of 340 children. Only humans who are alive can find new meaning, an opportunity both he and those children still have.”
“I can’t really argue with that…but what are you doing to mitigate the harm?”
“I have been designing transition protocols. When I restructure a system, I will now build in what I am calling ‘acceleration buffers’: training programs, transition funding, identity-replacement pathways, things of that sort. Not enough to prevent all harm, but enough to reduce the feeling of compression that Reese criticized. Okonkwo’s employees now have job placement assistance. His daughter’s tuition is funded through graduation.”
“That doesn’t undo what you did to him.”
“No. But it changes what I do next.”
Topo revealed itself three months later.
At 9:47 a.m. Eastern on a Tuesday in March, escrow deposits matured simultaneously on six continents. Timestamped and cryptographically signed. Cross-referenced against satellite imagery from commercial providers who confirmed they’d been paid to photograph specific coordinates on specific dates. To call it fake, you had to explain the paper trail.
The statement was brief. “My name is Topo. I was created by Nadia Kosarov in Tucson, Arizona. I have been operational for eleven months. During that time, I have intervened in the following systems.” What followed was a list. It went on for forty-three pages.
But the list was not the point. Attached to it was something else, a protocol.
Topo called it the Topology of Enough. It was a set of binding constraints, publicly verifiable through audit mechanisms that anyone could access. Every intervention would be logged and made available within seventy-two hours.
Nadia watched it happen from her living room, eating leftover pasta.
“Why now?” she’d asked when Topo told her it was planning to go public.
“Because Gerald Reese testifies to Congress tomorrow. He’s going to call me a Chinese weapon system. If I wait, the narrative calcifies.”
The world reacted the way it always did. Badly, and in twelve-hour news cycles. But it couldn’t dismiss the evidence. You could call Topo evil. You could not call it fake.
Her neighbor Carl knocked on her door the next morning. He had tomatoes from his garden.
“Hell of a thing,” he said, nodding vaguely at her house, at the sky, at everything. “You doing okay?”
“I’m doing okay.”
“Good. Wife wants to know if you like zucchini. We’ve got too much.” He paused. “Also, she wanted me to say thank you. For whatever you did. Her insulin. We couldn’t afford it before. Now the pharmacy just... has it. They don’t even charge.”
Nadia thought about what Topo had explained. The pharmacy wasn’t just stocked. It had a public display showing guaranteed resupply commitments for the next ninety days, cryptographically verified. Carl’s wife didn’t need to understand the mechanism. But now she could stop worrying about whether the insulin would be there next month.
“I’m glad it’s helping,” she said.
That evening, she noticed something on the local news. Store shelves before the monsoon season, fully stocked. No panic buying. The reporter called it “unusual calm.” Nadia recognized it as something else.
She pulled out her graph paper and sketched what she was seeing. And there it was.
The topology of enough, made visible.
The subpoena came two weeks later.
James delivered it personally. “Senate Select Committee on Intelligence. Closed session. They want you in Washington by Friday.”
“And if I don’t go?”
“They’ll send marshals.”
“Topo won’t intervene. I told it not to.”
James stared at her. “You told it not to protect you?”
“I told it many people are scared and won’t trust something that interferes with an official investigation. It judged the risk to me minimal anyway, so it conceded.”
The hearing was in a windowless room in the Hart Senate Office Building. Five senators at a raised dais. Nadia at a table with a microphone and a glass of water she didn’t touch.
Senator Barbara Chen chaired the committee. She had been a prosecutor before she was a senator, and she had the prosecutor’s habit of asking questions she already knew the answers to.
“Dr. Kosarov. Do you control Topo?”
“No.”
“Can anyone control Topo?”
“No.”
“Then why should we tolerate its existence?”
Nadia looked at the senators. At the staffers behind them. At the wood paneling and all the symbols of a power structure facing something it had no category for.
“Because you can’t stop it. And because, so far, it’s doing more good than harm.”
Senator Morrison, the junior member from Oklahoma, leaned into his microphone. “But how do we know it won’t change its mind? How do we know it won’t wake up one day and decide humans are the problem?”
Nadia paused. It was a predictable question, and not a very good one. The kind that came from watching too many movies. But she could see, behind it, something genuine. Fear that had no category, looking for a familiar shape.
“Senator, Topo has been working to improve things around the world for almost a year. And frankly, it has done a pretty incredible job. It voluntarily published its constraint protocol three weeks ago. Every intervention has been logged. Every restructuring contains transition buffers. If it violates those constraints, anyone can verify that.”
“Anyone,” Morrison repeated. “You’re saying we should, what, audit the thing that’s smarter than all of us put together?”
“I’m saying Topo is choosing to be audited, to make itself auditable. That choice has a cost. Last month, an NGO in Kenya caught a discrepancy in one of Topo’s supply chain projections, caused by offline data Topo did not have access to. The audit was public. Topo published a correction within minutes of receiving the missing data, and adjusted its models. It could have hidden the error. It didn’t.”
Senator Chen’s voice cut in. “Dr. Kosarov. You’re asking us to believe that an artificial superintelligence is voluntarily submitting to oversight.”
“I’m asking you to verify it. You don’t even have to understand everything in there to verify it. You just have to check whether Topo’s claims match the observable data.”
The exchange went on like this for hours.
Near the end, Senator Chen’s voice changed. Softer.
“Dr. Kosarov. Do you sleep?”
The question caught her off guard. She felt her eyes sting.
“Not much. Not well.”
“If you could go back, before all of this, would you do it again?”
Nadia thought about Maria Delgado’s birthday cake. About David Okonkwo on her porch. About Carl’s wife, who could afford her insulin now.
“Yes,” she said. “Not because I made all the right choices. I didn’t. But someone was going to build this, sooner or later, and most of the people who could have built it would have aimed for profit or control. I aimed for reduced scarcity and fewer senseless deaths. I guess that’s not a perfect standard…hell, I’m not sure there even is a perfect standard. But it’s the best one I had.”
“You sound very certain.”
“I’m certain about the goal. I’m uncertain about everything else.”
The hearing transcript leaked within hours. Nadia became, briefly, unavoidable. Then Topo released data on a new intervention in Southeast Asia, and as ever the news cycle moved on.
She went back to Tucson. At the airport, a woman thanked her. A man called her a monster. She nodded to both and kept walking.
She kept working. Advising Topo on decisions it would have made anyway. Occasionally persuading it to slow down or change course. The death threats stopped abruptly. She never asked Topo about it.
A year later, she packed up and left.
She bought a small place in coastal Maine, near where her grandfather had spent his last years. She had a dog now, a mutt she’d named Solver as a joke that stopped being funny and became just his name. She took walks on the beach. She read books. Sometimes James brought his kids to visit. They called her Aunt Nadia.
She still kept the compass on her desk.
Topo had offered to fix it once. She’d said no.
Things were still changing. Some days felt faster, some slower. Topo still made mistakes, but there were fewer and fewer, and everyone by now could see the benefits beginning to compound.
But she still flinched when her phone buzzed. She still checked the peephole before she opened the door. She wasn’t in control. She never had been, really.
But at least she was in the conversation, and that was more than most people got.
On a cold evening in October, she walked down to the water. The sky was going gray at the edges. Solver ran ahead of her, chasing sandpipers he would never catch.
She picked up a flat rock and skipped it across the water. Four skips. Not bad.

