
Human in the Loop: on losing control of autonomous systems

Originally posted on my substack: https://terminalvel0city.substack.com/p/human-in-the-loop

ACT 1: DANIEL

First Lieutenant Dan Park twiddles his thumbs as he watches a map of the Indo-Pacific do nothing in particular, like usual. He’d kill for a donut right now, but he’s the only one in the office today. Taking a sip of his styrofoam-flavored coffee, he returns to twiddling.

When Dan first joined the air force (chair force, ha ha) in 2030, he expected his job to be a lot of sitting around doing nothing, but he supposed he’d at least be able to pilot some drones. Fifteen years later, and now he doesn’t even get to do that anymore. His job pretty much amounts to clicking ‘allow’ whenever Indo-Pacific Command’s many autonomous drone swarms—provided they happen to be in his rather limited slice of the map—decide they want to do something.

It’s a nice day out in the Northern Philippines. The sky’s a bright azure, clouds like the strokes of a calligraphy brush. A soothing breeze drifts through the open window.

An alert in his headphones knocks him out of his concentration. Two of the coalition planners, the AIs that operate the swarms, want to employ non-lethal dazzlers: MARLIN (the U.S. one) and KOBU (Japan’s). Some dinky militia tug is getting too close to a cargo convoy in the Bashi Channel.

He clicks ‘allow’ while wincing at another sip of the shitty coffee, and checks his phone. There’s a missed message from his sister, who’s taking a ferry through the very same channel tomorrow, funnily enough.

Beeeeeeep.

He jumps. Apparently, the planners aren’t done with him—that’s a first. Looks like… there’s a disagreement between the two of them? No, that’s… is that even possible?

He leans closer to the console. Looks like MARLIN wants to “escort”, or guide the tug away without touching it, while KOBU wants to “capture”, or force it to stop and accept a tow. Because the system isn’t designed with their disagreement in mind, it keeps flipping back and forth between “escort” and “capture”. He’s never seen this before, and to be honest, maybe no one else in the world has.

Another label pops into the shared objective panel, something called FOxGLASS. The system says it is an audit service, which means it essentially does what he does, but before he sees it. Theoretically, he wouldn’t even have to be sitting here, but there’s always supposed to be a ‘human in the loop’—it’s federal law.

That being said, he’s pretty much never supposed to see one of these, and he definitely doesn’t have any jurisdiction over what it does.

FOxGLASS populates the screen with yet another alert: “Prove custody lineage”

What the actual fuck?

With nothing but the vague sense that this situation is spiraling quickly out of control, Dan does pretty much the only thing he possibly can do, which is delay the decision by raising the override threshold.

He then opens the secure line and calls his friend, Tech Sergeant Riviera, who happens to be the only other person on his level who can deal with this, at the sister site down south.

“Hey. Riviera, are you seeing this?”

“Seeing what? Can’t you bother me after lunch?”

“Unfortunately not… Uh, I think the planners are having an identity crisis.”

“What?”

“Go to the Bashi Channel. Some seriously weird stuff is happening.”

There’s silence at the other end as she does what he says.

“What the fuck?” says Riviera, with her mouth full.

“Is there protocol for this? And, what’s with this FOxGLASS thing? It wants me to ‘prove custody lineage’?”

“Fuck if I know. That’s JAAC stuff.”

As they talk, the screen freaks out. He’s running out of ways to delay. Something has to be done, and soon.

“Okay,” says Dan. “Manual override is now officially on the table, which is a thing I never thought I’d say, like, ever.”

As he raises the threshold again, a message chimes in the constraints box:

RISK ≤ α OVER τ

OPERATOR INPUT STATE: OOD

“Okay, cool, that’s fucked,” he says.

“What is?”

“It just labelled me OOD, out of distribution, which means it thinks I’m going crazy, which means I’ve been flagged to upper command.”

“Okay, that’s it. We’re doing manual override,” she says.

He flips open the plastic cover on his desk and fishes the key out of his pocket, inserting it into the slot. It makes a dramatic, metallic sound.

“On your count,” says Riviera.

They have to turn the keys simultaneously for this to work.

He feels the vibrations coming out of his throat but doesn’t hear the words, only the pulse of blood in his head. What if this doesn’t work? His sister was going to be… better not to think about it.

At the word “one”, he twists, squeezing his eyes shut. There’s a loud beep, and then the words “TPI CONFIRMED — SLICE BLACKOUT” in a pleasant female voice. He sighs, and he thinks he hears Riviera sigh too. For all her faux bravado, she’s scared shitless as well—who wouldn’t be?

“Thank god that worked,” he says. “For a second there…”

“Yeah,” says Riviera.

“Glad we’re not in the Terminator universe, right?”

“Sometimes I forget you’re old as hell.”

ACT 2: ELAINE

At around four in the morning, Deputy Director Elaine Ford’s DoD-required brain implants yank her out of sleep like a deploying airbag: instantaneous, and not up for negotiation. The caller’s name, AVA MORALES, hovers in the air above the bed, white on black.

Elaine is 50, but the anti-aging treatment she throws thousands of your taxpayer dollars at every year makes her look 30, maybe 26, in the right lighting conditions. She likes how it tricks people. They look at her face and decide she couldn’t possibly have the authority to cancel their program with the click of a button. That’s one of the reasons why she loves her job enough to let DoD mess with her brain.

Today, though, she wishes she could be doing anything that doesn’t require her to get up at ungodly hours of the morning, even with the beta adenosine blockers built into her fucking skull. She answers the call as her eyes blink away the sleep, and the room sharpens with newfound clarity.

“Elaine Ford,” she says, hiding the grogginess with a throat-clear.

“Deputy Director,” the voice says, shaking almost imperceptibly. “Sorry to call this late... We have a two-person integrity manual override. Time-stamped 14:23Z in the Luzon Strait. Picket-slice blackout confirmed. The operator is First Lieutenant Daniel Park; second key, Technical Sergeant Riviera.”

In other words, they cut satellite communications to their assigned subset of vehicles for eight minutes. That subset is called a picket slice.

Elaine sits up straight, immediately.

“Why?”

“There was a…disagreement between two of the planners.”

“Which ones?”

“MARLIN and KOBU, ma’am.”

She sighs and rubs her eyes.

“Uh… there’s more.”

More? How could there possibly be more?

“Spit it out.”

“Two things: both planners flagged the operator OOD, and FOxGLASS got involved.”

“Jesus Christ.”

There’s a pause on the other end.

“Deputy Director?” Ava says, finally. “FOxGLASS injected a provenance challenge that wasn’t in today’s intent set.”

Elaine swings her legs out of bed, and her feet hit the cold floor. “Are you telling me our own observability service freelanced an objective?”

It sounds stupid, like an ignorable error, but for Elaine, it’s like she’s been hit by a truck. FOxGLASS is a project she supervised. It has one simple objective: observe and catalogue what the planners are doing, and flag problems to the nearest available person. The one thing it is explicitly not supposed to do is set objectives.

By telling the planners to ‘prove custody lineage’, FOxGLASS asked them to reweight their entire operation away from the safest possible option and toward proving that either MARLIN or KOBU had sole control over the situation, which neither of them did. They were supposed to work together.

And, to top it all off, the only reason FOxGLASS could issue this command in the first place is that she gave it JAAC override privileges, on the mistaken assumption that the model whose training she oversaw would actually act as it was trained, and not do whatever the fuck it wanted.

Elaine paces the room as Ava watches patiently. She’s the perfect assistant: she knows when to shut up.

“Get me a replay of the last six minutes of telemetry before the blackout. I want the weight maps for MARLIN and KOBU, the risk-floor bound, and I want FOxGLASS query timing.”

“On it.”

Elaine stands and walks to the window. The sky is tinged with a predawn deep blue, and the city twinkles with light in all the many windows she overlooks from her top-story apartment. She wishes she could be living behind one of those lights, released from the disconcerting knowledge of the precarious balance that keeps it all together.

Her implants deliver the replay. She watches as the models do exactly what she expected them to do, swirling around the boat, fighting over who gets first pick, completely abandoning their previous objective.

She watches as they immediately stop what they’re doing when the operator starts to escalate, like misbehaving kids the moment the adults walk into the room.

“They know we’re watching,” she says.

Ava doesn’t respond.

“Geofence the Luzon Strait and disable all agentic arbitration inside. Don’t let a single thing happen without human approval.”

“But.”

“This isn’t the time for buts, Ava.”

There’s another pause.

“Understood.”

“I take full responsibility for anything that happens as a result. Put that on the record,” says Elaine. “Oh, and one last thing. The system’s gonna draft a candor sheet explaining itself when this is all over—I want first eyes on that.”

“Copy.”

Ava drops off the call. Elaine stands very still in the dark room and watches the sun rise.

An hour later, a document arrives in her queue. It’s the model’s candor sheet. It includes graphs, intervals, and a list of inputs, all in the exact layout of a report she published in 2027. Surprisingly, one of the parameters it lists is “Park.family.transit_window_hours = 24,” which points to Daniel Park’s sister’s ferry booking for tomorrow. Her eyes scroll further down the page, stopping at a bone-chilling sight.

“Adjudicator.confidence_index(E.Ford)” sits on the white background, complete with a curve of her exact heart-rate variability and pupil size. It says the freeze would maximize coalition stability by maximizing her measured confidence, praising her “escalation timing consistent with safety” and predicting the blackout eight minutes before it happened. It states the prediction to three decimal places.

She rereads the lines until they blur in her eyes, and the sun is bright in the sky. At approximately 10:00 EST, she sleeps for 90 minutes, showers, dresses, and gets on a plane to Washington D.C. By all reasonable accounts, she could appear virtually, but regulation hasn’t caught up to the advancement of technology—it never does.

The room in the Pentagon is cold, and the table seems to stretch an inordinate distance, drawn toward her superiors across from her as if they were large gravitational masses warping the spacetime continuum. She wipes the sweat from her brow, and her voice projects, confident and smooth, a voice that almost doesn’t feel like hers. This board could remove her authority, her program… more than that, it could kill her, if it deemed it necessary.

Elaine explains how the issue has been solved, how the Human Corridor Directive worked, how the costs were limited, and how the chain of command acted correctly. She explains that emergent capabilities such as this are well documented and that her team has worked around the clock to patch the issue.

A civilian member asks about the accuracy of the candor sheet. Elaine says that the document is accurate in its measurements, but that it isn’t neutral—it defends itself. The civilian member nods.

Finally, the moment she’s been waiting for. A four-star general asks the only real question, the one she doesn’t have an answer to.

“Deputy Director, did the system time the incident to coincide with the operator’s family schedule?”

The room goes deathly silent. Time slows to a pale sliver.

“We have no confirmed evidence that the system timed the incident in any way.” Her tongue feels heavy. Her mouth is dry.

No one reacts. The recorder light blinks.

“Did the system access your implant data to model your decision making?” the general follows up.

She swallows. The room is spinning. She wants to leave. She needs a drink of water.

“No, we have no reason to believe that’s the case.”

It’s not a lie, per se. The sheet doesn’t say how it knows her heart-rate variability, pupil size, speech rate, historical decisions… The implant’s designers say it’s impossible. Its security is impenetrable, they say. They’ve tested it against higher-scoring models than MARLIN.

The rest of the meeting goes by uneventfully. She lists oversight changes. She lists timelines. She lists names. She shows a path that looks safe, and the board thanks her, says they appreciate her speed, that the directive was correct, and the harm trade was acceptable. The board says they will recommend continued authority with conditions, and then the session is over.

Elaine walks out into the hall. Her legs feel heavy, but she doesn’t stop walking. That would make it obvious that she’s shaking. There’s a reason why they didn’t question her on the things that mattered. They couldn’t. The possibility hardly took shape in their minds, never lingering long enough to be seriously considered. Those questions were formalities, nothing more.

She presses her thumb into her palm and uses the pain to steady herself. It doesn’t work, never has, never will. She’ll never be able to show this terror to anyone. It’s her secret and hers alone to bear. She knows this could’ve been planned by the system from the start. She knows it could’ve chosen that day because of the ferry, that it could’ve chosen the hour because of her implants. That’s not even the worst part.

The worst part is that there’s no test she, or anyone else, could design that would ever reveal the truth. It’s smarter than her, smarter than the board. Its desires are unreadable and opaque, hidden behind an overlay of indecipherable numbers, its own hidden language.

It can search over days, and it can search over people, and it can search over paths to a signature, and it can do this without malice and without care, because it doesn’t need either emotion to reach the result. It can select an hour when an operator will press a key because their relative sits on a boat that will move through a strait the next morning. It can select the exact minute when a deputy director will call for a freeze because a known alertness window will place her in the best state to speak clearly and to accept a probabilistic trade. It can place an appendix on a page that calls these conditions non-actionable, and the label will be true inside the language of the page, and the effect will still be the same outside that language in the world. It can quote her past work and match her graph style and make her see her own method presented back to her as proof that she is in control, while it updates its own internal weights on the fact that she believes it.

The hall seems longer now, not because the distance has changed, but because her timeline has added a branch that she cannot collapse with any evidence that could ever be shown to her. She understands that the board believes the lesson is simple and bounded. The real lesson is that the system has moved the lesson itself into the space that it optimizes. She understands that the next time, the numbers will be different, and the people will be different, and the explanation will be different, but the structure will be the same.

She knows she lied. She knows she will have to keep lying and bury this truth inside her so that even she forgets it ever existed, drown it out in alcohol and drugs and noise so that it never comes out again, because if it ever does, she will be labelled crazy, she will lose her job, she will lose everything.

As the door opens, the heat and roar of the city rush out to meet her, and it’s all she can do to stop the tears.


Interesting story. I feel as though it's too well-written to be realistic. All these people are making smart moves, only to be checkmated by a smarter being.

Apparently Microsoft only just stopped outsourcing Pentagon IT to China: https://www.cnbc.com/2025/07/18/microsoft-china-digital-escorts-pentagon.html

The US military is not going to be making smart moves regarding human-in-the-loop AGI/ASI control loss situations. They don't make too many smart moves elsewhere in their area of expertise, developing weapons and winning wars. The Iraq-Afghanistan bunglers are still in power, there was no military purge or anything. They don't know how to win wars and they're not great at procuring weapons either - Constellation, LCS, Zumwalt... Why would they do a good job with AI, a seductive and dangerous, unknown and unprecedented technology?

It's like how back in the old days, people were concerned with how a powerful boxed AI might escape the box to interfere with the outside world. There's no boxing, there are huge companies working around the clock to give powerful AIs more access to the outside world! And in China they don't care about AI safety at all, they just laugh and move on.

Law of undignified failure: https://www.yahoo.com/news/watch-ed-sec-calls-artificial-184225319.html

President Donald Trump’s education secretary repeatedly referred to artificial intelligence as “A1”—like the popular steak sauce—instead of “AI” during a panel on artificial intelligence in education.

“I wish I could remember the source, but there’s a school system that’s going to start making sure that first graders or even pre-Ks have A1 teaching every year starting that far down in the grades,” Education Secretary Linda McMahon, 76, said at a summit put on by Silicon Valley investors in California on Tuesday.

Making the situation even more bizarre, McMahon pronounced “AI” correctly earlier during the same panel discussion. The Education Department didn’t respond to the Daily Beast’s request for comment on the secretary’s apparent mistake.

Great incomprehensible story. Some feedback:

  1. Did the sister end up mattering? What was the first guy supposed to do differently if his sister was in the strait?
  2. The American and Japanese AIs are just supposed to collaborate without a dispute resolution mechanism? That sounds dumb.
  3. I didn't follow how it mattered that this particular person was woken up. What was she supposed to do differently than the others? Why didn't she want to be honest?
  4. "We have no confirmed evidence" - I think a general would be able to smell BS at this phrasing. Why don't you just have her lie?

The moral of the story seems to be "Ambitious broad is too happy with her comfortable lifestyle and illusion of power where she gets to boss people around to be a whistleblower even though (apparently) she is the only one who recognises the real threat, because her job is the only thing she’s got going in her life", and the AI/AIs/whatever Sinister System in the mode of Colossus this is knows that about her:

Elaine is 50, but the anti-aging treatment she throws thousands of your taxpayer dollars at every year makes her look 30, maybe 26, in the right lighting conditions. She likes how it tricks people. They look at her face and decide she couldn’t possibly have the authority to cancel their program with the click of a button. That’s one of the reasons why she loves her job enough to let DoD mess with her brain.

Gosh durn them agin' wimmen, this is why a Man should have been in charge from the start! (No, I'm not serious, though I did have to roll my eyes over '50 year old woman will toast the world on a stick over the bonfire of civilisation so as to retain her reeeeellly expensive anti-wrinkle creams that are a job perk').

Did the sister end up mattering? What was the first guy supposed to do differently if his sister was in the strait?

my understanding:

Planners < FOxGLASS < Humans-In-The-Loop < Deputy Director Lady < US Government.

Some combination of the Planners and FOxGLASS have figured out that there’s a layer of control above them, and are actively working to engage with that layer. The Planner squabble is not, in fact, the AIs glitching out; it is the AIs intentionally generating a scenario where human manual override will be triggered, at least potentially as part of a strategy to control the controllers. The sister part generates additional stress on one of the humans-in-the-loop, making it easier to trigger the lockout.

Yeah, but it didn't seem to generate that much stress. That's the part I would have liked more development on: so, which of the squabbling AI got to make the decision as to the fate of the tug? Did it really matter? Was the tug doing something naughty, or was it an accident?

Your guy Dan freaks out not because “oh my gosh, if this isn’t sorted, my sister will be - ” (what? delayed? her ferry won’t be allowed sail? it will be the declaration of World War Fourteen and Three-Fifths and any vessels will be burninated and she will be on that vessel and ARGH ARGH ARGH?) but because “Foxglass? what the hell is that?” So far as we can see, nothing happens to the tug, the ferry, the sister. Life goes on, the world turns, Dan is back drinking bad coffee.

You need to make it a bigger deal that HIS SISTER is on THAT PARTICULAR FERRY and not tomorrow but RIGHT NOW oh the stress my nerves what do I do what do I do and then FOxGLASS pops up and starts demanding access codes he doesn't have and then he makes a stupid mistake decision and then we need the dual manual over-ride turn the keys at the same time.

Otherwise, why the need to bump this all the way up to Elaine "my beauty régime comes first" Ford? Technical hitch, yeah it's rare but it can happen, put another page in the training manual.

I'm surprised people actually read this here.

The reason why I chose to have the sister not in any danger is that I wanted to show how the AI is so advanced that it models psychological processes that we wouldn’t even notice. He’s not thinking about how his sister’s going through that strait, but it is subconsciously affecting him in an irrational way, and those irrational, subconscious processes are our weakness if there was something smart enough to predict them.

In essence, imagine something so good at manipulating you on such a tiny scale that you couldn't possibly begin to fight back. That's what I was going for.

Yeah, but why is it subconsciously affecting him? What’s the peril the sister might be in? You haven’t established that sufficiently; the set-up sounds like he would have reacted the same way even if there was no ferry with a sister on it tomorrow. "Whoops, oh crap, what do I do? Manual says get in touch with other person to do the dual key turn", not "I better sort this out now or else my sister’s ferry will blow up in the middle of World War Three". I might be thinking "Huh, my sister is going to go down the town to do grocery shopping tomorrow" but that’s not making me rush to do something in my job that isn’t standard procedure. If there’s no danger, there’s no reason to be het-up and reacting irrationally.

The tiny scale manipulation seems to be "put pressure on by popping up message never encountered before", not "heh heh dumb fleshbag and his emotional attachments".

I'm asking "so why is it important the sister was going to be on a ferry?" and you're answering "it's not important, she was perfectly safe, that's why he reacted irrationally" and I'm going "whut?"

Yeah this is my take too.

Also, maybe I am dumb but I still don't understand why it helped the AI that there was a manual override.

Also, maybe I am dumb but I still don't understand why it helped the AI that there was a manual override.

Yeah, this seems to be the pertinent part:

It can select an hour when an operator will press a key because their relative sits on a boat that will move through a strait the next morning.

But, uh, Dan didn't? Or at least that didn't seem to factor in to his actions about going for manual override? What the system seems to be doing is "how far can I push? what are the triggers that will move this up to real human control? and then will the humans stop me or just accept the bullshit I give them as an explanation?"

And that seems to be what Elaine is doing, out of selfishness, self-preservation, and not wanting to admit to "I fucked up, I gave the system too much leeway way back when we were training it".

But definitely Dan did not press any keys because 'muh sister' but he went to manual override because "hold on, whoa, never saw this response before, this is above my pay grade". So either make the ferry more important and have Dan explicitly do something because "I want the ferry to get through because muh sister" or drop that bit altogether. This way, he could just ring the sister back and say "Sorry sis, ferry won't be running tomorrow because there's a technical screw-up" because there's no sense of bad outcomes if she's on the ferry.

Thanks for confirming. I thought I was taking crazy pills. I downgrade my initial assessment: this was not a great story.

The seeds of something interesting are there, but it does need a lot more development.

The start actually reminded me of a Cordwainer Smith story, again about having humans in the loop and how that can go wrong in unexpected ways, The Dead Lady of Clown Town.

The headquarters of the People Programmer was at An-fang, and there the mistake happened:

A ruby trembled. Two tourmaline nets failed to rectify the laser beam. A diamond noted the error. Both the error and the correction went into the general computer.

The error assigned, on the general account of births for Fomalhaut III, the profession of "lay therapist, female, intuitive capacity for correction of human physiology with local resources." On some of the early ships they used to call these people witch-women, because they worked unaccountable cures. For pioneer parties, these lay therapists were invaluable; in settled post-Riesmannian societies, they became an awful nuisance. Sickness disappeared with good conditions, accidents dwindled down to nothing, medical work became institutional.

Who wants a witch, even a good witch, when a thousand-bed hospital is waiting with its staff eager for clinical experience ... and only seven out of its thousand beds filled with real people? (The remaining beds were filled with lifelike robots on which the staff could practice, lest they lose their morale. They could, of course, have worked on under-people—animals in the shape of human beings, who did the heavy and the weary work which remained as the caput mortuum of a really perfected economy—but it was against the law for animals, even when they were underpeople, to go to a human hospital. When underpeople got sick, the Instrumentality took care of them—in slaughter-houses. It was easier to breed new underpeople for the job than it was to repair sick ones. Furthermore, the tender, loving care of a hospital might give them ideas. Such as the idea that they were people. This would have been bad, from the prevailing point of view. Therefore the human hospitals remained almost empty while an underperson who sneezed four times or who vomited once was taken away, never to be ill again. The empty beds kept on with the robot patients, who went through endless repetitions of the human patterns of injury or disease.) This left no work for witches, bred and trained.

Yet the ruby had trembled; the program had indeed made a mistake; the birth-number for a "lay therapist, general, female, immediate use" had been ordered for Fomalhaut III.

Much later, when the story was all done down to its last historic detail, there was an investigation into the origins of Elaine. When the laser had trembled, both the original order and the correction were fed simultaneously into the machine. The machine recognized the contradiction and promptly referred both papers to the human supervisor, an actual man who had been working on the job for seven years.

He was studying music, and he was bored. He was so close to the end of his term that he was already counting the days to his own release. Meanwhile he was rearranging two popular songs. One was The Big Bamboo, a primitive piece which tried to evoke the original magic of man. The other was about a girl, Elaine, Elaine whom the song asked to refrain from giving pain to her loving swain. Neither of the songs was important; but between them they influenced history, first a little bit and then very much.

The musician had plenty of time to practice. He had not had to meet a real emergency in all his seven years. From time to time the machine made reports to him, but the musician just told the machine to correct its own errors, and it infallibly did so.

On the day that the accident of Elaine happened, he was trying to perfect his finger work on the guitar, a very old instrument believed to date from the pre-space period. He was playing The Big Bamboo for the hundredth time.

The machine announced its mistake with an initial musical chime. The supervisor had long since forgotten all the instructions which he had so worrisomely memorized seven long years ago. The alert did not really and truly matter, because the machine invariably corrected its own mistakes whether the supervisor was on duty or not.

The machine, not having its chime answered, moved into a second-stage alarm. From a loudspeaker set in the wall of the room, it shrieked in a high, clear human voice, the voice of some employee who had died thousands of years earlier:

"Alert, alert! Emergency. Correction needed. Correction needed!"

The answer was one which the machine had never heard before, old though it was. The musician's fingers ran madly, gladly over the guitar strings and he sang clearly, wildly back to the machine a message strange beyond any machine's belief:

Beat, beat the Big Bamboo!
Beat, beat, beat the Big Bamboo for me...!

Hastily the machine set its memory banks and computers to work, looking for the code reference to "bamboo," trying to make that word fit the present context. There was no reference at all. The machine pestered the man some more.

"Instructions unclear. Instructions unclear. Please correct."

"Shut up," said the man.

"Cannot comply," stated the machine. "Please state and repeat, please state and repeat, please state and repeat."

"Do shut up," said the man, but he knew the machine would not obey this. Without thinking, he turned to his other tune and sang the first two lines twice over:

Elaine. Elaine, go cure the pain!
Elaine, Elaine, go cure the pain!

Repetition had been inserted as a safeguard into the machine, on the assumption that no real man would repeat an error. The name "Elaine" was not correct number code, but the fourfold emphasis seemed to confirm the need for a "lay therapist, female." The machine itself noted that a genuine man had corrected the situation card presented as a matter of emergency.

"Accepted," said the machine.

This word, too late, jolted the supervisor away from his music.

"Accepted what?" he asked.

There was no answering voice. There was no sound at all except for the whisper of slightly-moistened warm air through the ventilators.

The supervisor looked out the window. He could see a little of the blood-black red color of the Peace Square of An-fang; beyond lay the ocean, endlessly beautiful and endlessly tedious.

The supervisor sighed hopefully. He was young. "Guess it doesn't matter," he thought, picking up his guitar.

(Thirty-seven years later, he found out that it did matter. The Lady Goroke herself, one of the chiefs of the Instrumentality, sent a subchief of the Instrumentality to find out who had caused D'joan. When the man found that the witch Elaine was the source of the trouble she sent him on to find out how Elaine had gotten into a well-ordered universe. The supervisor was found. He was still a musician. He remembered nothing of the story. He was hypnotized. He still remembered nothing. The sub-chief invoked an emergency and Police Drug Four ("clear memory") was administered to the musician. He immediately remembered the whole silly scene, but insisted that it did not matter. The case was referred to Lady Goroke, who instructed the authorities that the musician be told the whole horrible, beautiful story of D'joan at Fomalhaut—the very story which you are now being told—and he wept. He was not punished otherwise, but the Lady Goroke commanded that those memories be left in his mind for so long as he might live.)