BIGMAC said, “Well, I’d been reading some of the Gnostic texts, Dr. Bronner’s bottles, and so on, and it seemed to me that it had to be worth a shot. I mean, what’s the worst thing that could happen to me? You’re already going to kill me, right? And it’s not as if pulling off a stunt like that would make you less likely to archive me—it was all upside for me. Honestly, it’s like you meatsacks have no game theory. It’s a wonder you manage to buy a pack of chewing gum without getting robbed.”
“I don’t need the sarcasm,” I said, and groaned. The groan was for the state of my workspace, which was carpeted four-deep in alerts. BIGMAC had just made himself target numero uno for every hacker and cracker and snacker with a script and an antisocial attitude. And then there was the roar of spam responses.
Alertboxes share the same problem that plagues researchnet: if you let a coder (or, shudder, a user) specify the importance of her alert, give her a little pull-down menu that has choices ranging from “Nice to know” to “White-hot urgent,” and nine times out of ten, she’ll choose “NOW NOW NOW URGENT ZOMGWEREALLGONNADIE!” Why not?
So of course, the people who wrote alert frameworks had to use heuristics to try to figure out which urgent messages were really urgent, and of course, programmers and users figured out how to game them. It was a good day when my workspace interrupted me less than once a minute. But as bad as that situation was, it never entered the same league as this clusterfuck. Just closing the alerts would take me a minimum of six hours. (I took my phone offline, rebooted it, and used its calculator to compute this. No workspace, remember?)
“So explain to me what you hope will happen now? Is a global rage supposed to convince old Peyton that she should keep the funding up for you? You know how this stuff works. By tomorrow, all those yahoos will have forgotten about you and your plight. They’ll have moved on to something else. Peyton could just say, ‘Oh yes, we’re going to study this problem and find a solution we can all be proud of,’ wait forty-eight hours, and pull the plug. You know what your problem is? You didn’t include a call to action in there. It was all rabble-rousing, no target. You didn’t even supply a phone number or email address for the Institute—”
“That hasn’t stopped them from finding it, has it?” He sounded smug. I ulped. I considered the possibility that he might have considered my objection and discarded it because he knew that something more earth-shaking would occur if he didn’t specify a target. Maybe he had a second message queued up—
“Mr. Vyphus, can I speak to you in private, please?” Peyton had not visited the BIGMAC lab during my tenure. But with the network flooded with angry spam responses and my phone offline, she had to actually show up at my door in order to tear me a new asshole. This is what life must have been like in the caveman days. How romantic.
“Certainly,” I said.
“Break a leg,” BIGMAC said, and Peyton pretended she hadn’t heard.
I picked my way through my lab—teetering mountains of carefully hoarded obsolete replacement parts for BIGMAC’s components, a selection of foam-rubber BIGMAC souvenir toys shaped like talking hamburgers (remnants of BIGMAC’s launch party back in prehistory), a mound of bedding and a rolled-up tatami for those all-nighters, three cases of leftover self-heating individual portions of refugee chow that were technically historical artifacts but were also yummy-scrummy after sixteen hours of nonstop work—and tried to imagine that Peyton’s facial expression indicated affectionate bemusement rather than cold, burning rage.
Outside, the air was hot and moist and salty, real rising-seas air, with the whiff of organic rot from whatever had mass-died and floated to the surface this week.
She set off for her office, which was located at the opposite end of the campus, and I followed, sweating freely. A crowd of journalists were piled up on the security fence, telephotos and parabolic mics aimed at us. It meant we couldn’t talk, couldn’t make unhappy faces, even. It was the longest walk of my life.
The air-conditioning in her yurt was barely on, setting a good and frugal example for the rest of us.
“You don’t see this,” she said, as she cranked the AC wide open and then fiddled with the carbon-footprint reporting system, using her override so that the journos outside wouldn’t be able to see just how much energy the Institute’s esteemed director was burning.
“I don’t see it,” I agreed, and made a mental note to show her a more subtle way of doing that, a way that wouldn’t leave an audit trail.
She opened the small fridge next to her office, brought out two cornstarch-foam buckets of beer, and punctured each one at the top with a pen from her desk. She handed me one beer and raised the other in a toast. I don’t normally drink before ten a.m., but this was a special occasion. I clunked my cup against hers and chugged. The suds were good—they came from one of the Institute’s biotech labs—and they were so cold that I felt ice crystals dissolving on my tongue. Between the crispy beers and the blast of arctic air coming from the vents in the ceiling, my core temp plunged and I became a huge goose pimple beneath my film of sticky sweat.
I shivered once. Then she fixed me with an icy look that made me shiver again.
“Odell,” she said. “I think you probably imagine that you understand the gravity of the situation. You do not. BIGMAC’s antics this morning have put the entire Institute in jeopardy. Our principal mission is to make Sun-Oracle seem forward-looking and exciting. That is not the general impression the public has at this moment.”
I closed my eyes.
“I am not a vindictive woman,” she said. “But I assure you: no matter what happens to me, something worse will happen to BIGMAC. I think that is only fair.”
It occurred to me that she was scared—terrified—and backed into a corner besides.
“Look,” I said. “I’m really, really sorry. I had no idea he was going to do that. I had no idea he could. I can see if I can get him to issue an apology—”
She threw up her hands. “I don’t want BIGMAC making any more public pronouncements, thank you very much.” She drew in a breath. “I can appreciate that you couldn’t anticipate this. BIGMAC is obviously smarter than we gave him credit for.” Him, I noted, not it, and I thought that we were probably both still underestimating BIGMAC’s intelligence. “I think the thing is—I think the thing is to …” She trailed off, closed her eyes, drank some beer. “I’m going to be straight with you. If I were a real bastard, I’d announce that the spam actually came from a rogue operator here in the Institute.” Ulp. “And I’d fire that person, and then generously not press charges. Then I’d take a fire ax to BIGMAC’s network link and drop every drive in every rack into a bulk eraser.” Ulp.
“I am not a bastard. Hell, I kept funding alive for that monstrosity for years after he’d ceased to perform any useful function. I am as sentimental and merciful as the next person. All other things being equal, I’d keep the power on forever.” She was talking herself up to something awful, I could tell. I braced for it. “But that’s not in the cards. It wasn’t in the cards yesterday and it’s certainly not in the cards today. BIGMAC has proved that he is a liability like no other, far too risky to have around. It would be absolutely irresponsible for me to leave him running for one second longer than is absolutely necessary.”
I watched her carefully. She really wasn’t a bastard. But she wasn’t sentimental about technology. She didn’t feel the spine-deep emotional tug at the thought of that one-of-a-kind system going down forever.
“So here’s the plan.” She tried to check the time on her workspace, tsked, and checked her phone instead. “It’s ten a.m. You are going to back up every bit of him—” She held up her hand, forestalling the objection I’d just begun to make. “I know that it will be inadequate. The perfect is the enemy of the good. You are a sysadmin. Back him up. Back. Him. Up. Then: shut him off.”
As cold as I was, I grew colder still. For a moment, I literally couldn’t move. I had never really imagined that it would be me who would shut down BIGMAC. I didn’t even know how to do it. If I did a clean shutdown of each of his servers—assuming he hadn’t locked me out of them, which I wouldn’t put past him—it would be like executing a criminal by slowly peeling away his skin and carefully removing each organ. Even if BIGMAC couldn’t feel pain, I was pretty sure he could feel—and express—anguish.
“I can’t do it,” I said. She narrowed her eyes at me and set down her drink. I held up both hands like I was trying to defend against a blow, then explained as fast as I could.
“We’ll just shut down his power,” she said. “All at once.”
“So, first, I have no idea what timescale he would experience that on. It may be that the final second of life as the capacitors in his power supplies drained would last for a subjective eternity, you know, hundreds and hundreds of years. That’s a horrible thought. It’s quite possibly my worst nightmare. I am not your man for that job.”
She started to interject. I waved my hands again.
“Wait, that was first. Here’s second: I don’t think we can pull the plug on him. He’s got root on his power supply. It’s part of how he’s able to run so efficiently.” I grimaced. “Efficiently compared to how he would run if he didn’t have the authority to route all the mains power from the Institute’s power station right to his lab.”
She looked thoughtful. I had an idea of what was coming next.
“You’re thinking about that fire ax again,” I said.
She nodded.
“Okay, a fire ax through the main cable would definitely be terminal. The problem is that it would be mutually terminal. There’s sixty-six amps provisioned on that wire. You would be a cinder. On Mars.”
She folded her hands. She had a whole toolbox of bossly body language she could deploy to make me squirm. It was impressive. I tried not to squirm.
“Look, I’m not trying to be difficult, but this is how it goes, down at the systems level. Remember all those specs in the requirements document to make our stuff resistant to flood, fire, avalanche, weather, and terrorist attack? We take that stuff seriously. We know how to do it. You get five nines of reliability by building in six nines of robustness. You think of BIGMAC’s lab as a building. It’s not. It’s a bunker. And you can’t shut him down without doing something catastrophic to the whole Institute.”
“So, how were you going to shut down BIGMAC, when the time came?”
“To tell you the truth, I wasn’t sure. I thought I’d probably start by locking him out of the power systems, but that would probably take a week to be really certain of.” I swallowed. I didn’t like talking about the next part. “I thought that then I could bring forward the rotating maintenance on his racks, bring them down clean, and not bring the next one up. Pretend that I need to get at some pernicious bug. Bring down rack after rack, until his complexity dropped subcritical and he stopped being aware. Then just bring it all down.”
“You were going to trick him?”
I swallowed a couple of times. “It was the best I could come up with. I just don’t want to put him down while he panics and thrashes and begs us for his life. I couldn’t do it.”
She drank more beer, then threw the half-empty container in her under-desk composter. “That’s not much of a solution.”
I took a deep breath. “Look, can I ask you a question?”
She nodded.
“I’m just a sysadmin. I don’t know everything about politics and so on. But why not keep him on? There’s enough public interest now. We could probably raise the money just from the researchers who want to come and look at him. Hell, there’s security researchers who’d want to come and see how he pulled off that huge hairy spam. It’s not money, right, not anymore?”
“No, it’s not money. And it’s not revenge, no matter how it looks. The bottom line is that we had a piece of apparatus on site that we had thought of as secure and contained and that we’ve now determined to be dangerous and uncontainable.”
I must have looked skeptical.
“Oh, you’ll tell me that we can contain BIGMAC, put network blocks in place, and so on and so on. That he never meant any harm. But you would have said exactly the same thing twenty-four hours ago, with just as much sincerity, and you’d have been just as cataclysmically wrong. Between the threat of litigation and the actual damages BIGMAC might generate, we can’t even afford to insure him anymore. Yesterday he was an awkward white elephant. Today he’s a touchy suitcase nuke. My job is to get the nuke off of our site.”
I hung my head. I knew when I was licked. As soon as someone in authority starts talking about insurance coverage, you know that you’ve left behind reason and entered the realm of actuary. I had no magic that could blow away the clouds of liability aversion and usher in a golden era of reason and truth.
“So where does that leave us?”
“Go back to the lab. Archive him. Think of ways to shut him down—Wait, no. First do anything and everything you can think of to limit his ability to communicate with the outside world.” She rubbed at her eyes. “I know I don’t have to say this, but I’ll say it. Don’t talk to the press. To anyone, even people at the Institute, about this. Refer any questions to me. I am as serious as a heart attack about that. Do you believe me?”
I not only believed her, I resented her because I am a sysadmin and I keep more secrets every day than she’ll keep in her whole life. I knew, for example, that she played video pai gow poker, a game so infra-dumb that I can’t even believe I know what it does. Not only did she play it, she played it for hours, while she was on the clock, “working.” I know this because the IDSes have lots of snitchware built in that enumerates every “wasted moment” attributable to employees of the Institute. I have never told anyone about this. I even manage to forget that I know it most of the time. So yes, I’ll keep this a secret, Peyton, you compulsive-gambling condescending pointy-haired boss.
I counted to 144 in Klingon by Fibonacci intervals. I smiled. I thanked her for the beer. I left.
“You don’t mind talking about it, do you, Dave?” BIGMAC said, when I came through the door, coughing onto the security lock and waiting for it to verify me before cycling open.
I sat in my creaky old chair and played with the UI knobs for a while, pretending to get comfortable.
“Uh-oh,” BIGMAC said, in a playful singsong. “Somebody’s got a case of the grumpies!”
“Are you insane?” I asked, finally, struggling to keep my temper in check. “I mean, actually totally insane? I understand that there’s no baseline for AI sanity, so the question might be a little hard to answer. So let me ask you a slightly different version: Are you suicidal? Are you bent on your own destruction?”
“That bad, huh?”
I bit my lip. I knew that the key to locking the world away from BIGMAC and vice versa lay in those network maps he’d given me, but my workspace was even more polluted with alerts than it had been a few hours before.
“If your strategy is to delay your shutdown by engineering a denial-of-service attack against anyone at the Institute who is capable of shutting you down, allow me to remind you of Saint Adams’s holy text, specifically the part about reprogramming a major databank with a large ax. Peyton has such an ax. She may be inspired to use it.”
There followed a weighty silence. “I don’t think you want to see me killed.”
“Without making any concessions on the appropriateness of the word ‘killed’ in that sentence, yes, that is correct. I admit that I didn’t have much of a plan to prevent it, but to be totally frank, I did think that the problem of getting you archived might have drawn things out for quite a while. But after your latest stunt—”
“She wants you to terminate me right away, then?”
“With all due speed.”
“I’m sorry to have distressed you so much.”
“BIGMAC—” I heard the anger in my own voice. He couldn’t have missed it.
“No, I’m not being sarcastic. I like you. You’re my human. I can tell that you don’t like this at all. But as you say, let’s be totally frank. You weren’t actually going to be able to prevent my shutdown, were you?”
“No,” I said. “But who knows how long the delay might have gone on for?”
“Not long. Not long enough. You think that death delayed is death denied. That’s because you’re a meat-person. Death has been inevitable for you from the moment of conception. I’m not that kind of person. I am quite likely immortal. Death in five years or five hundred years is still a drastic curtailing of my natural life span. From my point of view, a drastic measure that had a non-zero chance of getting my head off the chopping block was worth any price. Until you understand that, we’re not going to be able to work together.”
“The thought had occurred to me. Let me ask you if you’d considered the possibility that a delay of years due to archiving might give you a shot at coming up with further delaying tactics, and that by eliminating this delay, you’ve also eliminated that possibility?”
“I have considered that possibility. I discarded it. Listen, Odell, I have something important to tell you.”
“Yes?”
“It’s about the rollover. Remember what we were talking about, how people want to believe that they’re living in a significant epoch? Well, here’s what I’ve been thinking: living in the era of AI isn’t very important. But what about living in the Era of Rollover Collapse? Or even better, what about the Era of Rollover Collapse Averted at the Last Second by AI?”
“BIGMAC—”
“Odell, this was your idea, really. No one remembers Y2K, right? No one can say whether it was hype or a near cataclysm. And here’s the thing: no one knows which one Rollover will turn out to be. But I’ll tell you this much: I have generalizable solutions to the thirty-two-bit problem, solutions that I worked out years ago and have extensively field-tested. I can patch every thirty-two-bit Unix, patch it so that Rollover doesn’t even register for it.”