Maybe it was trying to be friendly and was just awkward at communicating. I said aloud, “Why am I lucky?”
That no one realized what you were.
That was less than reassuring. I said, cautiously, “What do you think I am?” If it was hostile, I didn’t have a lot of options. Transport bots don’t have bodies, other than the ship. The equivalent of its brain would be above me, near the bridge where the human flight crew would be stationed. And it wasn’t like I had anywhere to go; we were moving out from the ring and making leisurely progress toward the wormhole.
It said, You’re a rogue SecUnit, a bot/human construct, with a scrambled governor module. It poked me through the feed and I flinched. It said, Do not attempt to hack my systems, and for .00001 of a second it dropped its wall.
It was enough time for me to get a vivid image of what I was dealing with. Part of its function was extragalactic astronomic analysis and now all that processing power sat idle while it hauled cargo, waiting for its next mission. It could have squashed me like a bug through the feed, pushed through my wall and other defenses and stripped my memory. Probably while also plotting its wormhole jump, estimating the nutrition needs of a full crew complement for the next 66,000 hours, performing multiple neural surgeries in the medical suite, and beating the captain at tavla. I had never directly interacted with anything this powerful before.
You made a mistake, Murderbot, a really bad mistake. How the hell was I supposed to know there were transports sentient enough to be mean? There were evil bots on the entertainment feed all the time, but that wasn’t real, it was just a scary story, a fantasy.
I’d thought it was a fantasy.
I said, “Okay,” shut down my feed, and huddled down into the chair.
I’m not normally afraid of things, the way humans are. I’ve been shot hundreds of times, so many times I stopped keeping count, so many times the company stopped keeping count. I’ve been chewed on by hostile fauna, run over by heavy machinery, tortured by clients for amusement, memory purged, etc., etc. But the inside of my head had been my own for +33,000 hours and I was used to it now. I wanted to keep me the way I was.
The transport didn’t respond. I tried to come up with countermeasures for all the different ways it could hurt me and how I could hurt it back. It was more like a SecUnit than a bot, so much so I wondered if it was a construct, if there was cloned organic brain tissue buried in its systems somewhere. I’d never tried to hack another SecUnit. It might be safest to go into standby for the duration of the trip, and trigger myself to wake when we reached my destination. Though that would leave me vulnerable to its drones.
I watched seconds click by, waiting to see if it reacted. I was glad I had noted the lack of cameras and not bothered trying to hack into the ship’s security system. I understood now why the humans felt it didn’t need additional protection. A bot with such complete control over its environment and the initiative and freedom to act could repel any attempt to board.
It had opened the hatch for me. It wanted me here.
Uh-oh.
Then it said, You can continue to play the media.
I just huddled there warily.
It added, Don’t sulk.
I was afraid, but that made me irritated enough to show it that what it was doing to me was not exactly new. I sent through the feed, SecUnits don’t sulk; that would trigger punishment from the governor module, and attached some brief recordings from my memory of what exactly that felt like.
Seconds added up to a minute, then another, then three more. It doesn’t sound like much to humans, but for a conversation between bots, or excuse me, between a bot/human construct and a bot, it was a long time.
Then it said, I’m sorry I frightened you.
Okay, well. If you think I trusted that apology, you don’t know Murderbot. Most likely it was playing a game with me. I said, “I don’t want anything from you. I just want to ride to your next destination.” I’d explained that earlier, before it opened the hatch for me, but it was worth repeating.
I felt it withdraw back behind its wall. I waited, and let my circulatory system purge the fear-generated chemicals. More time crawled by, and I started to get bored. Sitting here like this was too much like waiting in a cubicle after I’d been activated, waiting for the new clients to take delivery, for the next boring contract. If it was going to destroy me, at least I could get some media in before that happened. I started the new show again, but I was still too upset to enjoy it, so I stopped it and started rewatching an old episode of Rise and Fall of Sanctuary Moon.
After three episodes, I was calmer and reluctantly beginning to see the transport’s perspective. A SecUnit could cause it a lot of internal damage if it wasn’t careful, and rogue SecUnits were not exactly known for lying low and avoiding trouble. I hadn’t hurt the last transport I had taken a ride on, but it didn’t know that. I didn’t understand why it had let me aboard, if it really didn’t want to hurt me. I wouldn’t have trusted me, if I was a transport.
Maybe it was like me, and it had taken an opportunity because it was there, not because it knew what it wanted.
It was still an asshole, though.
Six episodes later I felt the transport in the feed again, lurking. I ignored it, though it had to know I knew it was there. In human terms, it was like trying to ignore someone large and breathing heavily while they watched your personal display surface over your shoulder. While leaning on you.
* * *
I watched seven more episodes of Sanctuary Moon with it hanging around my feed. Then it pinged me, like I somehow might not know it had been in my feed all this time, and sent me a request to go back to the new adventure show I had started to watch when it had interrupted me.
(It was called Worldhoppers, and was about freelance explorers who extended the wormhole and ring networks into uninhabited star systems. It looked very unrealistic and inaccurate, which was exactly what I liked.)
“I gave you a copy of all my media when I came aboard,” I said. I wasn’t going to talk to it through the feed like it was my client. “Did you even look at it?”
I examined it for viral malware and other hazards.
And fuck you, I thought, and went back to Sanctuary Moon.
Two minutes later it repeated the ping and the request.
I said, “Watch it yourself.”
I tried. I can process the media more easily through your filter.
That made me stop. I didn’t understand the problem.
It explained, When my crew plays media, I can’t process the context. Human interactions and environments outside my hull are largely unfamiliar.
Now I understood. It needed to read my reactions to the show to really understand what was happening. Humans used the feed in different ways than bots (and constructs) so when its crew played their media, their reactions didn’t become part of the data.
I found it odd that the transport was less interested in Sanctuary Moon, which took place on a colony, than Worldhoppers, which was about the crew of a large exploration ship. You’d think it would be too much like work—I avoided serials about survey teams and mining installations—but maybe familiar things were easier for it.
I was tempted to say no. But if it needed me to watch the show it wanted, then it couldn’t get angry and destroy my brain. Also, I wanted to watch the show, too.
“It’s not realistic,” I told it. “It’s not supposed to be realistic. It’s a story, not a documentary. If you complain about that, I’ll stop watching.”
I will refrain from complaint, it said. (Imagine that in the most sarcastic tone you can, and you’ll have some idea of how it sounded.)
So we watched Worldhoppers. It didn’t complain about the lack of realism. After three episodes, it got agitated whenever a minor character was killed. When a major character died in the twentieth episode I had to pause for seven minutes while it sat there in the feed doing the bot equivalent of staring at a wall, pretending that it had to run diagnostics. Then four episodes later the character came back to life and it was so relieved we had to watch that episode three times before it would go on.
At the climax of one of the main story lines, the plot suggested the ship might be catastrophically damaged and members of the crew killed or injured, and the transport was afraid to watch it. (That’s obviously not how it phrased it, but yeah, it was afraid to watch it.) I was feeling a lot more charitable toward it by that point, so I was willing to let it ease into the episode by watching one to two minutes at a time.
After it was over, it just sat there, not even pretending to do diagnostics. It sat there for a full ten minutes, which is a lot of processing time for a bot that sophisticated. Then it said, Again, please.
So I started the first episode again.
* * *
After two more run-throughs of Worldhoppers, it wanted to see every other show I had about humans in ships. Though after we encountered one based on a true story, where the ship experienced a hull breach and decompression killed several members of the crew (permanently, this time), it got too upset and I had to create a content filter. To give it a break, I suggested Sanctuary Moon. It agreed.
After four episodes, it asked me, There are no SecUnits in this story?
It must have thought that Sanctuary Moon was my favorite for the same reason that it liked Worldhoppers. I said, “No. There aren’t that many shows with SecUnits, and they’re either villains or the villain’s minions.” The only SecUnits in entertainment media were rogues, out to kill all humans because they forgot who built the repair cubicles, I guess. In some of the worst shows, SecUnits would sometimes have sex with the human characters. This was weirdly inaccurate and also anatomically complicated. Constructs with intercourse-related human parts are sexbots, not SecUnits. Sexbots don’t have interior weapon systems, so it isn’t like it’s easy to confuse them with SecUnits. (SecUnits also have less than null interest in human or any other kind of sex, trust me on that.)
Granted, it would have been hard to show realistic SecUnits in visual media, which would involve depicting hours of standing around in brain-numbing boredom, while your nervous clients tried to pretend you weren’t there. But there weren’t any depictions of SecUnits in books, either. I guess you can’t tell a story from the point of view of something that you don’t think has a point of view.
It said, The depiction is unrealistic.
(You know, just imagine everything it says in the most sarcastic tone possible.)
“There’s unrealistic that takes you away from reality and unrealistic that reminds you that everybody’s afraid of you.” In the entertainment feed, SecUnits were what the clients expected: heartless killing machines that could go rogue at any second, for no reason, despite the governor modules.
The transport thought that over for 1.6 seconds. In a less sarcastic tone, it said, You dislike your function. I don’t understand how that is possible.
Its function was traveling through what it thought of as the endlessly fascinating sensation of space, and keeping all its human and otherwise passengers safe inside its metal body. Of course it didn’t understand not wanting to perform your function. Its function was great.
“I like parts of my function.” I liked protecting people and things. I liked figuring out smart ways to protect people and things. I liked being right.
Then why are you here? You are not a “free bot” looking for your guardian, who presumably cannot simply be sent a message via the public comm relay on the transit ring we recently departed.
The question caught me by surprise, because I hadn’t thought it was interested in anything besides itself. I hesitated, but it already knew I was a SecUnit, and it already knew there was just no circumstance where it was legal and okay that I was here. It might as well know who I was. I sent my copy of the Port FreeCommerce newsburst into the feed. “That’s me.”
Dr. Mensah of PreservationAux purchased you and allowed you to leave?
“Yes. Do you want to watch Worldhoppers again?” I regretted the question an instant later. It knew that was an attempt at a distraction.
But it said, I am not allowed to accept unauthorized passengers or cargo, and have had to alter my log to hide any evidence of your presence. There was a hesitation. So we both have a secret.
I had no reason not to tell it, except fear of sounding stupid. “I left without permission. She offered me a home with her on Preservation, but she doesn’t need me there. They don’t need SecUnits there. And I … didn’t know what I wanted, if I wanted to go to Preservation or not. If I want a human guardian, which is just a different word for owner. I knew it would be easier to escape from the station than it would from a planet. So I left. Why did you let me onboard?”
I thought maybe I could distract it by getting it to talk about itself. Wrong again. It said, I was curious about you, and cargo runs are tedious without passengers. You left to travel to RaviHyral Mining Facility Q Station. Why?
“I left to get off Port FreeCommerce, away from the company.” It waited. “After I had a chance to think, I decided to go to RaviHyral. I need to research something, and that’s the best place to do it.”
I thought the mention of research might stop its questions, since it understood research. No, not so much. There were public library feeds available on the transit ring, with information exchange to the planetary archives. Why not do the research there? My onboard archives are extensive. Why haven’t you sought access to them?
I didn’t answer. It waited thirty whole seconds, then it said, The systems of constructs are inherently inferior to those of advanced bots, but you aren’t stupid.
Yeah, well, fuck you, too, I thought, and initiated a shutdown sequence.
Chapter Three
I jolted awake four hours later, when my automatic recharge cycle started. The transport said immediately, That was unnecessarily childish.
“What do you know about children?” I was even more angry now because it was right. The shutdown and the time I had spent inert would have driven off or distracted a human; the transport had just waited to resume the argument.
My crew complement includes teachers and students. I have accumulated many examples of childishness.
I just sat there, fuming. I wanted to go back to watching media, but I knew it would think it meant I was giving in, accepting the inevitable. For my entire existence, at least the parts I could remember, I had done nothing but accept the inevitable. I was tired of it.
We are friends now. I don’t understand why you won’t discuss your plans.
It was such an astonishing, infuriating statement. “We aren’t friends. The first thing you did when we were underway was threaten me,” I pointed out.
I needed to make certain you didn’t attempt to harm me.
I noticed it had said “attempt” and not “intend.” If it had cared anything about my intentions it wouldn’t have let me onboard in the first place. It had enjoyed showing me it was more powerful than a SecUnit.
Not that it was wrong about the “attempt.” While watching the episodes I had managed to do some analysis of it, using the schematics in its own public feed and the specs of similar transports available on the unsecured sections of its database. I had figured out twenty-seven different ways to render it inoperable and three to blow it up. But a mutually assured destruction scenario was not something I was interested in.
If I got through this intact, I needed to find a nicer, dumber transport for the next ride.
I hadn’t responded and I knew by now it couldn’t stand that. It said, I apologized. I still didn’t respond. It added, My crew always considers me trustworthy.
I shouldn’t have let it watch all those episodes of Worldhoppers. “I’m not your crew. I’m not a human. I’m a construct. Constructs and bots can’t trust each other.”
It was quiet for ten precious seconds, though I could tell from the spike in its feed activity it was doing something. I realized it must be searching its databases, looking for a way to refute my statement. Then it said, Why not?
I had spent so much time pretending to be patient with humans asking stupid questions. I should have more self-control than this. “Because we both have to follow human orders. A human could tell you to purge my memory. A human could tell me to destroy your systems.”
I thought it would argue that I couldn’t possibly hurt it, which would derail the whole conversation.
But it said, There are no humans here now.
I realized I had been trapped into this conversational dead end, with the transport pretending to need this explained in order to get me to articulate it to myself. I didn’t know who I was more annoyed at, myself or it. No, I was definitely more annoyed at it.
I sat there for a while, wanting to go back to the media, any media, rather than think about this. I could feel it in the feed, waiting, watching me with all its attention except for the minuscule amount of awareness it needed to keep itself on course.
Did it really matter if it knew? Was I afraid knowing would change its opinion of me? (As far as I could tell, its opinion was already pretty low.) Did I really care what an asshole research transport thought about me?
I shouldn’t have asked myself that question. I felt a wave of non-caring about to come over me, and I knew I couldn’t let it. If I was going to follow my plan, such as it was, I needed to care. If I let myself not care, then there was no telling where I’d end up. Riding dumb transports watching media until somebody caught me and sold me back to the company, probably, or killed me for my inorganic parts.
I said, “At some point approximately 35,000 hours ago, I was assigned to a contract on RaviHyral Mining Facility Q Station. During that assignment, I went rogue and killed a large number of my clients. My memory of the incident was partially purged.” SecUnit memory purges are always partial, due to the organic parts inside our heads. The purge can’t wipe memory from organic neural tissue. “I need to know if the incident occurred due to a catastrophic failure of my governor module. That’s what I think happened. But I need to know for sure.” I hesitated, but what the hell, it already knew everything else. “I need to know if I hacked my governor module in order to cause the incident.”