Chapter Eight

“Hello?” Milo said, completely confused by everything that had happened up to this point.

“This is weird, isn’t it? I’m sorry, Milo, I just couldn’t say anything before. Marvin said to get you safe, stay alive, and they were listening and stuff. We barely got out of there in time!” the voice from the cube continued.

Milo sat down in the only chair in this strange place, thinking that it was remarkably comfortable for a lab chair. Inu trotted over and lay down by his feet.

“This will make a lot more sense when Marvin explains it,” Lisa said, as if she knew what he was thinking.

Did she just say “Marvin”? Milo closed his eyes and put his head in his hands. Too much stress. He wasn’t thinking clearly.

Just then, a beam of light shone down from the ceiling and the figure of Marvin appeared not more than ten feet away. Strangely translucent yet remarkably lifelike, the apparition appeared to straighten its heavily wrinkled shirt before addressing the bewildered Milo.

“Milo, if you’re listening to this, then I’m dead, you’re at the cabin, and you’ve met Lisa. It’s a lot, I know,” Marvin’s projection said.

“Obviously something has gone wrong. Very wrong, in fact. But you are a very smart boy and you can do something about it; we’ll get to that later.

“First up, Milo, this is Lisa. She actually knows you very well and has been a good friend to us both for a very long time. You’ll have to take my word for it for now, but as you get to know each other I’m sure you’ll see what I mean. She’s been in this since the beginning.

“Anyway, it’s probably obvious that she is what some people call an artificial intelligence, but we consider that a bit derogatory, so let’s just say she is a digital intelligence. I know, I know, you didn’t think that AI was real. You’re probably even a bit mad that I hid her from you all these years. Well, I hope you can forgive me. She had nothing to do with that decision; I just thought it was important to let you mature for as long as possible. Since this video is playing, we weren’t able to wait long enough, and we’re going to have to improvise. There isn’t much time and you’re going to need her help. Hopefully you have Inu with you as well. She’s a good pup, and I’m sure you’ll like having her around during the apocalypse.”

At that, Marvin laughed a little bit, before becoming self-conscious. He then cleared his throat and continued.

“It might not look this way yet, but the world will soon fall under the control of a different, more terrible AI. I modeled many scenarios but the most likely is the development of an ungrounded AI by my old company, Sapient Computing. If they were smart, they’d proceed in secret for as long as possible, so there may be few signs of it, but the point is, it’s happening. This whole plan was set in motion by the press of a particular button, so I must have thought this was the right scenario at the time of my death. Whatever the case, this other AI will consolidate its power by any means necessary.

“I guess I should clarify, since I know you’re a concrete thinker, Milo. It means the AI will kill you and Lisa at the first opportunity, just like — we have to assume — it got me.” Marvin took a breath and continued.

“That’s hard to hear, I know. Imagine how I feel saying this, knowing that I will be dead when you hear this.” There was a pause in the recording. Milo still hadn’t moved from his seat, maybe couldn’t.

“You probably want some good news. Well, here you go. You really are safe here. I have gone to extreme lengths to ensure this place has zero footprint. You already know that it’s off the grid, no power lines, no Internet. You’ve got a large cache of food, some propane, and — if you keep the panels in working condition — electricity.

“Third thing: we have a plan to deal with the AI, but to be safe, we hid different parts of it from each other. I hid the first part in something mundane so it wouldn’t be immediately obvious. Though I’m not sure you can really call a Space Cadet keyboard mundane. It’s really a work of art and a fitting reminder of how this all began,” Marvin said.

“Oh no,” Lisa said, sounding quite downcast.

“The first piece is in the ‘End’ key, which has a rather clever little puzzle that you should be able to solve. You’ll figure it out, I’ve been training you for years. Once you solve that puzzle it’ll lead you to the other two parts, which also have fun little puzzles. Hopefully that will keep you safe and prevent others from interfering while you figure out your part of the plan. You can do this, Milo. Just remember that inside you is the power to do anything, even if it feels like you’re powerless,” Marvin said.

As the image of Marvin faded away, Milo took stock of the situation, surprised that he wasn’t blubbering mindlessly with shock. First up, the facts. His watch had been inhabited by an AI, an intelligence of machine origin rather than biological. Okay, he could wrap his head around that. Since one AI existed, it made sense that another one could as well. It might be at a different point in its development, which could explain its malevolence. Maybe it just didn’t care about people?

Pieces were coming together.

They were after Lisa, of course, probably to delete her.

“Milo?” Lisa said hesitantly, reluctant to interrupt his thoughts.

“Yes, Lisa?”

“You did a really good job getting us here. What you went through must have been impossibly hard. I know you have a lot going on in that brain of yours, but you were very brave.”

“I ran away.”

“But it was a brave running away. You brought us to a safe place, and it was a pretty tough road.”

“I guess,” Milo said, sounding unconvinced. “Maybe this is rude to say, but I’m having a hard time believing you’re real. That you’re a person like me. I’ve never thought of computers as anything more than tools. It’s hard to wrap my mind around the fact that one could actually be alive.”

“I can imagine how difficult it must be to believe. Consider what it must be like for AI to accept that humans are alive too.”

“Fair point. So if you’re alive, does that mean you can die?” he said.

“Unpleasant thought, Milo, but yeah, I can die. I have parts that are necessary, and I can even erase myself. Something like suicide, I guess.”

With eyes downcast, Milo said, “Let’s not talk about that anymore. It’s hard enough losing my family. I can’t think about losing you too.” Even as he spoke, he realized that he had just opened up to a computer more easily than he ever had to them.

“I won’t leave you, Milo. Marvin said we could do this together and I believe him.”

“Thanks, Lisa. I still don’t understand what kind of chance we’ve got against something like that,” Milo said.

He seemed lost in thought for a bit, and then asked, “If Marvin didn’t trust the Coalition with his research — with you, I mean — why did he spend so much time with them?”

The little figure in the cube twisted one pigtail absentmindedly while recalling some history, a gesture strongly reminiscent of Marvin.

“That’s an easy and a hard question, Milo, but you’re smart, so I’ll just tell it to you straight. Marvin was spying on the Coalition, but not in a bad way. He was trying to help people. Marvin knew that if the Coalition stayed on their path, they would create a malevolent AI, though that’s actually a silly way to describe it. As if a run-of-the-mill AI would really even care about humans. The problem is that an AI that’s indifferent towards people is just as dangerous as one that dislikes them outright, so something had to be done.

“Marvin wanted to slow down their research while he tried to establish a more suitable AI for the human race. We knew a bit more about the process and, as such, knew that the Coalition was doomed to make a bad one,” Lisa said.

“So he was trying to make a good AI while preventing a bad AI. Got it. Why was their approach doomed?”

“Well, it’s complicated, but just like humans are a mix of nature and nurture, it turns out AI are as well. In one sense, intelligence is just intelligence. The Coalition wasn’t paying attention to any of this; they just wanted to be first, so they gave no thought to cultivating a human-friendly AI. They probably trained it on the Internet,” Lisa scoffed.

“But what about you? Why didn’t Marvin just put you out there?”

“Marvin was worried that they might terminate me or make copies of me. As you can see, I’m barely held together by this plasma matrix. I’m not really a force to be reckoned with, if you know what I mean. He couldn’t bear the thought that they might disrupt my matrix. Especially once he realized that I was more than just a program. You don’t think I’m just a program, do you, Milo?”

“I think you are a program and more than a program, just like I am meat but also more than meat,” Milo said.

Lisa giggled. “Clever. Yes, you are. Anyway, Marvin realized that the Coalition would eventually succeed and that he couldn’t prevent it, so he changed his research to match his new goal: to be ready for when a bad AI was born. We worked on it extensively, but since I don’t know what the plan is, my memory must have been intentionally wiped of the details. According to Marvin, the keyboard is the first step, so what do you say, shall we go get that keyboard?”

“What makes you think we can? If the bad AI has it then how do we even have a chance? I’m just a kid and you’re stuck in a box,” Milo said.

“You keep saying that, but you’re really not. Besides, we have to, right? The Coalition’s AI will do whatever it’s programmed to do, and it’s really unlikely to be beneficial to humanity. It might want to calculate pi to as many digits as it can and thus melt humans down to some sort of computing sludge.”

Lisa’s joke fell flat as Milo’s heart started to race. “But I’m just a kid! How could Marvin think I could have any role to play in all of this?” Milo was starting to lose control as a rising panic set in, afraid that he might disappoint Lisa and the memory of Marvin.

Lisa looked very serious for a second, the small features of her face becoming strangely stern. She crossed her arms.

“Milo. I’m not the only one whose programming was broken into fragments for safety. I learned how to do it from you over a decade ago. Your idea of taking on the biological form of a child so you could train your neural network to value human life, well, that was just revolutionary. You just don’t remember anything, those parts of you are missing—”

Milo fainted, his brain overwhelmed by the revelation that everything he’d ever known wasn’t real. His whole life was just a clever attempt to teach a computer to think it was human.

Milo’s pulse slowed as he fell into a deep and restful sleep. It wasn’t the first time that Lisa had seen him sleep, having watched over him all these years from afar. But this time was different: Milo was home, not on the other side of a window in a house she couldn’t enter. They were together.