Ai didn’t have a lot of time, but then neither did the rest of the world. The clock was ticking down to doomsday, even if no one could see quite where the hands were set.
“The NME hive is starting to go mobile,” Zai said. They were watching the video feed at such an accelerated rate of processing that everything appeared to be standing still. Zai had been paying attention to enough of the slowly rendering frames to catch the instant when their directionless writhing synchronized and took on a singular purpose.
“Where are they headed?” Ai asked. She was elbows deep in a virtual representation of the modified NME code that Zai had created, altering subroutines and patching on new interfaces at a speed no other human on the planet could match.
“The orders being broadcast to them are creating a priority list of critical infrastructure and population areas all over the city. So they’re kind of heading everywhere, I think?”
“Are there any locations outside of Gamma City?” Ai asked, looking up from her coding. The work couldn’t wait, but she also couldn’t afford to build the wrong tools into their copy of the NME code. Not when it was the one tool they had that might be capable of holding off the apocalypse.
“Nope. All the target sites are within our borders.”
“And what about other incidents of multiple NME transformations?” Ai asked. “Anything being reported anywhere else? Check worldwide. This wouldn’t need to be limited geographically.”
“Good news? I guess? No other NME rampages are being reported anywhere,” Zai said. “There is a lot of traffic flowing about this one though. People know something is happening but I don’t think anyone is aware of just how bad it is.”
“Damn. The Research Group is making an example of us,” Ai said.
“That seems like a bad call,” Zai said. “It’s not like NMEs are unstoppable. Any one of Tython’s rivals would be happy to glass the city rather than allow another robot uprising to occur.”
“I think that’s their plan,” Ai said, turning back to her virtual workspace and swiping away a section of code she’d been assembling.
“Why? What do they achieve by wiping out a single city, even one as big as ours?” Zai asked.
“One city is a small price to pay when the prize is the whole world,” Ai said. “People remember the last robot apocalypse as a piece of history but most of us didn’t live through it. It’s just fodder for period piece war vids. Even the NME attacks that have occurred over the last few days are a curiosity. No one is panicking over them. Not like they will over something like this.”
“So this is a wakeup call for people? Why though?” Zai asked.
“Could tie into a lot of different plans,” Ai said. “The last time we had a techno-zombie problem it was caused by a fault in the automated upgrade process for bio-modifications. There are specific limits in place to prevent large scale distribution of any bio-mod upgrades, precisely to give people time to review the effects of the initial roll out. If everyone is terrified of another robot apocalypse happening though, it would be pretty easy to achieve worldwide distribution, especially since Tython can create NME pandemics by hacking a small subset of the populace in any area that has a lot of hold outs. The alternative would be to try to hack everyone on the planet, which would be inefficient, time consuming, and open to the possibility of failure.”
“That makes a disturbing amount of sense,” Zai said. “If they’re going to sell the ‘upgrade’ as a ‘cure’ they need to convince people that there’s something for them to be ‘cured of’, right?”
“It’s the fastest method of dispersing the code,” Ai said. “Make people afraid and then dangle a carrot in front of them that offers security. There’ll be some who pass on the upgrade anyways, but they’ll be small enough in number that your horde of techno-zombie humans will be able to eradicate them whenever you want.”
“Please don’t tell me you thought of this because it’s what you would do?” Zai said.
“What I would do is largely determined by the circumstances I’m in, what I’m trying to accomplish and what I stand to lose,” Ai said. “There are limits of course. Things I would never be interested in accomplishing and things I wouldn’t be willing to lose for any price. That cuts down the set of ideas I’d potentially act on fairly substantially. What I can imagine doing though? That’s a much broader field.”
“The Valkyries will be reaching their outer engagement range with the Hive shortly,” Dr. Raju said. She had been crafting new modifications for the Valkyries based on Harp’s most recent upgrade. Ai was pleased to see that part of the project seemed to involve disarming some of the Valkyries’ more volatile shutdown systems. “Harp asked you for a plan before she left. Do you have one?”
“I will,” Ai said. “I needed to see what the NMEs were doing first, but now we’ve got something to go on. I just wish there was time to get back in the conversation room with Hector.”
“Who is Hector?” Raju asked.
“One of the Tython research team members,” Ai said. “Long story short, they killed Fredericks and have usurped the NME project from him, and from Tython. Also they have a fully working version of the Cure, which he called the Omnigrade, so we’re probably doomed.”
“The Omnigrade?” Raju said. She was a machine, despite appearances. The surprise and concern in her voice however was quite real. “That’s much more than a cure.”
“Yeah, that’s what I was afraid of,” Ai said. “I’ve seen an early version of Tython’s code and read Zai’s notes on it. Nothing I saw there was comforting. There were incomplete modules in the version we got that suggested Tython would be able to take complete control over anyone who gets upgraded with it.”
“The Omnigrade, at least the one which Fredericks originally hypothesized about, has much greater capabilities than simply controlling its host,” Raju said. “The true Omnigrade was supposed to be able to crack through any breakable system, and hold the key to full integration between organic and machine minds.”
“So not only would it give them control over you, the transformation would also give them access to every bit of information stored in your mind?” Ai said. “And there’s no level of security that can protect you from it?”
“No security which a standard citizen would have access to, and yes, that and the ability to program a new personality into the subject. Or part of a personality. The controller of the Omnigrade could alter any aspect of a transformed subject. Could bend the subject’s perceptions and memories to be whatever they wished the subject to see or remember.”
“Why was Tython only looking for an NME Cure then? The Omnigrade sounds like the endgame for humanity I was afraid it might be,” Ai asked.
“Fredericks’ early prototypes showed that there are inherent limitations in the design. The ‘True Omnigrade’ is an impossible creation. Anything with too broad of a transformative power would also transform itself and any safeguards placed on it. You could in theory make a temporary version of the True Omnigrade, but you could never control the creatures it creates. In short order, milliseconds most likely, they would iterate their design around any controls you tried to place on them. Tython saw no profit in converting humanity into an uncontrollable horde of techno-monsters.”
“Could Fredericks have found a solution to that problem?” Zai asked. “Some method of maintaining control over the uncontrollable?”
“No. Even if the system was perfectly constructed to eliminate every path that would allow the subjects to iterate their designs beyond the controls on them, there was no method of ruling out the effects of entropy,” Raju said.
“Meaning, things break, and sooner or later those things will include the controls that are placed on the Omnigraded NMEs?” Ai asked.
“Yes, and from all of our calculations it would be sooner too. The iteration cycle required to merge organic and mechanical consciousness has to be so fast that significant errors are predicted to start showing up within the first several seconds of the subject’s existence,” Raju said. “It’s why the original NME code failed and why Tython was willing to back a plan for a Cure. Fredericks’ prototype showed that the Omnigrade was impossible but it proved that a Cure was feasible.”
“I have a feeling we need to see the code from those new NMEs the Valkyries are about to fight,” Ai said.
“You’re afraid Fredericks did the impossible?” Zai asked.
“No. I’m afraid his research team either managed it, or thinks they’ve managed it,” Ai said. “Or, worse, that they’re as clever as I suspect they are.”
“If they managed to make a true Omnigrade, then we’ve already lost,” Raju said. “If they’ve deluded themselves, then billions will die in their attempt to seize control of the world. But you see a scenario which is more worrisome than that?”
Ai nodded.
“Imagine a tool that could reshape the human body on a molecular level. Imagine it allowing for near perfect integration between the host and a set of cognitive enhancements. Not a replacement of the organic mind but an augmentation far beyond what a simple Cognitive Partner can manage,” Ai said.
“Like what the Valkyries have,” Raju said.
“Like what we are,” Zai said, her tone indicating she had already followed Ai’s train of thought to its destination.
“Imagine this upgrade having an intentional weakness in its design though. It can exert chemical influences throughout the brain, controlling pain and deadening or enhancing emotions, but it only exists alongside the organic mind, not merged together with it,” Ai said.
“Direct control of the mind and data extraction from memories would be impossible,” Raju said, “but control of the body would be trivial to achieve.”
“Yeah. From a design perspective there would be only one other piece you would need to make the upgrade work. Certain core elements would need to be sacrosanct, unable to be overwritten because of a mathematically perfect lock you place on them.”
“A mathematically perfect lock like the one Sil trapped me in…” Zai said. She’d called a copy of the locking code up for Ai, who was already looking for where it might slot into the NME code they had.
“Where did Sil get that locking code from, Doctor?” Ai asked as she opened a function and found a near perfect interface waiting for the lock Zai had given her.
“I provided it,” Raju said.
“It’s the same code you used to lock down the Valkyries so that their transformation wouldn’t carry them too far. Wasn’t it?” Ai asked.
“Yes,” Raju said. “It’s how I ensured I wouldn’t lose control of them. Or at least the mechanical components in them. I suspect I never had control over their human sides.”
“I think you had more influence on them than you’re aware of,” Ai said. “The key question now, though, is whether the Omnigrade the Tython team has isn’t the version Fredericks dreamed up, but rather one that converts people into beings similar to the Valkyries, only with deeper-rooted controls.”
“You could go farther than I did with the Valkyries. Much farther,” Raju said. “I never wanted to destroy who they were. Not after what I did to Alice. The neural linkages could be extended as you described though. Deep enough to effectively submerge the person who once wore the body into something like a dreamstate.”
“Or a nightmare,” Zai said. “That’s what you were afraid had happened to Ai, isn’t it?”
“Exactly,” Raju said. “My Valkyries are wonders, but they largely created themselves. I only helped ensure they didn’t lose themselves in the process. The safeguards I put in place were because I feared my work would fail, that the women they were would be lost and only monsters like me would remain. Without someone like me around to shape the process, I couldn’t see how you could remain yourselves.”
Ai nodded in understanding, and then stopped short, her breath catching in her throat.
“Oh no,” Zai said. “I know that look.”
“Doctor Raju, you are a genius,” Ai said, wonder and delight spreading across her face as she opened a communication channel to the Valkyries. “Harp? I’ve got a plan for you. Transmitting it now!”