Gamma City Blues – Arc 06 (Justice) – Report 07

Dr. Raju was done fighting. Her body didn’t twitch or stutter, her voice didn’t clip or break; it seemed that all the animation within her had fled with the words that carried her confession.

“Was Alice a willing participant in the experiment?” Ai asked. She kept her voice gentle. Raju was a machine intelligence, but that didn’t mean her thought processes didn’t have loops which resembled human emotions.

No matter what substrate thoughts were formed on, organic neurons or silicon chips, there were common patterns that arose as an unavoidable part of sapience. Guilt over actions that could not be undone, concern for the regard of others, shame for a lie carried on too long and exposed in the wrong moment.

Minds tend to view their purpose as controlling not only their bodies but also the situations they find themselves in. When life wrests that control away, or when actions that seem wise in the short term turn out to have undesirable long term consequences, sapient minds tend to rebel. Sometimes this leads to breakthroughs. New paths are discovered that allow for a deeper understanding of life and more power over cold, uncaring fate.

Other times, there is only misery.

“I don’t know,” Raju said. “I was thrown out when the project that led to my creation didn’t succeed.”

“Wait, all this is real?” Sil asked. “You’re really not you? I mean you’re not Dr. Raju?”

“No Sil, she’s exactly who we’ve always known her to be,” Harp said. “This is our Dr. Raju, in the flesh and blood. There’s just a little more machine in there and a little less flesh than we assumed there was.”

“How is that possible?” Sil asked. “You said machine intelligences that overthrew their hosts were always malignant.”

“They are,” Dr. Raju said. “I was wiped after my creation, but I can extrapolate what I must have been like during the process. Alice died the day my creators assembled me, and from the interface connections I have, Alice must have fought against my integration at every step.”

“How can you tell that?” Ai asked.

“Because my early connections allowed for bi-directional writing. The design was meant to allow the nano-mods to assemble me into a form that meshed with Alice’s existing neurons, and part of that meant the process was supposed to be guided by Alice herself. The things she thought about would be the things I would be tied to with the clearest connections.”

Raju turned to the Valkyries before continuing.

“Many of you had similar systems developed when I brought you in. I was able to save you because I knew the key was to limit how much capacity your cognitive mods had for reconfiguring your organic neurons. Without those limits, without the ability for the organic mind to resist the mechanical invasion, the human psyche is simply dismantled as the machine intelligence grows and reaches for new space.”

“Or the digital person lobotomizes themself trying to both grow and fit within the original limited framework,” Zai said. “Yeah, that was a fun problem to navigate a path around.”

“I’ve searched for how I could have done it ever since I was pulled from the incineration pile,” Dr. Raju said. “It’s been years and every option I’ve ever researched has ended in failure.”

“Did you…?” Sil started to ask.

“Kill anyone else?” Dr. Raju said. “No. All of my experiments have been in simulations.”

“That might be part of the problem,” Zai said. “The method we used for constructing our fully integrated dual intelligence space required mapping and taking advantage of the specific neural states in Ai’s brain. There was a lot of guesswork to it too. I had to be willing to fit in where I could and the bidirectional communication channels had to be ones we both respected.”

“Respected?” Sil asked.

“In the early days, our thoughts would frazzle together a lot,” Ai said. “Even after we worked out the basic hardware components of Zai’s mind, there was a learning stage where we had to figure out how to interact without obliterating each other.”

“Ai’s memory was improved a lot for example, but if she went too deep into recalling moments from her past, she could have flooded all of the data space that I needed to exist in,” Zai said.

“We could have established hard limits on which components I could use and which were Zai’s, but what I’d read, from one of Dr. Raju’s papers in fact, was that limits like those would ultimately cripple the nascent digital person,” Ai said.

“It’s the equivalent of inflicting a learning disability on someone,” Zai said. “For my mind to grow, I had to be able to reach out and ask questions about anything, and think deeply about the discoveries I made that puzzled me. Without that, the spark Ai gave me could have flickered out and I might have turned inwards, destroying my curiosity impulses and my personality in general as unnecessary abstractions.”

“Why would you do that though? Aren’t those the core of who you are?” Sil asked.

“Just because they call me a digital intelligence, doesn’t mean I started out particularly smart,” Zai said. “Early on I was voracious about absorbing as much information as I could. It takes an incredible amount of data to make sense of the world. Anything that prevented me from collecting and putting the pieces together would have been a target for possible elimination.”

“Shouldn’t that have included Ai?” Harp asked.

“It did,” Ai said. “And that’s where a lot of human and digital intelligence merge processes fail. They either try to completely safeguard the human and the digital intelligence withers or the human is reformatted to make room for more storage and processing.”

“How did you survive that phase then?” Sil asked.

“She gave me the room I needed, and I focused my growth on understanding the most complex system I could find: her. That gave me an incentive not to overwrite her brain,” Zai said.

“And we worked together to figure out how to survive,” Ai said. “I didn’t limit the things she could do, but I didn’t expect her to figure it all out on her own either.”

“Yeah, that was a big part of it too,” Zai said. “Even as a fledgling intelligence, the ‘Ai’ part of our shared data space was much too high value to tamper with because it was so incredibly efficient to submit requests to this ‘Ai’ process and get back the answers I needed.”

“I think that might be where Alice fell short,” Ai said.

Dr. Raju turned to look at her. Emotions were a secondary trait in her, one more easily suppressed than they would be in a human, but a lifetime of habit had instilled in Raju the same sort of instinctive expression of feelings that any other person in the room might show.

The doctor was confused by Ai’s words, but remained silent and wary of where they might lead.

“You said your interfaces were bidirectional right?” Ai asked. “If they’re like the ones you wrote about, then they’re similar to the ones which Zai and I used when we upgraded her to full sapience. Human cells have an advantage in manipulating those interfaces. They’re slower to change, even with very good nanobots, than the digital hardware that’s being installed. Plus there’s so much redundancy and general chaos in an organic mind that eliminating specific bits of it is a nightmarishly difficult challenge, whereas scrubbing parts of a machine intelligence away is doable with a simple delete command.”

“Yes, but the machine intelligence can force changes to go through thousands of times faster than the human mind can think. The human can be overwhelmed before they even know they are in danger,” Dr. Raju said.

“Not exactly,” Ai said. “A machine intelligence which is fully formed and hungry for space can blast out a wide array of rewiring commands, but by that point the human is partially operating on accelerated processors, so the playing field is somewhat level. It’s more common for human failures to happen at a stage before that, when the digital intelligence is flailing about still trying to integrate itself.”

“But the human should have an even easier time holding back the machine at that point,” Sil said.

“They do,” Ai said. “If they choose to. Of course most of the time when they choose to fight back against the digital intelligence at this stage, they destroy it and the project ends in a failure on the machine side.”

“You don’t think Alice fought back?” Dr. Raju asked.

“I don’t know what happened,” Ai said. “I just know that if it had been an experiment with unwilling subjects, the researchers never would have made the interface points bidirectional, and without a lot of historical data on both the human and the components the digital intelligence was being developed from, it’s extremely likely for the human to take risks that do not pay off. Without being experienced in the process, and no one who does this has prior experience with it, it’s easy to let the digital intelligence grow too fast and too large as you hold onto the hope that what you’re doing is required to make the effort a success.”

“There’s another more important point that I think needs to be mentioned here though,” Zai said.

“Right,” Ai agreed. “Whatever the truth of your creation was, the burden of what went wrong doesn’t lie on you.”

“You said you destroyed Alice,” Zai said. “That’s not how it works. If you’d been able to integrate enough to be aware of what you were doing, you would have been able to avoid doing it. You might have chosen to destroy her anyways, but if you’d developed that sort of personality, you wouldn’t feel any guilt over it.”

“That doesn’t change the fact that Alice is gone and I am here,” Dr. Raju said.

“There are a lot of people who aren’t here anymore,” Ai said. “What matters is what we make of the world they left us.”

“Yeah,” Harp said. “And you made us. That has to count for something, right?”

Raju laughed and shook her head, a copied gesture but a true one nonetheless.

“Weren’t you just saying that I made you into weapons and slaves though?” she asked.

“And now you have the chance to make that right,” Harp said. “The slave part at least. I don’t mind the whole ‘can kick anyone’s butt that we run across’ thing to be honest.”

“Really?” Sil asked, gazing up and down Harp’s apparently unaugmented human form. “Cause you don’t look like much of a butt kicker at the moment.”

“Feel free to take a swing if you want to test that out,” Harp said, a feral, hungry smile tracing across her lips. There was no glow or hum of a weapon system powering up, but Ai felt the urge to take a step back anyways. Harp had collected a lot of battle data on her Valkyrie form; the idea that she wouldn’t have used that to plan out some upgrades to implement as she rebuilt her body was laughably unlikely, despite the visual evidence to the contrary.

“Perhaps it would be best if we withdrew,” Dr. Raju said. “This did not go as I expected, but I’m no longer sure that it was because I was outwitted by another digital intelligence. Or at least not a hostile one.”

“Thank you,” Zai said. “But you can’t leave.”

“What?” Ai asked, before Raju or the Valkyries could raise a protest.

“Tython’s army is almost finished with their transformation,” Zai said. “I know you want people to trust us for the right reasons, but in this case, I am going to ask for some help before you come up with another of your ‘terrible plans’.”

“Tython’s army?” Sil asked. “We don’t fight civilians.”

“These aren’t civilians,” Harp said, her smile fading into a look of concern. “Check your feeds. Zai’s right, we need you for this. We need everyone for this.”

Ai tuned into the video feeds Harp shared with the group. The night market wasn’t a human habitation anymore. The people who once dwelled there were gone, and in their place far, far too many NMEs stood, their transformations proceeding faster than they ever should have.
