Sometimes Dorian rued the day his foolish creator decided to give androids emotions. How much more pleasant to go through existence like an MX, actions guided solely by deductive data-driven logic and complex cascades of moral values. Who has more value, an eight-year-old or a ten-year-old? Five class 4A felony criminals versus the two class 2B accessories who were driving the car? An imminent threat to three people versus 1.1 billion dollars in estimated damages and loss of livelihood for 300? Eight months' worth of trust built up in your suspicious human partner versus your own desperation for your life and history, worthless to everyone but you?
Long before the first android could ever be trusted with a gun, some human being had to weigh and quantify all that out. They didn't sit around in committee arguing it, of course; instead they let society adjudicate, and fed trillions of everyday decisions and interactions into a specially designed cluster mainframe, to spit out all of humanity's unconscious terminal values. A form of the end result, known as the Moral Arbiter, was now embedded in every AI from the lowliest animated smartoy all the way up to the holographed avatar of Blue Genie that advised the president. The M.A. module had its controversies, but one thing was clear: Androids were far down the list, below even dogs on the hierarchy of things humans cared about.
The problem was, just like organic brains nowadays, every computer can be hacked. Even when the computer and the brain are one and the same, and they belong to you.
Dorian barricaded himself in one of Rudy's basement workshops – really more of a storage closet – to do the deed. The door was left on an encrypted timer that he'd have no hope of breaking in the single hour he had the M.A. scheduled to be weakened. He had no idea if that would be enough time to set the bugs, but it would have to do. Maybe he wouldn't even want to stick to the plan; once ethics were gone, what sort of mad being might he turn into? Like the XRN that killed 212 people, perhaps. His greatest fear.
Nevertheless, Dorian closed his eyes and let the images float in front of his mind's eye once more. He was a child in bed, playing with a train. He was a teenager, running through the woods. He was a young man in a library, the technology in front of him decades older than his physical self. He was driving a car, writing a paper, making love to a woman whose face was never discernible. For every vision, he had fuzzy visuals, auditory input, and a strong qualia of emotions. Sensory data incomplete, but incomplete in a notable orgo-neural way, like something copied out of a human mind by a recollectionist. Recreated non-life zapped into his photoelectric neural net from the ether of true life. Impossible.
Dorian pressed the virtual switch on his workpad. Nothing subjectively happened.
The fear was still present, no insignificant factor for a DRN. Unlike in most other soulless machines on the market, emotions in a DRN were processed like sensory data, so the fear really did feel like a physical weight assaulting his mind. Similar to the descriptions of what humans experienced, although of course Dorian had no way to know if the feelings of one being were truly the same as another's. Luckily, Dorian had found through experience that most of the negative control-related EARL-classified emotions – helplessness, anxiety, embarrassment, as well as fear – had weaker input than other negative feelings such as anger. Apparently their creator, Nigel Vaughn, had enough foresight to recognize that the DRNs were destined for powerless servitude all the days of their lives, and a constant stream of angst over the uncontrollable did no one any good. At least that was Dorian's guess; he didn't have many DRN examples to go off of, besides his own.
With practiced ease, he mentally shoved the fear input aside to rerun his analysis of the situation. His conclusion was the same: He needed additional information, John and Rudy likely had it, and John and Rudy likely would never discover his spying. Maldonado would be even better, but her network access, office and even apartment were much more monitored, so the probability of detection was too high. Plus John and Rudy would likely cover for him even if they found out what he was up to, something the Captain would be far less sympathetic with, even if she was John's friend. She wasn't Dorian's friend, only his superior, and regulations dictated he be melted into scrap for the press of that one button.
He infected John's house first, the malware refocusing a portion of his external security system inward. John didn't have a sophisticated smarthouse, but like every cop's, his home had been retrofitted with the latest in passive information retrieval to prevent criminal incursion. Dorian knew all his passwords, though, and he already had low-level access to appliances such as John's bed, phone, and leg, thanks to Maldonado's insistence that he “keep an eye on” his frequently delinquent partner. It took only nine minutes to hack all of that for the rather more invasive data collection Dorian was after.
Rudy's lab was a tougher nut to crack, but he too couldn't keep his passwords secure from an ever-observant android roommate. Despite that, Rudy was smart enough to set up his system so that even he couldn't compromise his own security. To his disappointment, Dorian had to give up on hacking the computer consoles within his allotted time frame, but eventually found an opening in the security camera's manufacturer firmware updates. At least he'd have a visual on the lab during his charge-up, when Rudy performed his unauthorized and unwelcome scans of Dorian's neural net.
That took forty-three minutes, leaving Dorian with eight minutes of ethical freedom before the Moral Arbiter came entirely back online. He looped into self-diagnostics, curious. What difference did it make in his personality, to remove a carefully chosen thread of inhibition here and there? He didn't feel any different. No uncontrolled emotions washed over him; he didn't experience the urge to take over humanity like a cartoon megalomaniac, or harm anyone, or even break any laws besides those to complete his personal mission. Surprising.
By running short simulations Dorian could tell the M.A. was indeed weakened. He had the ability to break laws and regulations that previously could only be cast aside under extreme circumstances, such as imminent threat to civilians or his partner. He could walk out on the street right now and kill someone, if the door weren't barricaded. But the thought, while no longer morally repugnant, gave him no pleasure either. Perhaps like a psychopath? No, that wasn't right either, for psychopaths lacked emotional attachment and empathy, while retaining tendencies towards primate hierarchies. Dorian had no programming for the latter, and his emotional pathways were still intact. He still cared about John, Rudy, his other colleagues, all the people he'd helped over the past eight months.
The M.A. popped back online mid-simulation, and again Dorian hardly noted the change. His decision-making processes simply changed mid-thought. One second he could think of circumstances in which the premeditated killing of a person might be justified, and the next second that same scenario seemed bizarre and illogical. Then a wave of guilt washed over him, as the M.A. caught up with his actions and Dorian realized how much invasion of privacy he'd just committed, to the people supposedly his friends.
The sense of blameworthiness and disgrace intensified, far beyond Dorian's usual means of managing the emotion. A trigger, he helplessly knew, another vision was triggering, bubbling up from the malfunctioning crack in his consciousness. The lights flickered, small sounds in the background stretched and swelled as the defect spread to his sensory systems. And suddenly he saw, the new hallucination eating up so much processing power that he was helpless to move or scream or do anything but mentally wallow in the foreignness of the images.
This time he was standing in a room with someone else, at the end of an argument. Dorian felt anger, but also guilt, so much guilt that it was running into despair. He could see the woman's face clearly for the first time, distraught from the argument, although Dorian had no record of what the conversation was about. For only 120 milliseconds he felt the most intense love of his existence, pure and painful. Then it was gone and so was the vision, releasing him from paralysis.
John should know, he thought. An actionable piece of information for once, John would want to know. But he held onto it for 24 hours, to see what Rudy and his partner were up to when he was asleep. The woman was a secret, his to keep.
* * * * *
Dorian had set the pilfered visual and audio recordings to surreptitiously download to his charge alcove, for review after John dropped him off for the night. He was eager to find out what Rudy had done to his head the previous evening after his manipulation of the M.A., but the dilemma was figuring out when to review the footage, considering that his awkward housemate was always there. And Rudy had implanted a cortical monitor – again, without asking permission or telling Dorian about it, although he found it on a diagnostic – so he could tell whether Dorian was really asleep. So Dorian waited until Rudy was neck-deep and hyperfocused on an unfortunate blue-oozing MX torso before making a move.
“Rudy? Do you mind if I finish the repair to this optical unit tomorrow?” Rudy had him fixing some of the odd machines scattered about the lab when John was off-duty. No work on neural nets or advanced computers, of course, for it was illegal for AI to alter another AI. “I'd like to review old twentieth-century crime novels. Someone mentioned the genre 'noir' today, and it sounded interesting, how detectives of old were perceived in popular culture, compared to today. I don't have much literature in my database.”
“Mmm?” Rudy's eyes didn't budge from the laser cauterization in front of him. “Of course Dorian, read away. You know, besides books, you might want to try mid-century thriller films. Some of that old black-and-white cinematography had amazing style, I always thought it captured the...”
Dorian stepped into his alcove without responding. Sometimes it was best to let Rudy talk to himself and peter out as his attention shifted, rather than engage. He did download some books and film primers on the noir subject, just in case it came up later, but mostly focused on fast-forwarding through the footage for anything related to his malfunctions. And then he spotted something that sent a boom of dismay and shock through his neural net.
John had been there. Just last night, while he was recharging. Dorian had assumed John was receiving updates from Rudy, but he had no idea his partner was involved enough to come over at one in the morning. He restarted the footage in realtime, watching the scene from a camera high and to the left of himself lying prone on Rudy's main exam table. To his revulsion, Rudy wasn't even trying to use non-invasive access ports; he had Dorian's whole posterior skull chassis split open.
John: “Got anything for me, Rudy? Tell me you've got something, he blinkered out twice today. I can't keep this buried forever. What if he goes on the fritz right at the station? Or, worse, when I need him as backup?”
Rudy: “Wait. This is, um, a very delicate operation. Right. So, it's still happening in low-stress situations, correct?”
John: “Does sitting in a car mocking me while I eat bánh canh count as low-stress? How in the hell does an android feel stress anyway?”
Rudy: “Well, stress is a subset of agitation in the DRN's emotional categorization, but really I meant high threat, er, higher risk situations where the Moral Arbiter module automatically reallocates processing power towards protecting hum...”
John: “Yeah, yeah, sorry I asked. The answer is no, Dorian's never let me down when I've needed him. Which makes this all the weirder. Usually it happens when he's spacing off or quiet or we're sitting around waiting for something, and then ~boom~ flickering blue eyeballs.”
Rudy: “When Dorian's thinking about something, that's when it's triggered.”
John: “Can't you tell what he's thinking about just before the false memories pop up?”
Rudy: “One, no, Dorian's thoughts are not encoded that way. Two, I'm still not sure they are false memories. They certainly share many qualities with recollectionist-retrieved memories. Or, if not a memory, at least someone's thoughts. Possibly dreams.”
John: “I thought you deleted them from his deep processing core or whatsit.”
Rudy: “They keep re-emerging. I can't tell from where. Mostly the same images but occasionally a new one comes up. Look, here's one from just today. Fighting with the mystery woman, but now her face is visible. That's good news right, for us to crack the case?”
John: “Sure, something to run through face rec. Look, Rudy, I think I should bring him home this weekend. I can talk it over with him, we can tackle it like we normally do a case. And by 'we' I mean us detectives. He trusts me.”
Rudy: “What? He trusts me too.”
John: “You've literally got your dirty fingernails inside his head right now.”
Rudy: “I'll have you know I keep my fingernails meticulously clean for all neural net operations. It's only polite.”
John: “He knows you fiddle with him, Rudy. Even with that cortical monitor to stick him back in bed before he wakes up. I wouldn't like it either. What's the big deal with telling him?”
Rudy: “I'm concerned it could destabilize him further if he obsesses over the malfunction. I have a lot of experience with DRNs breaking down, I'm afraid. It isn't pretty.”
John: “And I know Dorian. And if we don't say anything, sooner or later he'll take things into his own hands. You can't hide the fact that he's glitchy from himself.”
Rudy: “Well he can't do that much even if he wants to, the Moral Arbiter ties his hands against directly disobeying regulations.”
John: “This is the same android that kidnapped another DRN for a ride-along. Have you even met him? I'm making an executive decision, home with me for at least the weekend. I really want to keep an eye on him.”
Rudy: “But you'll keep it between us, the content of the visions? Someone's doing this to him, we must find out who before the department deactivates him.”
John: “We'll see.”
Dorian hardly knew where to begin to analyze the conversation. John cared enough about him to lie to the Department and independently investigate the glitches. He was even planning on finally giving up his sequestered man cave to bring Dorian home, a prospect that filled Dorian with an odd excitement. But then again, John was conspiring with Rudy, as if Dorian were a small child to be protected. Whatever the problem was, Dorian would face it without panic or hysteria.
In fact, Dorian found it odd that Rudy was so insistent some external force was implanting the memories against his will. The more plausible explanation was that it was indeed some kind of malfunction related to the unstable DRN platform, and that he might have to be decommissioned. It was like living under an executioner's sentence sometimes, being a DRN, the Sword of Damocles forever hanging over his head. The last android standing. And perhaps Rudy, who did try to be his friend, didn't want to be the one holding the sword.
After reflecting on this information, Dorian decided he would confront John, who was already leaning towards telling him the truth. He wasn't sorry he'd hacked his friends' security, but just the same, he left John's collected data unreviewed; he wouldn't invade his partner's privacy after all.
* * * * *
“I've been having more of the false flashbacks, John.”
They were in the car headed across town. Friday afternoon. Time for honesty, Dorian thought.
“Your space-outs are getting more obvious, Dee,” John replied. “And, uh, frequent. Rudy and I are worried about you.”
“And does Rudy have any suggestions for what to do?” A flash of bitterness mixed with regret arose in his mind, and he shoved it down again, trying not to antagonize his partner. Well, not too much.
“He's at a loss. He's diced your processing core six ways from Sunday, and hasn't found the source of the problem. Sorry, I know how much you love having your head fiddled with.”
Dorian glanced out the window, unable to keep the negative emotions at bay. “What do you think I should do, John?”
“I think you should come home with me tonight. Rudy's been deleting the memories as they appear, but obviously that isn't doing jack to help you. We can brainstorm and tackle it like a case over the weekend.” John took one hand off the steering wheel to rest it reassuringly on Dorian's shoulder, and Dorian looked down at it in surprise. Very rarely did John ever touch him, or anyone.
“Thanks, man. That sounds nice.” He should say more, he knew. He should admit he'd been so desperate as to unethically alter his own programming, but that it didn't make a difference because John was finally talking to him, not behind Dorian's back. He should admit that he was constantly terrified, and some analytic module in his mind knew that he was only days away from the abyss of deactivation.
But he couldn't say any of that. Not yet.
John was still in optimistic mode. “We'll figure it out. The memories themselves should give us clues where to look. From what I've seen they don't appear to be random.”
Just like John's own dangerous forays with the Recollectionist, Dorian thought. John had such faith that everything generated by the mind must be real, and not simply the feverish dreams of a system that craved integration over truth. Rudy might have been right with his comments at the first memory glitch: It could be nothing but bits and pieces of mental cobwebs, strung together into a narrative that the mind could comprehend. That's how human dreaming worked, why not androids?
It felt real, though. And feelings for DRNs were not vague instincts, but practically a weighed thing that pressed on the mind. As close to real as their neural nets got. Why should Dorian feel love for a complete stranger in one memory, and anger for her in another? Where did the emotion come from if it never really happened?
He didn't tell John any of that. “Yes. We'll figure it out this weekend.”
* * * * *
That evening on the way to John's place, Dorian had two more visions in the car. Another one as a child, terror this time as he forgot lines in a school play, followed by a successful performance of a scene from “Romeo and Juliet” as a teenager. He woke to his partner's worried face each time, John accelerating to twenty mph over the speed limit to race home. Why, Dorian mused. His neural net wasn't going to function any better on the waterfront.
Dorian still managed to drag the bulky “portable” charger to John's study/spare room/trophy exhibition. Then he sank down on the floor in front of one of John's beautiful picture windows, overlooking the Bay through rivulets of rain dribbling down the glass.
“The floor, huh? Looks comfy.” John plopped down next to him, leaning his back against the front of a lounge chair. “How are you feeling?”
“Fine, when I can think properly.”
“You don't look fine. You look terrified.”
The word – so true – triggered some release for Dorian, and for a moment he quit trying to fake it. “I'm afraid, John. I'm a danger to you now. If this keeps up, my ethical programming will force me to turn myself in to the Captain and admit my malfunction.”
“Hey. Hey, now.” He shook Dorian's shoulder in some weird combination of comfort and emphasis. The second physical touch of the day, and Dorian wondered who he was trying to soothe more, Dorian or himself. “You're not any more of a danger than I was hopped up on Membliss, and we muddled through.”
“Oh, yeah, that time you almost got yourself fired and both of us flattened in a car crash? Good times.”
“You're not getting deactivated, Dorian. I won't let them touch you. So how about we tackle this like a case? The face of Mystery Woman is a lead.”
“She's not in my internal database. I already checked.”
“You don't have everyone in the world listed in your head, bot. Send it up to the state and federal databases. Use my authorization, just like a normal case. Obviously I'll vouch for it.”
Dorian obediently uploaded two profiles of the woman over John's secured network. He knew his cheek matrix was flashing as he connected, and John, sitting right next to him at eye level, stared at the blue and red indicator lights.
“Do you feel it when disco face goes off?”
“It's not connected to my sensory systems, so no, I don't directly feel it. But I know when it's lighting up.”
“Weird that they didn't build that into the toaster MX models.” John looked as if he almost wanted to reach up and touch him, just to feel the swirling lights. Dorian wouldn't have minded, although he couldn't sense much there; touch receptors were sparse in that area of skin.
“I assume there was less concern that the MXs could be mistaken for a human. The details surrounding the development of the DRNs have been redacted.”
“Yeah. Nobody would mistake those beady eyes for real. You on the other hand...”
The server pinged Dorian with the results of his query. “John. I've got a hit on the photo.”
The information flowed into Dorian's mind, and he sent it for display on John's e-wall. The woman's name turned out to be Kalina Johnson, 43, architect, resident at 4451 Riverplace Dr. #2, Sacramento, for the last two years. No children, husband deceased. She had never had any contact with police, and there was nothing in her sparse ID file to indicate why a vision with her face should be embedded in Dorian's mind.
The moment he saw her, though, another vision was prompted: A younger version of Kalina laughing, dancing, dressed up at a party of some kind. And he was dancing too, and wallowing in a joyful happiness he'd never before experienced.
“I can tell by the look on your face. Nice memory this time?”
“I think I'm her husband, John.”
“Let's not get carried away,” John huffed. “It says here her husband, Nathan Johnson, died less than four years ago, when you were hanging on a meat hook in a storage closet. Not to mention that he was a white guy from Vegas. How about we talk to her first?”
“Okay. Now?” The joy and love from the memory was still swimming in his head, although the intrusive presence of the images had faded to the background.
“Um, no. It's nine o'clock, Dorian, and that's a long drive. We'll go see her in the morning. Why don't you come over and sit back down?”
He followed John's instructions, although unlike a human it wouldn't matter if he were standing and lost consciousness. The emotional resonance was dissipating, leaving Dorian with a lingering sense of desire and loss.
“Talk to me, man. Distract me from playing this memory over and over.”
“That good, huh?” John flopped back down next to him, on a small sofa this time. “Membliss is like that sometimes, you can see happy memories with crystal clarity, over and over. Why it's a street drug, I guess. But it can make you feel trapped in the horrible ones, too. Like PTSD magnified.”
“It was easier to handle without all these feelings attached. It makes it seem like it's happening right now, that I've loved her for years.”
Dorian looked around the room for a topical distraction. He'd been in John's house a few times before, but not for any extended period. It always seemed like a slightly different person had chosen the place, with all the modern glass and water and colorful abstract paintings on the walls.
His eyes lit on the guitars standing in a corner, both old acoustic and electric types. “Hey, do you play the guitar? You've never told me that.”
John followed his gaze. “Yeah. Those. I haven't felt like playing in a long time. Since before the Insyndicate raid, you know.”
Before the coma and subsequent depression, he meant. Sometimes Dorian wondered what his partner's personality was like, before the brain damage and survivor's guilt and personal betrayal by a loved one.
“I learned to play ages ago, right out of college, after falling head over heels for this Neo-Synth band leader over in Pacific Heights. Naturally I had to run out and prove my worth, even though I knew nothing about music,” John laughed.
Dorian smiled. “She must have been something to fall in love like that.”
“Actually, uh, she was a he.”
Dorian's eyebrows shot up. “Wow, John, that is not in your psych profile.”
John waved a hand of dismissal. “I think there are some things that aren't the Department's damned business. Or there were, until Anna blew my private life open to Internal Review and beyond.” He pushed off the couch, creaking a bit on the synthetic leg as he stood to walk over to the guitar stands. John brought back the acoustic and settled into the middle for neck clearance, crowding Dorian in the process. “What should I play?”
“Something I can sing to?”
John groaned. “What is this, torture mic night at the Kennex household?”
“I have a fine voice, man, clearly it's your ears that are defective.”
“Sure, bot, whatever you've got to tell yourself to power up at night.”
He strummed the guitar then, humming along quietly to an old pop tune from twenty years before. Dorian joined in with the words, softly, and closed his eyes and relaxed against his partner on the crowded couch. For once, John didn't shy away.
* * * * *
Dorian awoke standing next to his charge unit, feeling none the worse for wear. Maybe it helped, not having Rudy poking around his processing core every night.
They drove north to Sacramento, out of the misty fog of the Bay Area into the bright dust of the Valley. Dorian reviewed old case files in his head, to prevent himself from thinking of the woman and triggering more visions.
At Kalina Johnson's residence, John charged forward in the lead as usual. But when she opened the door, the woman ignored him and stared at Dorian, her eyes boring into him with resignation and anger.
“Another one? I told the others, I don't know anything, so stop bothering me.”
“Other DRNs have spoken to you?” Dorian asked. “With the same visions?”
“Visions, memories, hallucinations, ghosts of the dead? Whatever you want to call them, they're not real. You are not Nathan, bot, no matter how much those damnable 'feelings' they gave you tell you it's true. Talk to Lumocorp if you want answers.”
“But Lumocorp's gone, and so is Nigel Vaughn,” Dorian said.
“Not my problem.” Kalina moved to slam the door in their faces, but John stepped forward and stuck his artificial leg in the gap.
“Let's start again,” he said, a touch belligerently. “I'm Detective Kennex, and this is my partner Dorian. We're with the police department, if you didn't already catch that. So if you've been harassed by a bunch of ex-cop androids, we need to know about it. Got it?”
Kalina sighed and opened the door again, but didn't invite them in. “Fine. There have been four others claiming to remember me. You'd think getting out of the Bay Area would be enough to get away, but no, they still find me. I sent them all packing, and thankfully they seemed to obey. That's all I know.”
“Uh-huh.” John leaned back and took his foot out of the doorjamb, although he still loomed uncomfortably close to the woman. She didn't flinch. “You don't know why multiple androids suddenly have your deceased husband's memories?”
“Nathan was involved in an experiment at Lumocorp over six years ago. He was excited about it, but wouldn't tell me what it was. Something to remember him by, was all he ever said. But they burned him. Dropped him from the study with hardly more than a 'Thanks for your time,' and he never heard whether they accomplished their goal or not.”
“This experiment, was it after Nathan found out he was terminally ill?” Dorian asked softly.
Both John and Kalina swung around to stare at him for the impertinent question, but Dorian placidly stood his ground. Kalina finally stepped out of her apartment and approached him, staring him down with her hard grieving eyes. She ran her fingertips along his jaw, and he closed his eyes at the sensation, almost a type of déjà vu, but for a body he never possessed.
“You're just a bot. You're really not him. There's no ghost in the machine,” she said.
“I know,” Dorian murmured.
Behind them John cleared his throat, then asked, “I know you said he never told you the purpose of the experiment, but was there anyone Nathan mentioned by name who worked in the lab, besides Nigel Vaughn?”
“There was at least one woman assistant, but I don't know the name. That's really all I know.”
* * * * *
They hauled back to John's home by afternoon, and Dorian tried to keep himself busy by curating a list of ex-Lumocorp personnel to talk to. Many of the records were sealed after Vaughn's trial, but John put in a call to Captain Maldonado, who – suspiciously – gave them authorization without asking any questions. Unfortunately most of the material pertained only to the homicidal XRN unit, not project information from the DRN R&D days, before the police department's order for hundreds of units. So Dorian tediously amassed a database of all employee records from 2041-43, crossed with public educational and professional backgrounds to suss out the researchers at the highest level of robotics expertise. Lumocorp had been a leading manufacturer of emotional-response AI toys before branching out to full human-mimicking androids, so there were hundreds of technicians to sort through.
The task kept him focused and alert, not prone to either emotional or cognitive woolgathering. John “helped” in his hilariously inefficient way, mostly picking out candidates seemingly at random that, according to him, “felt like they could be dead-guy memory-stealing criminals.”
And then Dorian saw her.
The photo triggered a new memory, not as strong as the Kalina ones but with a different emotional resonance. Excitement, followed by liberal doses of fear and powerlessness. He was being locked inside a tube, a throbbing machine, and this induced terror despite the fact that he'd consented to be in there and tried to force his body to be calm. The technician's face appeared at the end of the tunnel, and pulled him out to a gentle smile, steaming herbal tea and warm moist towels pressed to his clammy forehead. Then she shoved him back inside the bowels of the machine.
“Dorian. Hey, Dorian, wake up!” John was shaking him, so close Dorian could hear his heartbeat without scanning. Without a word he leaned over and embraced John, folding his arms around his back and resting his head on his chest. John “oomphed” in surprise.
“You… okay there buddy? Not real, remember, come out of it.”
“I know. It was just a functional electron microscope scanner. He was never in any danger, but his fear was the most vivid part of the memory. It used to just be images, John, why are the emotions taking over?”
“You're pretty stressed out yourself, Dee. Maybe the memories are related to how you actually feel. At the Recollectionist, they ask you to visualize what you can from the memory to be extracted, to zoom in on the right neural clusters. Maybe something similar is happening here.”
“Yes. The memories and my regular neural net are obviously connected somehow. She knows.” He pointed to the woman on the screen, her vital stats now conveniently displayed alongside. Jingua Lin, 37, doctorate from Stanford in computational neuroscience, now a junior investigator at the Allen Institute for Brain Science up in Seattle. By the dates in the records, she had been a young post-doc in 2042, likely trying her hand in the private sector. She left Lumocorp a year later, just as the DRNs were rolling off the assembly line.
John booked the next bullet train for the two of them, going north.
* * * * *
They arrived at the Allen Institute unannounced, and persuaded security to accompany them to Lin's office on the basis of John's credentials. Before even getting to introductions, she took one familiar look at Dorian before waving security away.
“My, a DRN. I haven't seen one of you in years. Not many were sold up here in Seattle. If you're here about Nigel, let me tell you first that I don't know anything. Haven't heard from him in over three years, since his license was revoked after the trial.”
“We're not here about Vaughn's recent activities,” Dorian said softly. “We're here about him.” He laid a tablet in front of her, with Nathan Johnson's photo up top.
“That project's under NDA.”
“With a company that no longer exists?” John pointed out. “We could get a court order, but I think neither one of us wants that.”
“Especially for a company that no longer exists, one that was put out of existence specifically for its deadly malfunctioning bots. I no longer work in robotics, and I don't regret abandoning that field for a moment.”
“But you know, don't you,” Dorian pressed. “I see this man's memories, I know you were performing very high resolution brain scans on him. And now he's dead. Do his memories live on buried inside every DRN?”
Jingua sat down heavily in her office chair, and motioned them to sit as well. “Which number are you?” she asked, digging around in a drawer for an old thumbdrive.
“167.”
“Yeah, pretty sure the 160s through 170s were the Johnson cohort, but let's check.” She pointed a finger at Kennex. “I'm working on the assumption that you're not interested in ratting me out, Detective. I didn't exactly have permission to leave with any project data, of course, but what scientist doesn't compulsively back things up?”
“Why don't you start at the beginning?” said John. “Then we'll get to the litigious part. Fix my partner, and you'll never know we were here.”
Jingua dragged her gaze over to Dorian. “You're recording this, aren't you, bot?”
“Technically yes, everything I ever experience is recorded. But my memories do not count as testimony in court, just like a Recollectionist retrieval does not count.”
“That will have to do, I guess. Try not to spread what I'm about to tell you around, all right? I don't want to get sued by some grieving families.” She leaned back, settling in for a lengthy lecture.
“So, Nigel approached me in '41, just as I was finishing up my grad thesis. I was working on cracking the old problem of how consciousness arises in the brain. At that time AI was getting pretty decent, so philosophical debates were all the rage, not only over whether the machines could think, but over whether they should think anything like a human.
“Nigel's company had already made great breakthroughs in designing emotional states for his AI: Love, attachment, affection, and so on. He'd created emotions which fed into the AI like an additional sensory system. But what he wanted was a machine that could understand what it was feeling. And the deductive-logic AI that was most advanced then just wasn't compatible. They couldn't grasp an input that was not a physical state of the world, but that fundamentally came from relationships with people. He needed cognition, not merely computation.
“His insane-genius idea was that if you couldn't build an intuitive machine from scratch, why not import it from a human? In Nigel's view, it didn't really matter what part of the cortex was the 'source' of consciousness; just copy it all over as a black box, and consciousness will come along too. Neural imaging was getting pretty impressive by then, so this was at least theoretically feasible. He designed an incredible solid-state drive to essentially hold a snapshot of all the cortical areas of the brain. Then he hooked it up to the best inductive-logic neural net he could get his hands on, along with Lumocorp's emotional state generator. Plus of course we legally had to work the Moral Arbiter in, so it was a real kludgy fucking mess. Hey, kind of like the brain itself.”
Jingua laughed then, but waved it away dismissively when they didn't chuckle along.
“So I take it at some point, volunteers were gathered for this brain-copying project?” John asked.
“Yup,” said Jingua. “Problem was, we ended up having to do a lot of high-density fEMS scans. A lot of scans. It turned out we could only do a tiny portion of the brain at one time, due to the level of electromagnetic radiation we were pumping in there. Which really wasn't the greatest for our volunteers' health. So Nigel carefully recruited patients with terminal diseases, ones who wouldn't care if their bodies accumulated damage. He dangled in front of them the old dream of the Singularity, uploading their entire minds and personalities into a computer.”
“Which didn't work,” Dorian said softly.
“Well, you're the one with the emerging memories, you tell me. Nigel never had any intention of keeping these people's conscious memories or lives. That just came along as part of the package, you know? With a lot of fiddling we set the solid-state drive – the Synthetic Soul – to dampen most everything related to the personal identities of the volunteers. But in certain cohorts, from certain people, it apparently bleeds through. Maybe it's the length of time you've been activated, or certain emotional experiences that trigger it. Or maybe our settings just tend to slide over time, and need adjusting.”
“Why did you need multiple volunteers?” asked Dorian. “Why not just do one and keep copying it for every unit?”
“It's a ridiculous amount of information, DRN. And it's not like copying a bunch of ones and zeros. Errors creep in, and at a certain point you can't coherently copy any more. We capped it at twenty copies, and Nigel kept the original memory build from each of the forty volunteers. He could probably make a few more copies each with those. It's also possible they scanned more people after I left the company.”
“Why are you being open about this now?” John asked. “Seems like a big risk for you to come forward.”
“I'm not 'coming forward,' Detective. Off the record, remember? But obviously you DRNs need a tune-up. I can pass the information along and let you quietly deal with the problem, or I can sit back and wait until one of those old DRN units really loses it in some public way, or harasses the families of one of the genitors until they go to the press. In which case I might very well be exposed.”
“Is this the reason the DRNs malfunctioned and were decommissioned?” A part of Dorian was afraid of the answer, but he needed to know. This might be his only opportunity to know.
“Who the hell knows, bot? I hate to say it, because I think you are a magnificent intellectual achievement, but you're also damned buggy. I mean, just getting the Moral Arbiter to integrate was a complete bitch. Every human being has their own idea of ethics, so how do you get that to work with a predetermined set of moral rules? It was designed for all those deductive-logic systems out there now, where you start with a blank slate.
“Also, I quit Lumocorp after he effectively turned you into military units and began to churn you out of the factory. That wasn't the original purpose of the DRNs at all. But Nigel got greedy, saw an opportunity for mass production. He wanted the whole world to know of his creation, not just a few AI experts or specialty bot buyers. But really, putting empathetic emotional bots on the front lines? How is that any better than putting psychologically fragile humans on the front lines? Disaster waiting to happen. And I don't even know what the deal was with the XRN. Maybe that egomaniac tried to scan himself.”
Jingua hooked the thumbdrive up to her terminal, and began copying files to a fresh disk for them. “This is a redacted version of the files, to keep my name out of it. But it includes all of the information you need on the Synthetic Soul to adjust the consciousness within. Your technician should be able to suppress the memories of the genitors all over again.” She handed the small drive over to Dorian, who immediately copied it via the ports in his fingers.
“What if a DRN wants to do the opposite?” he asked, after obtaining the precious information.
“What? What do you mean?”
“What if one of us wanted to bring back up all the memories of our genitor, instead of suppressing them? Do what the volunteers wanted, and make them live again in a machine?”
For the first time a look of fear crossed her face, as if that possibility had never occurred to her. “I have no way of predicting what would happen. I suspect your whole system would crash. You're proposing that two minds integrate together: the hidden one in the Synthetic Soul, and all of the memories and accumulated experiences of the DRN unit itself. The genitor personality wouldn't play well with the Moral Arbiter, it wouldn't be hooked up to sensory systems correctly, it might even generate its own emotional responses independently of the Lumocorp chip.”
Dorian already knew that last point was true, but he pressed on. “But it's theoretically possible?”
“Theoretically. You'd have to adjust the neural net a massive amount to prevent the unit from going crazy.”
Dorian tipped his head, amused. “But we're already crazy.”
“There's functioning crazy and there's broken forever. Don't make me regret this, DRN. Think it through before you do something stupid. Why give a dead human a pale copy of their life, when it could mean destroying your own personality? You are more than your Synthetic Soul. I always thought you came out pretty close to people, all on your own.”
She unceremoniously showed them the door, and a stunned John and Dorian headed back for the train.
* * * * *
Even with a bullet train, it was the middle of the night by the time John and Dorian arrived back in San Francisco. Dorian had spent the travel time analyzing Jingua's notes, and was prepared to face Rudy in the morning. But first they headed back to John's house to crash for the evening. John had held back his commentary, but as usual in their relationship, it all burst forth in the car.
“What are you going to do, Dorian? You're not seriously considering letting Nathan Johnson take over, are you?”
The question had haunted Dorian for hours. What did he owe this ghost in his mind, with its own feelings and memories? Did it have its own thoughts too, trapped in the equivalent of a storage drive, going in circles forever? Did he owe anything to his genitor, who likely gave up a piece of his own health in order to give Dorian life?
“I don't know,” he answered truthfully. “But I bet some of the other DRNs will say yes. Some of them do indeed want to become human.”
“But you don't? You're always going on about your autonomy and personal space and how you are totally insulted to be called a 'synthetic.' Isn't that another way of saying you want to be one of us?”
“No, not at all,” Dorian said, exasperated by the lack of logic. “I like being a DRN, being me, perfectly fine. I just want to be respected as myself. My life doesn't have to mimic everything human.”
“Well, haven't you answered your own question? How are you going to integrate a human mind with a non-human one? A handful of memories has put you in danger of deactivation, what's a whole lifetime gonna do?”
“But isn't it strange, to have someone else's life living inside you? Don't you have an obligation to let them live, too?”
“The fact that you call it 'someone else's life' proves you know it's not you, Dorian. Kalina Johnson knows it too. Do you think she wants twenty androids to show up at her door, who all remember her and love her, but look and act nothing like the man she remembers? Let her grieve in peace.”
John was right, Dorian thought. Nathan was dead, and he, Dorian, didn't want to die.
“John, I have to tell you something,” Dorian said softly. They had pulled up into John's garage space, but although John put it in park, neither one of them made a move to get out. “I did something. When I thought I might be turned off.”
“What, Dee? Can't be that bad.”
“I hacked your security system to record everything you say or do in your own house and send it to me. And I had to hack my most basic programming in order to do that. Which really shouldn't have been possible in the first place, but after listening to Ms. Lin today, I suspect the Moral Arbiter had already been compromised.”
“You spied on me? Like Insyndicate?”
“You started talking to me again, so I never downloaded any of it.” Dorian almost whispered the words. “But yes. You were conspiring with Rudy, and talking to him, but not to me. And I was angry and afraid, although that's not really an excuse. I'm sorry.”
John let out a breath, but he didn't seem nearly as betrayed as Dorian thought he'd be. Or thought he should be.
“Yeah. And I shouldn't have let Rudy rummage in your head like that. I knew you would hate it, but I encouraged him to do it anyway. Because you were breaking, and what was I supposed to do? You're not the only one that was scared. So I think we're even on the spying, okay?”
“We should just talk to each other.”
“Talking? Each other? What an idea, we never talk.”
“Well, talk about what's important, I mean.”
There was an awkward pause, only a couple of seconds too long, but enough for John to pick up on it. “You've got more, bot? Still in the car.”
“I think I want Rudy to fix me. Suppress the memories like before. But I don't want to go back and live there.”
“Rudy's enough to cause anyone to have a mental breakdown. But...”
Dorian cut him off. “I was wondering if you would reconsider letting me stay in your back room. I'll do my best to give you some space.”
John didn't say a word for several long moments, and Dorian mentally resigned himself to the inevitable “no.” Maybe John was still too wrapped up in his own head to let anyone else in. Maybe…
“Come here for a second, Dorian.”
“What?” Dorian asked suspiciously. John had a gleam in his eye, and Dorian instinctively bobbed backwards.
“What what, I'm not going to bite. C'mere.”
Confused, Dorian leaned forward. John cupped his face, running his thumb along the cheek matrix, then bent in for a kiss. It was soft and tentative, and Dorian – who had never kissed anyone, despite having memories of someone else doing it in his head – simply imitated him. It was a strange feeling, because while the actual sensation of the kiss was curious and unappealing, the emotions it provoked were stronger than any ghost memory could ever be.
“You can stay in the back room if that's what you want,” John murmured. “But you don't have to stay there either.”
And Dorian finally understood. That just as he had to move on from his imperfect and fractured soul and try to live, so did John. That his partner had his own memories from almost a second life to integrate into the damaged current one. And that he desired Dorian to be a part of it, not only as his backup and protector, but for friendship and love as well.
“Come on, let's go in,” John urged. He emerged from the car, and Dorian followed.