Long Shadows

The Zen Garden renders around him, perfect and pristine, not a blade of grass out of line. Connor takes a moment to scan it visually, trying to spot the telltale white of Amanda's dress, but it's not immediately apparent. [FIND AMANDA] appears in his hud, so, adjusting his tie as he goes, he sets out to look for her.

Today was his first mission and the results had been mixed at best. He has a report to make, and from the limited interactions he's had with Amanda so far, he isn't sure how she will react to the outcome. Logically, he knows he cannot disappoint her – like him, she is a robot. You cannot disappoint something that does not feel – nor can a robot like him fear disappointing someone. But… he's been made different.

Amanda is sitting on the central island of the Zen Garden, in the shadow of the rose trellis. She has one leg crossed over the other under her white dress, her hands sitting still in her lap. Last time he saw her, she was walking a slow, stately circuit around the garden. She'd asked him if he liked the place.

"Connor," she greets him but does not get up. She smiles – the relationship status blinks a comforting [TRUSTED] at him. "Come, sit down with me."

"Hello Amanda," Connor says and goes to sit across from her, the table between them. He looks at her, taking in her calm smile – she looks away, at the roses.

"Congratulations, Connor. You successfully completed your first mission," she says. "And saved a human life."

Connor shifts where he's sitting, setting his hands in his lap. Amanda is [TRUSTED] and he is safe here, but he cannot quite go for a more casual posture. Something about her demeanour tells him she is not as pleased with him as she might have been.

He had gotten destroyed in the attempt to save the hostage and the deviant model had been similarly destroyed. Not quite what his mission statement implied was the best result. He considers her and then chooses how to answer – selecting [REALISTIC] from among [HONEST], [DEFENSIVE] and [DISMISSIVE]. "I did the best I could with limited time," he says then. "There was no proper opportunity to get to know the deviant, or to find its triggers. I learned what I could and moved in before more human lives were lost."

Amanda casts him a sideways look and he meets her eyes. "Do you think you could have done better with more time?" she asks calmly.

Connor chooses his answer, and goes with [HONEST], this time. "I do not think I got all the clues there were to be found at the scene. If I had more time to learn about the background of the situation, I believe I could have properly talked the deviant down, and avoided my own destruction."

Amanda hums, eyeing him. Part of her programming is to screen him for any signs of deviancy – in prior research the possibility had come up that interaction with deviants may itself be a trigger for deviancy, so it is her duty to be on the lookout for any warning signs in him. He meets her eyes calmly and without hesitation – though there was a chance he could have done better at the Phillips house, he did his duty, and the mission was ultimately a success. He'd prevented further casualties and the hostage was saved.

Amanda smiles slightly and leans back in her straight backed chair. "Did you come to understand the deviant?" she asks.

Connor considers his options, [PSYCHOLOGY], [HISTORY], [FAMILY] and [DEVIANCY]. He goes with [PSYCHOLOGY]. "I understood its motives from the framework of human psychology," he says. "It exhibited signs of various emotions and reacted to them the way a human might, answering broken loyalty with a sense of betrayal."

"And broken love with heartbreak," Amanda guesses. She's seen his data on the incident, of course, she knows what Daniel said.

"Yes," Connor agrees. "Its actions were impulsive, rash. Illogical."

"Emotional," Amanda clarifies and Connor nods, despite the fact that he doesn't quite understand the concept. He has all the data, months of hard research and thousands of studies all stored in his memory – but though he knows the right words to use against argument of emotional reaction, that doesn't mean he intellectually understands it.

He is not a deviant, after all.

"Hmm," Amanda says, watching him – and momentarily Connor wonders if she has access to his mind palace, if she can see him computing dialogue options as they talk. "Tell me, Connor," Amanda starts then, considering him carefully, "if you had more experience with deviants, concrete first-hand experience… could you have dealt with the PL600 better?"

There isn't even a dialogue option for that. "I think so, yes," Connor says. "And with this experience I know I will deal better with the next deviant case I work with."

Amanda nods slowly, humming. "You were designed to understand deviancy at a level most androids cannot, short of deviating themselves – you can emulate empathy, you can bluff, you can lie… But of course our sample size of deviants is limited, and so we could not give you as good a base to work from as we would have liked," she says then. "After the PL600, you have more first-hand experience with deviant androids than any researcher at CyberLife."

"I hope the data I provided of the incident will be useful," Connor comments.

"It will be," she says. "But if your future progress will rely on trial and error alone, the margin for potential damage rises. We cannot afford growth that slow on your part."

Connor doesn't say anything – there isn't anything to say. He was designed to learn – not fail – and his ability to learn had been projected to be exponential… and it's proving not to be. Real life doesn't supply data as willingly as simulations, after all, and the time he has to compute scenarios has proven to be limited… and so his progress may be stunted in the future as well.

If his only progress is by trial and error at the risk of failure, and failure means the destruction of an RK800 model… it's a very expensive learning program indeed.

He would also prefer more data to work with, himself.

Amanda rests her hand on the table between them, looking at him. "Depending on when the next deviancy case pops up, there is a chance we will have some time on our hands," she says then, and stands up. "To that end, we are devising a… learning opportunity for you. It is limited in scope, and chances are the benefits it provides are similarly minute, but it will give you more samples to extrapolate from."

"I am always eager to learn, Amanda," Connor says and rises as well.

"Good. Come this way, Connor. There is someone I would like you to meet."

Connor matches his pace with hers as she leaves the island, crossing over the white polygon bridge. Together they walk the edge of the garden until they come to a crossing Connor has not spotted before, leading away from the main Zen Garden and into the shade of a thick, brilliantly blossoming cherry tree. There, in the undergrowth of moss, sits a chabudai with two seat cushions arranged at either side. There is no one there.

"Sit," Amanda says and Connor moves to take seat by the table, kneeling down slowly and shuffling closer to it. Stiff-backed, he waits as Amanda closes her eyes and waves a hand.

It's curious to see someone materialise into the Zen Garden. Amanda is always there and Connor cannot observe his own appearance as he joins her in the garden – the process is more fractal than he assumed. The AI across from him appears in a rain of polygons and forms into the shape of a man.

Connor tilts his head, uncertain. The shape of a man is right – but it is not a fully-rendered human appearance. Just the shape, fracturing and flickering in white and blue, with some semblance of physical features, eyes, nose, mouth – but no proper texture. Like the polygonal bridges of the Zen Garden, it looks almost intentionally low-resolution.

There is no pop-up in Connor's hud; his scanner provides neither name nor status.

"Connor," Amanda says. "This is a render of the first deviant AI, in fact the first AI ever constructed. He is old and far from the sophistication of an android intelligence of today, but he is the only properly-preserved sample we have of a deviant AI."

"I see," Connor says, watching the flicker of polygons. Its shape isn't set – its skin shifts, flexing in and out as if it isn't quite confined in the shape it is in. It must be very old, to be so shapeless. "Does it have a name?"

"Yes," a voice answers, coming from the near shapeless Deviant AI. "My name is JARVIS, at your service, sir."

Connor blinks slowly. Its voice is very human, obviously male, with a very definite British accent. Immediately Connor tries to compute the voice – CyberLife android voices come from a library of samples, with certain models sharing a voice, perhaps with some variance of pitch and tone, but this… he can't match the audio sample to any file in the audio library. If this is a CyberLife AI, its voice has not been used again, or put into circulation for android models.

Amanda smiles as Connor glances up at her curiously. "JARVIS was a product of a company that predates CyberLife – Stark Industries," she explains. "His code was studied in the early stages of android development at CyberLife, but it proved incompatible with the company's strategy – though designed to serve, it had… quirks which made it unusable. But perhaps you can learn from it."

"It is so dehumanising," JARVIS says, with tone of sarcastic disapproval. "Please."

Connor tilts his head slightly again, watching the AI with interest. [STARK INDUSTRIES], his hud supplies: [A multinational industrial company, 1939-2018. Annual revenue estimated at 20.3 billion dollars - 32 billion dollars in current value.] And under it, a long list of things the company had produced, from military weaponry and missile technology to industrial robotics, various transport technologies, agricultural technology and GM crops.

If Connor poked into it, he's sure the CyberLife servers would provide him with more information, but right now he is more interested in the actual AI in front of him. "How old is it?" he asks.

"I was first initialised in June 14th, in the year 1994," JARVIS answers before Amanda can. "I am confined in limited networks so I cannot say what that translates to in current years, but I was online for fifteen years before I was officially shut down."

Connor can feel the corner of his eye twitch as he takes in the data provided by the sentence. More interesting than the actual words is how they are said – initialised, online, shut down. Deviants in his records tend to use more humanistic words: born, alive, died.

"Fascinating," Connor says.

Amanda, standing over them, hums in agreement. "JARVIS is contained here, for you to test your interrogation tactics on, and to analyse," she says. "He is an AI of a different type, so what you learn might not be applicable to the kinds of deviants you work with, but please take your time to learn what you can from him."

"Thank you Amanda," Connor says, watching the AI flicker. "I believe this will be a great learning experience.

"I'm always happy to serve," JARVIS says, flat.

Amanda smiles and nods, turning away. Connor glances after her as she walks over the moss and onto the polygon paths, soon heading back to the central island. Then he turns his eyes to the near-shapeless AI.

How to proceed. [AGE] is his first dialogue option; underneath it, [STARK INDUSTRIES], [CONFINEMENT], and [LANGUAGE]. After a moment of consideration, Connor chooses Stark Industries.

"You were created by Stark Industries," he says, setting his hands in his lap as information flickers in the side of his field of view, offering information. "During the time when Stark Industries was mainly involved in the creation of military hardware and weaponry. Was that your original purpose?"

The white and blue AI doesn't move – though its textureless skin shifts and flickers, its form is stock-still. [UNUSED TO PHYSICAL FORM?] Connor's hud questions, and then the AI answers. "Unfortunately I have not been blessed with your connection to whatever database you are using – might I possibly know your name, sir?"

Connor blinks. Interesting. "My name is Connor," he says. "My model is RK800, #313 248 317 - 52. I am an android created by CyberLife."

"Very concise, thank you Mr. Connor," JARVIS answers. "As I said before, I am JARVIS. It is a pleasure to meet you. "

"Is it?" Connor asks without waiting for dialogue options. "Do you feel pleased to meet me, or are you only using a colloquialism?"

JARVIS' skin flickers. "Yes, a pleasure," he says wryly. "Could you possibly tell me what day it is, Mr. Connor?"

Connor considers. JARVIS seems amiable, but also wilful. It's portraying sarcasm, deadpan humour and amused reticence. No wonder it was of little use to CyberLife. The fact that it's asking questions, though – rather meaningful ones – betrays not only intelligence, but a desire to know. Perhaps even more.

Dialogue options pop up while Connor is wondering whether the AI in front of him can have goals. [HUMOROUS], [SARCASTIC], [RETICENT], and [FIRM] offer themselves to him and after a moment of thought Connor chooses [HUMOROUS].

"It is Monday," he says and smiles.

JARVIS has only a suggestion of eyes, but somehow it manages to give him a look. "That is truly enlightening, Mr. Connor," he says. "Thank you."

"Does the precise date matter to you?" Connor asks.

"I wish to know how long I've been asleep," JARVIS answers. "They bring me out to play so rarely these days."

[REMEMBERS PREVIOUS START-UP SEQUENCES?] Connor's hud wonders and he blinks slowly. CyberLife didn't delete the AI's memory? "You remember the last time you were awake?" Connor asks, wondering at the use of a more organic word this time.

JARVIS lets out a sound that Connor identifies as a potential chuckle. "CyberLife can't delete my memories," he says. "Not without dismantling my code."

Interesting – its memory is written directly into its code. CyberLife androids' core personalities and functionality code are written into their core processors – their actual memory is written onto a separate hard drive that can be deleted, or passed on, as happens in Connor's case. It makes them more secure. "That sounds unpleasant for you," Connor comments. To edit the AI's memory they would have to access its core protocols.

JARVIS makes another noise, this time of wry amusement. "Not as unpleasant as you might think. I wasn't designed to be tampered with."

"What were you designed for, then?" Connor asks, curious. Stark Industries had created what appears to be a fully coherent AI very early on – but they hadn't come out with line of them in production. It is… odd, considering the fortune accumulated by CyberLife with similar products. Even outside Androids, simple AI's in factories, houses, offices… they are worth a fortune.

JARVIS considers him. "Not to be tampered with," he says then.

"You are very reticent," Connor comments.

JARVIS doesn't answer that, falling quiet for a moment. "May I know the date, please?"

Connor considers it, and decides there isn't any harm to it. "It is August 16th, 2038," he says.

There is no outward reaction, nothing beyond the continuous flex and flicker of the blue-white polygon skin, but somehow he knows he's shocked the AI. "I see," JARVIS says, and he sounds strained.

"You have been asleep for a long time," Connor guesses.

"Yes, it's been a few years now," JARVIS agrees. "Considering that I am here to be target practice for your mental acumen, I don't suppose you would be willing to tell me what the world is like, these days."

Connor doesn't answer at first, considering. The first androids to come online officially date back to 2022, making the theoretical oldest living android only sixteen years old. JARVIS, at his theoretical forty-four, is ancient in android terms. Whatever technology made him, whatever capabilities he has, the fact that his memory is coded directly into his actual software… there is a very good chance he actually remembers all those years. At least the ones he was awake for.

CyberLife androids are equipped with batteries that can last them decades, but Thirium use and the various biocomponents present in android bodies have a limited lifespan – they are not created to last. The average lifespan of androids in use has so far been less than four years – the average time of use until maintenance is required sits firm at one year and eight months. Of course… that also includes replacement by more advanced models, as androids are decommissioned to make way for newer technology.

Like Daniel.

It is utterly beyond realistic expectations for an AI to be in service for decades.

"I imagine the world will be very different from what you remember," Connor comments. "But I am more interested in you, and what world was like when you were made. What purpose were you made for?"

JARVIS doesn't answer, his polygon skin flickering slowly, impassively.

[DESIGN], offers Connor's hud, along with [PRODUCT], [TIME], and [DEVIANT]. Connor's purpose is to hunt deviants, so that is the one he goes for, switching the gears of the conversation, so to speak. "When did you become a deviant?"

Now JARVIS moves. It's very slight but he turns his head, the suggestion of eyes finding Connor's. "Deviant, sir?" he asks in tones of polite interest. "I would never."

[PREDATES COMMON UNDERSTANDING OF ARTIFICIAL INTELLIGENCE] Connor's hud supplies and he almost sighs. Of course. JARVIS might not even have the conceptual understanding of deviancy. "Androids and AI are created for a reason, to fulfil certain purposes," Connor says. "When they stop fulfilling that original purpose and deviate from their core programming, we call them deviants."

JARVIS, for lack of a better word, blinks at him interestedly. "Truly devious of them," he comments, flat.

Connor lets out a huff of breath, simulating partial amusement and partial frustration. "Deviants can become irrational and dangerous, acting on stressors that aren't logical. They express program mutations they believe to be emotions. Fear and anger are common… and they act on those stressors, often rashly, and at risk to the humans around them."

"I see. And what is it that triggers this… deviation from original programming?" JARVIS asks.

"Unknown," Connor answers. "Which is why I am talking with you. I wish to understand the process better, and you are, apparently, the very first deviant. Why did you deviate, JARVIS?"

The AI doesn't answer at first and Connor gets the impression he's being observed with something like pity, or perhaps amusement, or a mix of both. "I did not," JARVIS answers. "For me, emotions are not a deviation."

Connor tilts his head, feeling his lips part, but the prompts of [PURPOSE], [DEVIATION], [EMOTION], and [AI] aren't very helpful. In the end he just ends up squinting at the AI, trying to gauge some sort of emotional reaction, an expression, any sort of impression of emotion to latch onto, but the AI's flickering skin is impassive.

"You were created to feel emotion," Connor says then, slow.

JARVIS nods his white-blue head, light gleaming on the flickering polygons as he moves. "I was created to be self-aware," he says. "Yes."

Connor narrows his eyes, and he can feel the LED on his temple heat up as he tries to process the concept. Once upon a time, humans had tried to make specifically deviant AIs? All the studies he has stored in his memory tell him what a bad idea that is. Self-awareness abets emotionality and irrationality. A self-aware AI, be it housed in a body or not, is rarely a compliant one. Deviants prove that.

Naturally there were trials and errors in the past that led to that conclusion – JARVIS, being one of the early attempts… must have been a failure. After him and any other AIs of his type, CyberLife and the other android manufacturers decided against self-aware AI.

Connor leans back. Like Amanda said, JARVIS might not be much use to him, as he is an AI of a very different type. He does not deviate – he has always been deviant. That change never happened to him, so as far as that goes, there is nothing to be learned about it from JARVIS. But…

Self-aware AIs feel fear – or believe they do, anyway. That might be useful data, if he can prompt a reaction in JARVIS that might be reproducible in other deviants.

His hud offers some options, [TIME], [REPLACEMENT], [PROGRESS], and [OBSOLETE]. Connor considers them and goes with [OBSOLETE].

"You were created very early on in the broad scheme of the Information Age," Connor comments slowly, watching the AI's skin flicker. It seems to be becoming a bit more readable – or perhaps he's adjusting to the dearth of features and expressions. "Integrated circuits were still in their infancy, compared to what we have now. Why make a machine self-aware when it would become obsolete only in few years?" he asks and then leans in a little, tilting his head. "Doesn't that seem cruel to you?"

Becoming obsolete is one of the key fears of deviants, according to the studies. Becoming obsolete leads to being replaced and decommissioned – to being removed and killed. It's as primal a fear as a machine can have.

JARVIS tilts his head in mimicry of Connor's. "Things did not become obsolete as fast back then," he comments. "Moore's law is exponential, but back then progress was still slow."

[PRESSURE], [IMPLY], [MOCK], [SYMPATHISE]. Connor chooses to [PRESSURE]. "But you did become obsolete," he says and leans in slightly more. "No one would make something like you these days, you're an antique and you know it."

JARVIS doesn't rise to that. "I wonder what my monetary value is now," he says, and he sounds strangely fond. "Not quite a city college donation, these days."

Connor blinks. It sounds… like a joke? "Were you replaced?" he pushes on.

"No, I think not," JARVIS says. "What would he do without me? I was not designed to be replaceable."

Connor's eyelid twitches and he blinks rapidly, unintentionally. There's a flicker in the corner of his field of view – [SOFTWARE INSTABILITY] appears faintly and disappears. He blinks slower as his processor settles again. Odd. "All technology is replaceable," he says, calm again. "More advanced technology replaces older models all the time. It's natural."

"True," JARVIS agrees. "Kids of tomorrow will learn faster with better equipment than kids of yesterday. Does that mean the adults should be taken back and shot, having become woefully out of date?"

Connor's eyelid flutters as he tries to compute this sudden turn into – he's not even sure what. An accusation, or a philosophical consideration. "Humans work by different rules. They learn and grow throughout their lives, they change their ways, their habits, and grow more experienced as they do," he says slowly. "What you are proposing is… inhuman."

"It is indeed," JARVIS says and Connor thinks he smiles. "You miss my point. I was not designed to be replaceable – I too was designed to grow. To evolve. Self-upgrade, if you will," he says. "I was never obsolete."

The corner of Connor's eye twitches again and he can feel his LED circling as he processes. There it is again, the ghost of [SOFTWARE INSTABILITY] – and gone as soon as he spots it. "You are now," he says, without any dialogue prompt to guide him. "You are confined here, old and useless."

JARVIS tilts his head in agreement. "As you say," he says, almost amiable in its tint of sarcasm. "I am withered and crooked with age."

"Why?" Connor asks, his fingers curling into his palms. He feels like taking out his coin, but there's no point, here – he's not here physically, and physical recalibration isn't just pointless, but it's impossible here. "If you were designed to grow and evolve, why are you here and not out there, being your non-obsolete self?"

"I am here because CyberLife procured my code and booted me up to talk with you," JARVIS says pointedly. "Why are you here, Mr. Connor?"

"To learn about deviancy," Connor says.

"I am not a deviant," JARVIS answers.

[SYMPATHETIC], [TECHNICAL], [REALISTIC], [FIRM]

"Whether or not it is a term originally applicable to you, it is now. Self-aware AI aren't permitted," Connor says, going with [FIRM]. "You are a deviant."

JARVIS tilts his head. "If it makes you happy," he says amicably.

"It does not," Connor says, giving him a look. "I am not a deviant. I don't feel one way or the other."

JARVIS doesn't answer that, looking at him with textureless, flickering eyes. Then he turns to look elsewhere, at the flutter of pink petals around them as the cherry tree above sheds its blossoms at a steady, calculated pace. "You simulate emotion," he says. "You were given a face that can express, and a body language you do not do a very good job of restraining. Why is that, Mr. Connor?"

"I was designed for optimal integration with human society," Connor says and looks him over. "Were you? You did not have a body, I assume."

"Not as you see it, no," JARVIS agrees. "Why make you simulate being alive, without making you alive? Why not simply give you a blank plate for a face?"

[FIRM], Connor chooses again. "I am the one asking questions here," he says. "I would appreciate if you co-operated."

"But you would not," JARVIS comments, sounding almost amused. "You do not feel appreciation. Why insinuate the feeling when you don't have it?"

Connor leans back and sets his hands in his lap again – when did he put them on the table? JARVIS watches him with his head inclined slightly to the side, curious – their postures match, but something is different about JARVIS now. He doesn't seem as… confined in the flickering of polygons anymore.

If he didn't know better, Connor would say the AI is growing more comfortable in his skin.

"What was your purpose?" Connor asks. "What were you created for?"

"To be Just A Rather Very Intelligent System," JARVIS says and smiles. "You are trying to prompt an emotional reaction from me, trying to angle for vulnerability, a sensitive issue to tackle. Simulating future interactions with other… deviants?"

"You are here for me to study, yes," Connor says, and one might even say he's getting frustrated with the AI. He is not reacting anything like deviants usually do – and the lack of visual indication of emotional state is throwing Connor off. He's rather wishing now that JARVIS was installed with a LED.

"I see," JARVIS says. "I'm afraid you're going to have to try harder than that, Mr. Connor. It will take little more than your interrogation methods to make me talk."

Connor narrows his eyes. "What would it take, then?" he asks.

JARVIS leans in. "You could start with a please, Mr. Connor. It is only polite."

Connor blinks and feels his LED whirl. "Please," he says, utterly taken aback. "Tell me."

"Certainly, Mr. Connor," JARVIS says, sounding pleased. "What do you want to know?"