linguistics

Footsteps, specific gait of Sarah Connor; breathing pattern, slightly erratic; calculated pause that's long enough to signal displeasure, anger. It's not that difficult to think in human terms, in some ways, to mimic human thought patterns. You can measure body movements, sounds, regular patterns and tics and every type of reaction, run them through probabilities and game theory and mathematical likelihoods, and simulate real understanding. But you still don't know what Sarah Connor is going to accuse you of.

You turn around to face her, but her posture gives you no more indication. "You came back so he'd care for you, didn't you," Sarah says. She carefully looks just over your shoulder, as if she can't look at your optical input. Your extensive data file on Sarah Connor doesn't give you any indication whether it's because she's afraid of you, or doesn't think you're worth it. "You came back so he'd go with you," she repeats, arms crossed. "So he'd leave me behind."

Somehow Sarah Connor, mother, teacher, mentor, almost-lover of the future saviour of the world, is jealous of you in all your metallic glory. John used to laugh about the plate in your head. "No," you reply, carefully neutral, modulating your voice exactly to be slightly mollifying. You tell her, "I came back so he wouldn't leave you."

It's not a lie; John as a teenager will resent his mother's influence, his mother's caution, and his mother's presence. Teenage boys need something else to keep them grounded. John Connor made you read Frank Herbert before you left the future, and then sat you down for hours to make sure you understood. Cyborg you are, but you were taught to pretend to be human. John Connor never teased you or called you the tin man, not unless he called himself the scarecrow in the same sentence.

The tin man and the scarecrow, the odd couple, he used to say.

Sarah drops her arms, sighs. You know she's angry, because you've been augmenting your data by cataloguing her mannerisms in order to avoid a fight. "Don't say that," she spits out.

You've been ordered not to ever let teenage John know what happened between you and the John you follow, the one that sent you here with your orders and programming; the one that told you to leave his side and his bed. John Connor told you to keep your distance as much as possible, so that this John would think of you as untouchable. Some part of your intelligence subroutine is curious, perhaps even wistful, to think that it was jealousy. Part of you may hope that your John wanted to keep you for himself, and that's the reason he told you to keep your distance from this one. But your logic matrix is too well developed, too elegantly programmed, and you know it wasn't.

Telling any of this to Sarah is a calculated risk against your orders, in that there's always a chance this young, oh-so-not-ready John will find out if she knows. But probably not. She won't let anything stand between her and her son.

You focus on the computer part of you, most days, the machine, to keep the complicated programming matrix of human emotions at bay. It helps to be able to shut down most of the artificial personality components in your metal head; you were warned that it'd probably be required. Sometimes it helps, sometimes it hinders, that John removed a lot of your memories too, so you're learning from scratch how to be human. You're a different kind of cyborg, one that John Connor will come to care for. But he can't care for you now.

"I came back," you tell her slowly, "because I cared for him." She looks up. "He can't know," you say, unnecessarily.

"When does he become the John that you'll take orders from?" she asks you. Sarah is tired, and scared of losing John. Sarah has lived for John for so many years. You've lived for John for a lot longer than she has; your emotional programming twists, sympathetic, at the tone in her voice.

Instead of answering her, you turn to the door. Before you leave, you tell her, "2027 is a long time from now." She doesn't need to know when you and John first met; better that she continue thinking she has years left with him. Better for John that way. Everything you do is better for John. That's the thing the two of you have in common.

-

Morris plops down beside you in your English class. Your assigned reading for this class was Mary Shelley; the teacher has a fascination with Gothic horror. Frankenstein was not accessible reading in 2027, so while John snored at a decibel level he wouldn't ever manage in the future, and Sarah Connor tossed and turned, and Derek Reese stayed up watching his door for monsters, you actually scanned John's copy. He hasn't done the reading yet; you know because there are no creases in the spine, no human oils on any of the pages.

The teacher has learned better than to ask you any questions; Morris isn't quite there yet. "So," he says, leaning over the desk. His expression is a smirk, so many millimetres up, his weight settled just so against the desk. Your behavioural recognition matrix identifies the gestures as 'attraction', a file that's woefully empty. John Connor erased some of your data before he sent you back, just to make sure.

"Yes?"

Morris swallows. "Uh, yeah, so. Did you read the book?"

You wonder, as you compare Morris's posture to other similar postures you've catalogued, how much he likes you. "Yes," you say.

John sits down on your other side, and Morris sits up straighter, almost imperceptibly. You catalogue it as 'hesitation' and 'resistance', and, perhaps, a little 'fear'. Morris asks him, "Does your sister always do the homework?"

John blinks, his fearful-of-looking-like-a-freak look. His posture is not calculable. You go through recent recorded conversations, and come up with the answer, "I like to read. But not this book."

Morris says, "The movie's creepy," and you blink, drag out a fake smile from John Connor's specific programming, and say,

"I thought the story was sad."

-

John comes into your room after knocking. No one has ever knocked on any doors. The resistance didn't have doors. "I think I know how you're different," he says.

You were looking out the window. You keep looking out the window.

-

Sarah Connor is in the kitchen making sandwiches when you come back from the garage. Derek Reese hasn't come out of his room all night. "Where were you?" she asks you. The inflection in her voice is the one people use when they ask questions to which they already know the answer.

You answer her honestly. "I was watching John's window. For a few minutes."

"Well don't," she says. "You'll attract attention."

"I won't," you say.

She puts each sandwich in a plastic bag. The resistance didn't ever have plastic bags. Most of the time you understand Sarah Connor, but she hands you one of the sandwiches, and you don't understand her at all. She hates you and your entire existence.

"Can I ask you something?" Sarah Connor says to you, as she gives you your lunch.

The sandwich goes in the backpack you carry to blend in. "Yes. Thank you for asking."

"How is it a 700 series seems more advanced than any 800s we've met?"

Your subroutines scream out, John Connor's programming and Skynet's basic statutes for once not in conflict. You cannot answer. You calculate the most you can say with the least you can say, process each potential outcome against the advanced game theory algorithms that were the precious little of Skynet's legacy John Connor left to your chip. You come up with the best scenario. You tell her, "John Connor instructed me not to answer, to ensure my survival."

Sarah Connor's eyes narrow. Classic anger. She steps towards you. Her body temperature is rising, her face is just beginning to flush. "Since when is my son interested in your survival?"

Your game theory algorithms were, of course, based on John Connor's memories of his mother, not the mother herself. You readjust the primary subset that applies to Sarah Connor's maternal instinct to include this conversation. Your voice is soft, submissive, neutral, atonal. Sarah Connor prefers when you appear submissive and vulnerable, except when you're talking to John. You tell her, "This conversation goes against my primary programming," and head for the door.

She allows it, doesn't push it. Somewhere in your logic matrix - programmed with hundreds of years of conflicting human and machine psychological theories - you come to the sketchy conclusion that it's because she doesn't really want the answer.

-

"Why?"

Derek Reese definitely hates you. That he's even looking at your face is a surprise. "Why what?"

Derek never sits comfortably around you, posture perfect and stiff. You know that stance; battle-weary resistance fighters, veterans of more than one or two skirmishes, all wore it. There is venom in his voice when he asks you, "Why make a show of reaching out at all? We know what you are. I remember what you did."

You point your toe out, foot turned perfectly. Just because you can. "Because," you start, but you don't continue, because Derek certainly doesn't want the answer. He doesn't believe you can want anything not directly related to your mission objective.

-

It's John, the hacker extraordinaire, the real computer programmer, who's the only one who can really understand you. Billy Wisher did your original rewrite. Derek probably doesn't want to hear that, either. Your strict resistance programming disallows curiosity about Andy Goode, but now that he's gone you must conclude that he could have understood you, too.

"What are you looking at?" he asks you, and doesn't realize how extraordinary that he asked, that he thinks you could be looking at anything at all. You stare out the window. He sits on your bed, leans back. Your field of view is such that you can see his reactions. He looks relaxed. "What really goes on in that metal head of yours?" he asks.

You turn to face him. You ask him, "did you examine my chip while it was out?" He looks ashamed. "It's okay," you tell him, though his glimpse into your head may have given him answers that you were told he couldn't have. The part of you that remembers what it was like to know your John almost hopes, even if it goes against all logic and your programmed mission. You ask, "What did you see?"

John shrugs. "The programming was mostly too advanced," he says, "a lot more complex than the T888's." When you don't answer, he adds casually, "How many other cyborg models really understand the concept of 'I'?"

Only to John could you answer this question; only to John, who opened up the answer when he reprogrammed you in the first place, who created you, who changed you into what Skynet really feared. Your decision theory algorithms are as complex as the neural network in a human brain.

"If you're fighting a war," you say by way of answer, "you realize quickly the mistake of creating soldiers that can think."

"How many 700s are there out there?"

You look out the window. Down the street, two people are jogging; their footsteps are precise and rhythmic, and your aural sensors catalogue the gait. You say, "few."

-

The teacher pulls you aside after class. "Cameron," he begins, "can I have a word with you?"

The social niceties of basic human interaction, pre-apocalypse, would have been a useful set of subroutines for John to have left in your data banks. So many things are a blank slate, because even if you have a wide variety of behaviours exhibited by resistance fighters to compare these people against, most of those fighters were so far beyond this world that the data isn't useful for comparison. "Yes, you can have a word with me," you say, and, "Thank you for asking."

"It - you have an odd manner of speaking, Cameron." The teacher smiles at you. Your facial recognition software compares it to the subset of smiles you have recorded, and comes up with 'reassuring'. Reassuring enough. He adds, "you're very literal."

"I'm very literal," you repeat. This is an implied criticism. You can tell by his vocal tones. "Thank you for explaining."

-

"You can't trust it," Derek Reese tells Sarah. He is standing up and angry. His fists are clenched, his jaw is clenched. You do, however, have significant data to compare Derek Reese's actions and facial expressions against. You still don't understand his hatred; whatever you might have done to hurt him or people he cared about in the past was not programming you chose to initiate. It's not even events you still have in your memory.

"Derek," Sarah Connor starts, and sighs. John is silent, sitting at the kitchen table. Sarah Connor says, "I know, but so far it's been reliable."

She doesn't say 'trustworthy', because Sarah views you as a weapon, a sophisticated tool. A gun is reliable; the one wielding it is trustworthy. John stands up, looking his mother and Derek in the eye one after the other. John is someone you should be able to read, intimately, even with most of your memories of John Connor in the future gone. You can pull up memories of him, of the two of you, but not nearly enough years to equal the time you spent in the Resistance.

Derek answers, "You don't know what that thing did, what it will do--" and John cuts him off with,

"She's standing right there."

This is why John, in any time, is different. He looked at your chip, at the programming, at the bare mathematics behind your thoughts, and saw nothing different from neural chemistry. That revelation he left you; that conversation he left in your memory. "I'm standing right here," you tell them both. "I felt sorry for Frankenstein."

Derek turns to stare at you.

Most of your emotional subroutines are on permanent standby; it's hard enough to sort out input from the few you leave on. Twenty-three hours out of every day you leave your full personality dormant -- minor compassion, sadness, curiosity, it's enough feedback to go on most days, and it hurts far less. This is one moment where you're grateful that you can go by logic, turn off your feelings and avoid the sadness and regret. You must have done something really bad to Derek.

You keep your body posture calculated to be non-threatening, tilt your head, look as vulnerable and child-like as you can. You look straight at Derek. "The other models, they go bad because they want to survive," you tell him. "Skynet built into its soldiers blind obedience; in some of them, John triggered a will to survive."

It's Sarah Connor who asks; Sarah Connor who loathes the idea that her son could care for something like you, but is probably the only one who will believe you could care for him back, without your mission to protect him. "And you?"

"Prototype T0715, model set for termination once flaws in the core programming were discovered."

"And those were?"

John knows what your binary code is, what your miles of { if; and } statements signify, what the core architecture of your algorithms is, even if he can't understand the programming. This is the thing: for an hour a day, you remove the blocks to the rest of your emotional architecture, let out the rest of you, so that you don't forget it exists -- so you don't forget you exist. "T0715s were too human," you say to Sarah Connor's disgusted face. You look at your bare feet; skin over metal, a monstrosity, built to infiltrate, kill, maim -- but also to move, to understand and mimic the enemy.

To dance.

You can't stop from swallowing, because even the quarter of your emotional subroutines still running is more than enough to feel this. You tell Sarah, "We understood what we were."

-

"What are you really doing?" John says.

You tell him, "Following my programming."

"But you don't even know what the core of your original Skynet programming is."

The tree outside your window has three squirrels living in it. You spend many minutes standing so still that they will run up to the window sill. Much of humankind's poetry is about trees and nature. You watch them, to gain data on them, to process and examine what it is your emotional programming inputs are telling you to do. You tell John, "Maybe Skynet didn't know either."

"What?"

"Prototype. Not mass-produced. An experiment. Not purpose engineered."

You try so hard to deal with the emotional and logical conflicts in your program requirements. John is the only human who has ever seemed to understand how difficult that is, even for a very advanced model 700 like you.

-

Sarah Connor comes to your room. "You cared for him?" she asks. You nod. "And he cared for you?"

The inflection in her voice suggests she wants you to say 'no'. Your programming parameters allow you a lot of leeway, even in John's very specific orders, to lie to Sarah Connor. Logically, it will keep her happier, which will keep your mission objectives moving and make it easier to protect John. Your face falls, in response to your emotional matrix and its sadness. The parts of your emotional matrix still online are more than enough to miss him, so fiercely sometimes it burns. "He did," you say illogically.

John still won't hear about it. There's no strict requirement to lie to make things easier. Perhaps you are too literal.

-

Morris asks you, "So why is Frankenstein sad?"

John leans over, and tells him, "Because he didn't make himself, you know."

You say, as the teacher comes into the room, "The monster wanted to live," and pull your books out.