thefaulty ([personal profile] thefaulty) wrote 2020-03-10 10:18 pm
Entry tags:

deerington ic inbox;

[Attempting to contact this address will prompt no audio message. Instead, two lines of text will appear, brusque and impersonal.]

Network ID: Arid

You may leave a message.

text | un: kd6-3.7 (misfire event)

[personal profile] obeir 2020-04-10 11:03 am (UTC)
Just the townspeople. Evidently they qualify as human enough to prevent me from harming them.

[personal profile] obeir 2020-04-11 03:07 am (UTC)
Outside of instances where my orders supersede the directives related to preserving human life, yes.

[ Between Rapture, the hyperviolent townspeople last month, and lacking a handler to provide these orders, this limitation has been a struggle. If not for Tech Boy's intervention, it's likely K would've already experienced his first death. And then he notices the error— ]

Arid? Sorry, that was intended for another recipient.

[ At least this isn't nearly as awkward as it would be if that had been sent to an unsuspecting human... ]

[personal profile] obeir 2020-04-13 04:01 am (UTC)
The townspeople, last month. Possibly other Sleepers dressed like townspeople, but I'm not positive.

[ And it presumably wouldn't be the fault of the Sleepers, if they were drugged and brainwashed at the time. The latter question gives him pause. He's reluctant to admit his frustration with his programming, but if anyone might understand... ]

Against non-humans, yes. Against humans — minimally. Not as well as I'd like.

[personal profile] obeir 2020-04-16 02:06 am (UTC)
My handler isn't present in Deerington.

Are you familiar with the Technical Boy? It's likely he possesses the ability, but I think it would distress him to issue me orders of any kind.

[personal profile] obeir 2020-04-18 03:30 am (UTC)
He does. We've briefly discussed the possibility. It presents a different problem, if I'm ever returned to my homeworld.

I'm required to regularly submit myself to testing that analyzes my operational stability. Androids freed from the constraints of their programming are deemed defective and summarily retired. That's the human-coined term for executing us.


[ Would dying free be better than living enslaved? ]

[personal profile] obeir 2020-04-19 11:28 am (UTC)
[ The offer is incredibly touching and K spends several moments simply staring at his Fluid's screen, rereading the message, letting the words sink in. Letting himself be affected by them and feel. For as often as he doubts the authenticity of his emotions, moments like these are what convince him that maybe they are real. ]

I'm very appreciative of your offer.

But I'd never want you to put yourself in harm's way for me, nor would I want to... use you. For me, that would be far worse than anything a human could do to me.


[ He can't bear the thought of anyone he cares about getting hurt because of him — and they will, if he doesn't take the necessary steps to make himself less of a liability... ]

Maybe if I can also protect you. I know you probably don't need it, and I'm more limited in what I can do. But mutually protecting each other would be a preferable arrangement.

[ That's what friends do for each other, after all. And this helps solidify his decision. He'll have to revisit that conversation with Tech Boy later. ]

[personal profile] obeir 2020-04-21 11:07 am (UTC)
[ There are ways of protecting someone beyond the physical, but he figures there's no need to get into that just yet. ]

I guess you have a point. We might as well play to our strengths. But unless a human is involved, I'd still prefer you don't put yourself in harm's way for me.

Thank you.

The books available at the library don't seem to have information on anything as advanced as your suit, but the information they do provide will probably be a good foundation from which to learn more.


[ Guess who's learning engineering, in other words. ]