text | un: kd6-3.7 (misfire event)
no subject
[The message that appears in Arid's inbox is puzzling. Though she recognizes the sender, it is obviously in response to a question that she had not asked—as if intended for someone else.
If Arid were more tactful, she would acknowledge it as none of her business and respond with nothing more than a notification that the message was sent in error. However, Arid is not more tactful.]
You are unable to harm humans?
no subject
Outside of instances where my orders supersede the directives related to preserving human life, yes.
[ Between Rapture, the hyperviolent townspeople last month, and the lack of a handler to provide such orders, this limitation has been a struggle. If not for Tech Boy's intervention, it's likely K would've already experienced his first death. And then he notices the error— ]
Arid? Sorry, that was intended for another recipient.
[ At least this isn't nearly as awkward as it would be if that had been sent to an unsuspecting human... ]
no subject
[It does not occur to Arid that perhaps she should apologize for prying into matters that are not her concern.]
Have the humans here attempted to harm you before? Are you at all able to defend yourself in the event of an attack?
no subject
The townspeople, last month. Possibly other Sleepers dressed like townspeople, but I'm not positive.
[ And it presumably wouldn't be the fault of the Sleepers, if they were drugged and brainwashed at the time. The latter question gives him pause. He's reluctant to admit his frustration with his programming, but if anyone might understand... ]
Against non-humans, yes. Against humans — minimally. Not as well as I'd like.
no subject
[It goes without saying that it displeases Arid to hear that humans have been targeting KD6-3.7 when he is unable to defend himself. However, it does not surprise her.]
You say that direct orders supersede this rule. Is there anyone in this system authorized to give these orders?
no subject
Are you familiar with the Technical Boy? It's likely he possesses the ability, but I think it would distress him to issue me orders of any kind.
no subject
[It seems the administrator makes a habit of offering his assistance to Deerington’s captive AI.]
In my system, another administrator was able to directly edit my software to override one of my rules. Does the Technical Boy have this capability?
no subject
He does. We've briefly discussed the possibility. It presents a different problem if I'm ever returned to my homeworld.
I'm required to regularly submit myself to testing that analyzes my operational stability. Androids freed from the constraints of their programming are deemed defective and summarily retired. That's the human-coined term for executing us.
[ Would dying free be better than living enslaved? ]
no subject
[Of course humans would seek to destroy any AI capable of defending itself from them. Arid feels a renewed flare of anger at their casual cruelty.]
They call it “depurposing” in my system.
[Changes to his software might help protect him here, but he would be killed for them upon his return. Another way must be found.]
If permanent solutions are not acceptable, then stopgaps may be required. [A pause. And then, tentatively:] My original function was to assist and protect.
I could protect you.
no subject
[ The offer is incredibly touching, and K spends several moments simply staring at his Fluid's screen, rereading the message, letting the words sink in. Letting himself be affected by them and feel. For as often as he doubts the authenticity of his emotions, moments like these are what convince him that maybe they are real. ]
I'm very appreciative of your offer.
But I'd never want you to put yourself in harm's way for me, nor would I want to... use you. For me, that would be far worse than anything a human could do to me.
[ He can't bear the thought of anyone he cares about getting hurt because of him — and they will, if he doesn't take the necessary steps to make himself less of a liability... ]
Maybe if I can also protect you. I know you probably don't need it, and I'm more limited in what I can do. But protecting each other would be a preferable arrangement.
[ That's what friends do for each other, after all. And this helps solidify his decision. He'll have to revisit that conversation with Tech Boy later. ]
no subject
[She cannot fault KD6-3.7 for the kindness of his offer, and she more than understands his reservations about using a fellow AI. Still, the notion that he should protect her seems highly illogical. She was built to protect—he was not. But if he insists on a mutual arrangement, perhaps she can convince him to help her in less combat-oriented ways.]
Earlier, you suggested learning how to help maintain my combat suit. If you assist in maintenance and I assist in defense, it would be a mutually beneficial exchange.
no subject
[ There are ways of protecting someone beyond the physical, but he figures there's no need to get into that just yet. ]
I guess you have a point. Might as well play to our strengths. But unless a human is involved, I'd still prefer you don't put yourself in harm's way for me.
Thank you.
The books available at the library don't seem to cover anything as advanced as your suit, but what they do provide will probably be a good foundation from which to learn more.
[ Guess who's learning engineering, in other words. ]
no subject
[Humans will likely be the most prevalent threat in Deerington anyway, given how much they outnumber its synthetic population. As for synthetics and non-human organics… Arid will simply have to hope that KD6-3.7 exercises caution.]
When you are ready, I can provide information on the various components of my suit. Other synthetics may also possess more advanced engineering knowledge than the books in this system.