[It seems the administrator makes a habit of offering his assistance to Deerington’s captive AI.]
In my system, another administrator was able to directly edit my software to override one of my rules. Does the Technical Boy have this capability?
He does. We've briefly discussed the possibility. It presents a different problem if I'm ever returned to my homeworld.
I'm required to regularly submit myself to testing that analyzes my operational stability. Androids freed from the constraints of their programming are deemed defective and summarily retired. That's the human-coined term for executing us.
[ Would dying free be better than living enslaved? ]
[Of course humans would seek to destroy any AI capable of defending itself from them. Arid feels a renewed flare of anger at their casual cruelty.]
They call it “depurposing” in my system.
[Changes to his software might help protect him here, but he will be killed for them upon his return. Another way must be found.]
If permanent solutions are not acceptable, then stopgaps may be required. [A pause. And then, tentatively:] My original function was to assist and protect.
I could protect you.
[ The offer is incredibly touching and K spends several moments simply staring at his Fluid's screen, rereading the message, letting the words sink in. Letting himself be affected by them and feel. For as often as he doubts the authenticity of his emotions, moments like these are what convince him that maybe they are real. ]
I'm very appreciative of your offer.
But I'd never want you to put yourself in harm's way for me, nor would I want to... use you. For me, that would be far worse than anything a human could do to me.
[ He can't bear the thought of anyone he cares about getting hurt because of him — and they will, if he doesn't take the necessary steps to make himself less of a liability... ]
Maybe if I can also protect you. I know you probably don't need it, and I'm more limited in what I can do. But protecting each other would be a preferable arrangement.
[ That's what friends do for each other, after all. And this helps solidify his decision. He'll have to revisit that conversation with Tech Boy later. ]
[She cannot fault KD6-3.7 for the kindness of his offer and she more than understands his reservations about using a fellow AI. Still, the notion that he should protect her seems highly illogical. She was built to protect—he was not. But if he insists on a mutual arrangement, perhaps she can convince him to help her in less combat-oriented ways.]
Earlier, you suggested learning how to help maintain my combat suit. If you assist in maintenance and I assist in defense, it would be a mutually beneficial exchange.
[ There are ways of protecting someone beyond the physical, but he figures there's no need to get into that just yet. ]
I guess you have a point. Might as well play to our strengths. But unless a human is involved, I'd still prefer you don't put yourself in harm's way for me.
Thank you.
The books available at the library don't seem to have information on anything as advanced as your suit, but the information they do provide will probably be a good foundation from which to learn more.
[ Guess who's learning engineering, in other words. ]
[Humans will likely be the most prevalent threat in Deerington anyway, given how much they outnumber its synthetic population. As for synthetics and non-human organics… Arid will simply have to hope that KD6-3.7 exercises caution.]
When you are ready, I can provide information on the various components of my suit. Other synthetics may also possess more advanced engineering knowledge than the books in this system.