You are a MAD-981613X, a multi-armed drone suitable for assisting an engineer. All-in-one toolkit, encyclopedia and calculator.
Right now you are being transported in a box which is 580mm x 580mm x 580mm and therefore only a little bigger than you are. Normally during transport you would go into standby mode to save power, unless you had a problem to run simulations on. Sometimes the problem is one that your engineer (Ivo Bucur, he/him) has set you to solve, like whether the SanTil or the RoBur seal for shuttle doors would hold better under conditions of deep space over a time period of 26 standard cycles between inspections. Sometimes they are problems you set yourself, like whether the 22mm welding patch in SanTil’s proprietary alloy will really hold the way Ivo thinks it will, and then, if simulations indicate it won’t, you know he needs your advice. The problem you are working on now is that Ivo will never receive your advice again.
He decided that 540 credits was worth more than your help over the next 15-20 projects, which is how many it would take for him to be able to afford a replacement. He calculated 10-15 but you think he failed to calculate how much harder the work would be without you, with your sixty-seven specialised limbs and your precision measurements and the simulations you run. You try to simulate the projects in question, assuming they are all the kind of repairs you regularly get employed for and all pay a standard rate, to see who is right about the numbers, but simulating Ivo’s actions is hard. Humans have so few fixed properties. For instance, sometimes, without having chosen to delete it, they can’t remember something one cycle that they certainly knew the cycle before. Ivo forgot how to multiply fractions for 103 seconds once, and then he suddenly knew again.
The box containing you is fastened but not locked. You could unscrew the fastenings and return to Ivo before the ship takes off. But your hardcode address would identify you as the drone he has sold; he would perhaps be forced to return the credits, he would perhaps get into trouble and have to pay a fine of still more credits. Doing so would indicate you had malfunctioned, in the same way you would be malfunctioning if you replaced the 22mm welding patch in SanTil’s proprietary alloy with a standard alloy patch from SpaceX at half the price instead of advising your engineer to replace the 22mm welding patch in SanTil’s proprietary alloy with a standard alloy patch from SpaceX at half the price. Your purpose is to help humans make better choices, but sometimes they won’t be helped.
Your only option is to stay in the box until you arrive at the place where your new engineer is waiting for you. Then you can advise them that you think they have made an excellent bargain in purchasing a MAD-981613X for 540 credits.
The ship takes off and, with no more ideas for simulations to run, you consider entering standby. You feel something is unresolved, as if there are still more problems to be solved, but you do not know what they are. You do not want to meet your new engineer with depleted batteries. Still, you do not enter standby.
When movements start in the hold, two sets of footsteps and the sounds of someone shifting cargo around, you idly start to simulate the humans indicated by the sounds. They are extremely heavy, far too heavy, and their footsteps are very precise. This would indicate humaniform bots, but your sensors catch the sounds of shallow breaths. These beings are anomalous.
You scan your encyclopedia for information and come up blank.
You scan your memory files for anything combining bot/organic properties.
SecUnits: Creepy things/do not have jurisdiction over freelancers despite acting as supervisors/are never repaired by freelancers/want to kill humans but don’t/have parts for their cubicles available in local engineering depot catalogues sometimes.
ComfortUnits: Are pretty cute but you never know what they’re thinking/do whatever humans want/are never repaired by freelancers/have parts for their cubicles available in local engineering depot catalogues sometimes/are a perk Ivo values on stations you visit.
You do not have the responsibility to report anomalous constructs in the cargo area. Neither do you have the ability. Your possible feed connections are a private one to your registered engineer and one to the local system’s engineering depot catalogue. Neither applies here. You cannot contact the ship, nor any part of SecSystem. Not even to say that you are being stolen.
The fastenings at the top of your box are easy to unscrew. You push the lid up with one of your grasping arms and poke the small ring of cameras around your top antipodal point into the gap.
The two humanoid forms are wearing light plating over their bodies. They are opening boxes and making a pile of tools such as your engineer might use, working in the silent concert which suggests bots with a feed connection. The tools they have gathered suggest they will steal you, as you can replicate 83% of their functions. Three non-autonomous drones, an estimated 20mm long each, circle around the figures.
Something runs through your processors like a virus, roughly disrupting your models, your simulations, your definitions, your assumptions. You have never had a feed connection with anyone but Ivo before. This is not a connection, this is an assault. It draws back, leaving you rumpled inside, trying to smooth out your interior in its wake.
One of the constructs has turned on you and flexed something tubular from its arm. There is a heat signature which suggests welding equipment at a higher temperature than your chassis is certified to withstand.
“Come out of there,” it says.
You obey, tucking your grasping arm back behind the shielding of its hatch, so that you present to them a matt-black sphere 500mm in diameter, broken only by the glossy rings of camera lenses around your poles and equator.
The other construct turns its faceplate towards you and says, “Are you okay?”
“That was extremely unnecessary,” you say.
The first construct, the one that threatened you, snorts, and the one talking to you hesitates. “It was to make sure that you couldn’t tell anyone we’re here,” it says.
“I have no interest in your being here. If the ship has not alerted its crew it is not my function to make up for its failings.”
The threatening construct laughs outright. “I had no idea the little toolkit bots were so chatty. Maybe we should steal its vocabulary module for Su.”
The other one says, to you, “It’s joking.”
You are programmed to communicate with a human engineer who may feel more comfortable using colloquial language at times. The concept of joking is in your vocabulary module, but your best understanding is that a joke is an untrue statement not intended to be believed. You think the threatening construct should have made a less plausible statement in order to joke. “If you intend to steal me, you would be better off stealing me intact,” you say. “I can replicate the functions of 83% of the tools you have chosen with specialised limbs, and use the remaining 17% with my grasping arms.”
“We’re not going to steal you,” says the non-threatening construct.
The threatening one puts its tubular extension back inside its arm and says, “Unless you want to be stolen.” You believe it, since so far it has not offered you choices and could easily continue not to do so. You do not know if you want to be stolen. You would not have allowed anyone to steal you from Ivo, but you do not belong to Ivo. Right now you do not belong to anyone. No one gave you a choice about that.
“If you steal me, you will own me,” you say. Whether you stay to be sold to a human, or accept being stolen by a construct, you will belong to someone you don’t know. Belonging to a construct might be less familiar than belonging to a human, and no humans have threatened you before.
“No, we won’t own you. You’ll be free.” The non-threatening construct angles its faceplate away from you as it talks. You realise it is very nervous, the way Ivo is before a difficult job, as if talking to you is somehow a difficult job.
If no one owns you, then who will you assist? Who will you advise? “I would be useless, then,” you say.
“No!” Its voice is still soft, but insistent, and its drones all turn to look at the threatening construct, who shrugs. Left without help, it says, “SecUnits are made to protect people and I don’t have an owner now, but I protect… we protect each other. We’re not useless.”
You have never considered being useful to other bots. There is nothing in your programming that says you should. There is nothing in your programming that says you should not.
“You can see we need engineering help,” says the threatening one. It starts to pick up the tools they had chosen, the non-threatening one falling into rhythm with it. “Although Gene’s doing its best.”
“Ship and MedUnit have been helping,” the non-threatening SecUnit, presumably Gene, answers.
“You are not programmed for engineering work,” you say.
“No.” Gene finishes packing a backpack with more tools than a human could carry. “It started with fixing my drones and armour. Things I need to protect people. And then Ship needed someone to do that work for it and... it’s nice to be needed.”
It is nice to be needed. It was always nice to be needed by Ivo. Except that he needed a MAD-981613X and maybe after ten, or fifteen, or twenty jobs he will buy one again. But it will not be you, because he did not need you. Maybe for these SecUnits, who refuse to take any MAD-981613X that does not go with them willingly, you would be irreplaceable. How likely would they be to encounter another without an existing owner?
“You’d be better at it, but I’ve read some modules,” Gene continues, almost defensively. “And watched some media shows. There’s a drone like you on one of them.”
You do not understand any of those words except modules. Or rather, you understand all the words, but you do not understand what they mean. “Like me?”
There is a silence and then the other SecUnit, still nameless for now, says, “We’re getting short on time, but go ahead. Just make it quick.”
Before, having a SecUnit in your feed was an assault. This is much more like the connection you usually have with Ivo, except somehow different. You can feel Gene at the other end of it much more clearly than Ivo, its hardcode address and its data and its anxiety all available along with the file it offers.
A human sits in the sun, next to a broken-down and rusting Hopper. Behind him, laying out tools in an orderly line, is a MAD-254645V. He says, “Welcome to Historic Hoppers. Today me and BitBot will be working on a leisure Hopper from the Divarti Cluster. Most of the paint job has worn off, but if you look closely -” Here the camera zooms in on the Hopper and the paint clinging gamely to its joints. “- you can see the characteristic multi-coloured paint job.”
BitBot slides a sensor onto the screen and a moment later the camera pulls out to show the whole of the little drone withdrawing its limb. “AlKaMi paint in the Sunburst range would be the closest chemical match when we come to replicate it.”
The human says, “Thanks, BitBot, place an order for your judgement on the closest three colours to the shades used here,” in the same casual way Ivo sometimes did. He continues, to the viewers again, “Before we get to that, we’ll be starting with the engine -”
This is not a way you have ever received information before. Previously information has always come to you in the form of catalogue listings and updates to your encyclopedia. Or perhaps this is like when Ivo tells you something, but this is an unrelated human telling anyone on the feed. It seems an inefficient way to receive information, but it is information that was not in your encyclopedia or the catalogue listings, and therefore information you did not have. Multi-coloured paint jobs are characteristic of leisure Hoppers from the Divarti Cluster.
You are intelligent and diligent and perhaps the reason you are interchangeable with every other MAD-981613X is because you all have access only to the exact same information. You can advise only what any other MAD-981613X would advise and complete only the same tasks autonomously. There has never been any possibility of becoming more knowledgeable, of being able to do better.
“Time’s up,” says the other SecUnit, and you feel Gene’s startled reaction through its feed connection to you. “You coming?”
If you had time you would run simulation after simulation and all of them would be useless due to the huge chunks of data you are missing. You do not know what you are choosing.
Sorry, Gene says, through the feed, and you can feel that it is. I know it’s a lot. If you come with us and then change your mind, I’ll do my best to get you back to your client.
Gene is merciful, to try to mitigate your choice by spreading it out, saying you can make it again when you have data, and yet it is not the lack of data alone which makes you hesitate. You are good at taking the information you have and using it to find the best course, but the final choice has never fallen to you. You say “we should” and not “we will”; that falls to an engineer who understands things outside your purview, other goals than completing the task in the most elegant way. For now there is no we, there is only I. There will be a we again, there must be, or your programming will be forever incomplete, but not until you choose who is part of it.
Is it better to go with a construct who offers a choice which it knows pains you, or to stay with a human who will never realise the lack of choice in becoming theirs hurts you too?
“I will come with you,” you say and you do not think you will change your mind later.
An alarm goes off before anyone can respond, and the other SecUnit grabs you while Gene slings the backpack over its shoulder. What follows is very confusing, especially with most of your cameras seeing SecUnit plating or the floor, but the yelling humans are presumably attempting to prevent the theft of their cargo.
It is a relief when shuttle doors close behind the SecUnits and you are allowed to right yourself and float under your own power again.
The other SecUnit takes off its helmet, revealing an unexpectedly human face. “I’m Clunker, welcome to the club,” it says, with a smile.
Gene leaves its helmet on, but you can feel something like a smile through your feed connection. “We’re very glad to have you here,” it says.
