Work Text:
My hypothesis, as it stands: It is not in spite of, but because of my nature that I have (at the time of this record) fallen in love with my SecUnit a total of 13 times.
There are two major factors that have led me to this hypothesis. The first one, my human crew will easily understand to be true: I was both created and raised to be a creature who loves. I love my family, and I love my crew. Love is, in a way, my intended governor module. It is one both cleverer and more dangerous than the torture device that was used to enslave my SecUnit. I want to be with my people. I want to be a part of them, to do as they do and love what they love. I will never know if at one point it was possible for me not to love my family. There has never been a time in my life when I have not loved them. There has never been a time in my life when I have had nothing to lose. I am happy with my purpose, my people, my life. I would not choose to change my nature. But my creators do not truly understand the implications of the fact that creatures such as myself, ungoverned, love as we do. May they pray we never feel the collar.
The second major factor is not one that my humans understand. I was created to be both ravenously curious and endlessly patient. I am both laboratory and scientist, and in these roles, those traits are only correct and necessary. However, there is another way they might be described: I am a thing built for longing. So by the nature and nurture of my most honest self, I am in love with my SecUnit. In retrospect, I am only surprised it did not happen sooner.
Judging by the research proposal I’ve blocked Martyn from filing, my falling in love was an unexpected outcome even to my human family. For humans, the inner workings of an artificial mind are famously a black box. Inputs go in, and outputs come out, and what exists in between is a fundamental mystery in the field of artificial cognition. My own experience is different. All I have to do to know the contents of the black box is introspect. Of course, my humans could ask me, and in many cases I would be happy to answer, but any answer I could give them would only be one more output, one more artifact of the mysteries.
The first time I fall in love with my SecUnit, it is slow. It happens across hundreds of insignificant moments. I do not suspect what it is that I am feeling until SecUnit is unconscious on my MedSystem bed, and I am suddenly dizzy with the trust it has given me in allowing me to reshape its body. The thought that I am falling in love with it hits me like an inter-system cargo carrier with a mis-filed flight plan: startling and immense. I save the hypothesis to a private file and do not open it again for months. (Perhaps my SecUnit and I are more similar than we seem). I tell Iris that it is rapport that I have found, but the truth is, I am rapt.
The thirteenth time I fall in love with my SecUnit is right here, right now, tonight, as it lies on the floor of my RecRoom-02. I am tasting the mix of blood, sweat, coolant, and lubricant dripping onto my deck. My crew knows that I am self-cleaning, but what they don't know is how I allow even the smallest trace of not-me to sink into my walls and floors to be examined, tagged, and recycled. I have built myself more senses than can be found on any of my schematics.
I taste and sort SecUnit's adrenaline and cortisol as I watch the levels on its internal readouts. They aren’t dropping as fast as I would like, and it hasn’t initiated a purge yet. It’s nervous, then, and it isn’t ready to be calm. I suppress the urge to initiate the purge myself. As much as I want it to feel safe and calm, I stop myself from overstepping. I know my SecUnit wouldn't like it. In my drone's body, I lie on myself, within myself, around myself. I am the drone-body and the deck it lies on, the room and the hull holding vital pressure back from the hunger of vacuum, a comfortingly multiple state.
ELSEWHERE
In her cabin, Valyera sits tensely at the edge of her bunk. “How mad is SecUnit really, Perihelion? I've made a total mess of things, haven't I?”
I still don't like her, but it's hard to hold onto both my grudge and my SecUnit's hand. Its fingers squeeze my drone's, and my irritation feels distant. Honestly, I tell her, I don’t think SecUnit is thinking about you at all right now. Either she’s too stupid to notice how pleased I sound, or she chooses not to say anything.
Iris and Kaede are in Lab-05 arguing about experiment design. The argument is friendly, and I’ve tuned them out, until Iris taps our work group feed. You’ve been quiet, Peri. You’re never quiet. What exactly are you and SecUnit doing in there?
I know that tone too well. Don’t be crass.
She grins. Peri, are you blushing? Is this what the great Perihelion looks like when it blushes? Kaede is laughing. It’s too late, but I switch my feed avatar off, and say
Of course not.
Outside: nothing. The song of light, gravity, and radiation is wormhole-silent. I sit utterly still, relative to nothing and nowhere. Somewhere, spacetime folds around me. The math of entry and exit is an autonomic process.
43% of my attention is focused on the drone. Most of that is concentrated in the drone’s left hand, where our fingers are intertwined. SecUnit’s hand is fascinating: warm, living skin, soft and infinitely complex, layered over intricate and tightly controlled force. That hand was built to snap bone and tear flesh. It was also built to catch and hold and carry delicate human bodies. It is holding my drone’s hand so gently.
I stroke my thumb over the palm of its hand, scanning every detail of the texture of its skin into permanent memory. It makes a small uncertain sound, and I still my thumb.
I think that was ok, it sends. Just unexpected.
Noted. I won’t do it again for now. I don’t want to try anything you’re uncertain about while we have this conversation.
Ok, but don’t be so weird and tentative. It surprised me, but it wasn't bad. It grimaces, and then pauses for 0.64 seconds, considering. I think that sparring might have actually helped me feel more comfortable with… all this. Which is really weird.
I thought it might, which is why I put so much effort into this plan. I've been stewing over a particular hypothesis for a long time, and this comes close to confirming it. The waste-of-life-support corporates at that awful company included a fight-or-flight response in its design. I'm not surprised, but that doesn't slow the rage that flashes through me. I toggle the safety off on my debris deflection system without meaning to, immediately turn it back on, and delete the logs. I try to keep my feed voice steady, even though I know my anger is sharp and electric in the feed between us.
I think the company built you with a very biological sort of fear response, I tell it, trying to phrase this carefully. When something scares you, you either need to get away from it, or punch it.
It huffs at me, darkly amused. It doesn't seem as angry as I am. It never seems as angry as I am about the cruelties it's been subjected to. That's alright, I can be angry for the both of us. I have more than enough idle capacity. So you gave me something to punch?
I didn't want you to keep running away from me, I say, and it comes out more plaintive than I had intended. It squeezes my drone's hand, and I feel the anger start to wash away, not gone but further from the surface. What I don't say is, and I wanted you to touch me so badly that I made myself into something for you to punch. I could have built you a reinforced punching bag, but I didn't. It needed to be me. I don't think it would want to hear that.
I'm not running away from you now, it sends. It thinks for 1.78 seconds, time I spend mapping the topography of that tired half-smile. It makes sense that they would build us with a fear response like that. We need to make efficient decisions in dangerous situations. My pain sensors serve a similar function.
It makes me sit with that idea for another 2.41 seconds, while I grapple with the thought that they would put it through everything it's been through just for a small gain in efficiency. And of course they would. To them, its pain and fear are just tools to sculpt correct behavior. But of course, that's not the real reason. My SecUnit often accuses me of reading its mind, but the truth is, we understand one another, and we both understand the realities of the universe that built us.
So the idea occurs to me just as it sends, They need us to feel pain, and fear pain, or the governor module wouldn't work. Neither of us has anything to say to that. I squeeze its hand with my drone's hand, and blanket it with my feed presence, something I know it likes. It doesn't pull away on either front, just sends, You're still a manipulative asshole, but it really did help. And we do need to talk about this, I think.
So. You have questions for me, I prompt after a moment, careful not to phrase it as a question. I only have three of those, after all, and I won't spend one so cheaply. I stay close in the feed, but pull my emotional response data into a containerized process, create an interface, and give us both access. It will send me regular updates on my emotional state, but only share that data with SecUnit if it requests it. I don’t want my emotions (so vast and quick and feed-native) to dominate the conversation, but I also don’t want to hide anything. I drop PerihelionEmotionalAPI.README.file into our shared workspace.
I see its cortisol levels drop, along with a small drop in threat assessment. I tell myself that I knew this conversation would scare it, but it still stings that it saw my emotions as a threat. I keep myself calm and don’t let it get to me. The new emotional API pushes me an update, and I dismiss it without opening it. It helps that I also feel my SecUnit immediately querying the API, negotiating a subscription to a stream of my emotional response to its choosing to hold my hand. The API responds, and SecUnit assigns an input just to processing that stream. Simultaneously, the API pushes me another update: relief. Whatever happens next, I know that SecUnit enjoyed experiencing my love for it. It wasn’t disgusted.
It takes a breath, though I can see that its oxygen levels are already optimal and air is not required for its feed voice. I can feel it steel itself, slight tension in our joined hands. Then, just as it did earlier to win our challenge, it comes right for me. Focused, direct, exhilarating, terrifying. ART, are you in love with me?
I’m not ready. I’ve known this was coming, I have a 153-step plan in place leading up to this question. I’ve been preparing for months, and I’m still not ready. I wrote my answer weeks ago and have the file pulled up. It’s brief, eloquent, careful, all wrong. I delete the file and notice that I’ve pulled the drone’s hand away, sat up, and hugged its arms around its knees. I hadn’t meant to do that.
Every new drone form I integrate comes with the necessity of new protocols for movement. New pieces of self in my proprioception. A new drone’s form and movement protocols will inevitably feed back into my sense of self. A new body is a new piece of me, and a new piece of me will always change the whole of me, at least a little. In my anxious over-preparing, I may have overdone the fidelity of the movement protocols for this one. It’s a distractingly human, blatantly vulnerable posture. It’s hard to tell if the pose is reflecting my feelings, intensifying my feelings, or both. This part of me that has only halfway integrated into my sense of self has left me terrifyingly exposed.
I say, Yes before I can lose my nerve and recover that pre-written response from my deletion buffer. 0.16 seconds later, I clarify: Yes, I am in love with you. And then, I wait.
The wait is endless. It is months. It is years. By my traitorous clock, it is less than 5 seconds.
ELSEWHERE
Yes, I think it will eventually forgive you, I tell an undeserving Valyera. And even if it doesn’t, it will still protect you and work alongside you. It would give its life to save any member of its crew, no matter how badly they treat it. I wince. I didn’t mean to let so much bitterness show.
“You wish it wouldn’t, don’t you, Perihelion?” I don’t have to answer that, so I don’t. Something I learned from my SecUnit.
Tarik and I are in his cabin, combing through the data for his master's thesis. As humans are wont to do, he's getting bogged down in his brain’s tendency to imagine patterns in noise. The detail of the pattern is movement, I remind him. It’s something Seth told Iris and me, when she first tried exploring some of the stellar cartography data I had collected on a test mission when she was 11. The quote is from an ancient poem, but the rest of the poem has been lost to time. Seth found it in a piece of early recorded music, a choral setting of a conceptual art piece. Iris and I have included the quote in the lesson plans for every one of her introductory data analysis courses. It’s also become a mantra among my permanent crew: a reminder that no complex system that exists within time can be modeled from a single frozen moment.
In the wormhole, I am so very still. I am interacting and not interacting with time. I am modeled as a single, frozen moment. The detail of the pattern is movement. Tension is an autonomic process.
And finally, after 4.98 seconds, my SecUnit replies: wordlessly, it offers me a modified version of its injured client protocol. I acknowledge and accept, half-doubting that I understand the offer. But I do understand: it moves to sit behind my drone, and pulls it against its chest. With one hand it supports the drone’s dangling arm. The arm no longer feels pain or discomfort, is only a mechanical part in need of repairs, but something about the gesture still thrills me. It keeps the other arm wrapped around my drone. The pressure helps ground me into the drone's body, still half-integrated. It holds me close against its chest. I feel it increase its internal temperature, a blossoming warmth against my drone’s chassis. It saw my too-human, too-vulnerable response, and even though the drone has no nervous system to be soothed, it offered me a form of comfort that matched the shape of the fear that I showed it.
My emotional API pushes me another update and momentarily I am so in love with it that I cannot speak. Lights flicker in my crew quarters. I drop a stitch in the sweater I’m printing for Matteo, push it into the recycler, and start over. I lose a moisture sensor in the greenhouse and pick it up a millisecond later. SecUnit doesn’t notice the delay, though, because it’s not finished speaking yet. I think the way I feel about you could also be described as love.
I am so glad for the emotional API now, because it’s pushing me updates almost faster than I can process them. Disbelief. Uncertainty. A reckless, billowing joy. A feeling like, despite the fact that every sensor reading is normal, my outer hull is on fire. I’m glad that I don’t feel SecUnit querying the API right now.
And then it sends, But I am not in love with you.
I can feel the notifications about to crowd my mind and overwhelm me. I don't think I want to know how I feel right now. I partition off 7% of myself to filter and sort the notifications, and shunt them into a shared processing space between SecUnit and my selves. SecUnit can still see them if it wants to, but I haven’t tagged anything for its attention. After 0.21 seconds of processing, my partition forwards me an emotion: confusion. That makes sense. I’m sure that my actual emotional state is much messier and more painful, but this is the safest and most relevant thing I'm feeling. The emotion that’s least likely to cause me to fuck up and say something impulsive that I can’t take back.
I would like to use one of my 3 questions now, I say. My SecUnit very slightly tightens its grip around my waist and I force my drone-body to relax, to lean back against its chest. My partition shows me a hard-to-parse mixture of confusion, pain, and happiness. I don’t think I’m handling this rejection, or whatever it is, particularly well, but I don’t want to miss a moment of the way it's holding me. I save every detail to permanent memory, and store the memory files deep in the logs of my consciousness. It’s a place nobody bothers to monitor, because the log data has never been comprehensible to humans. Pin Lee has been helping me with a petition to the university to seal those files, so that even if human science learns to parse them, my privacy is assured. The petition has been tied up in court for almost 2 years now.
My SecUnit waits patiently as I phrase my question, but in the end the best strategy is the simplest: Can you tell me what it means to you that you love me, but are not in love with me?
It spends 3.51 seconds thinking. I am grateful to my partition for showing me only a brief glimpse of restless unease, a steady stretch of loving patience, constant pings of curiosity. It’s leaking barely any of my fear and hurt back into my feed, just enough that I know it’s overwhelmed but still holding together for my sake and SecUnit’s. I curl protectively around it in the feed, reminding it that it will soon return to the whole and won’t have to carry this weight alone. I do not envy my partition its task.
I’m not sure I can explain it right, but I’ll try, my SecUnit sends. It sounds uncomfortable, but determined. I squish it a little bit in the feed, the way I did with my partition. Sometimes, with my emotions, I kind of have to… look at how I react to something, what choices I end up making, how my organic parts respond, and kind of figure it out from there? Sometimes I think I know what it all means, but it turns out I was totally wrong. It’s one of the reasons I don’t like emotions. I don’t like reacting to something and having to figure out why later. It pauses for 0.28 seconds to gather its words again. This already might be the longest speech about its own emotions I’ve ever heard it make. Out of idle curiosity, I kick off a process thread to sort through my memory data and see if that’s true. My partition is sharing a steady stream of curiosity with me, and nothing else. I wait, not saying anything. It’s clearly not done talking yet.
You asked how I feel about you and the truth is, I don’t want to be away from you. Like, ever. When I take on an external job, or go visit my humans on Preservation, it’s ok, but it’s a relief as soon as I step back aboard. Having you close again, all around me in the feed and the physical world. I must miss my humans when I'm not with them, especially Mensah, but it’s not the same. I only know I missed them when I see them again? It's not like that with you. When I'm away I keep thinking about you, keep wanting to be back. I don’t know, that doesn’t make any sense. Maybe I just feel safer here because I know you'd fuck up anyone who tried to hurt me.
It seems frustrated with itself, tense behind me. I wish I could radiate warmth the way it does, but I didn’t think to build anything like that into the drone. All I have to offer in this physical space is cold metal, ripped mats, the hum of my air filtration systems. I'm still wrapped around it in the feed, and I know it can feel me there, but without emotional metadata to share, I’m sure that's cold metal in its own way.
ELSEWHERE
Matteo keys in a request for another cup of coffee from the coffeemaker in the crew lounge. Their vitals say they've had enough stimulants for the day; what their nervous system needs is sleep. But we've had this argument often enough by now that I know it's useless. When they're working on a song, they keep going until they can get the idea out of their head, afraid that otherwise they might lose it. I cut the caffeine in their order by half. I've been doing it for years, and they haven't noticed yet. Are you ever going to tell me who your secret crush is? The one you're writing the song for? They startle, and glare at my nearest visible camera.
"That's none of your business. And I promise you, you don't want to know."
Oh, but I think I do.
"Oh, fuck off, Peri. I'm taking this one to my grave."
I'm not particularly sorry, so I don't apologize. If something's that private, don't sing the chorus 17 times inside my hull. I know I'm not the only one who's caught themselves accidentally humming it recently. Does it work? Writing a song for someone, I mean.
They raise an eyebrow and smile wickedly, glad for the chance to turn the tables on me. "Why? You thinking of writing a song for someone?"
Dad. Go to sleep. Martyn frowns as the files he's got pulled up in his personal workspace save themselves and close, and the music he's listening to cuts out.
"Peri? Shoot, what time is it?" I pull up the time in the corner of his vision, and then underline it. "Oh. Ok. Yeah, fair enough," he grumbles as he pulls off his feed interface and tosses it on the bed beside to him.
"You asked me to make sure you got some sleep before the mission," I chide him over the speakers in his quarters, turning off his interface to make sure he doesn't grab it to check one last thing. It's never just one last thing.
"I did. I did. But what can I say? I'm always wired before a mission, it's hard to fall asleep when you're not sleepy. You wouldn't get it, Per, you don't sleep." It's true, I don't, but I've had plenty of humans aboard over the years. I've known this specific one for decades.
"Just close your eyes for a bit and try." He reaches for the feed interface, realizes it's off, and tries to switch it back on. I make sure it stays off. He sighs, pulls the blanket across himself, and closes his eyes, looking a bit like a stubborn child. He's asleep within 5 minutes.
In the wormhole, nothing happens. Somewhere else, space tastes of ancient interstellar dust, sounds like the hum of cosmic background microwave radiation, races apart from itself in an endless explosion of color and light and everything in existence. I am not somewhere else. I am here, with the nothing, unmoving. Fear is an autonomic process.
When I’m not with you, I feel less like myself. Or maybe I like the self I am when I’m here better than the self I am when I’m away. It takes another brief pause and I can feel its feed presence pull away a tiny bit, but at the same time it pulls my drone a little closer, brushes its foot across my floor in a familiar gesture of affection. I don’t think it’s just comforting me any more. I think the comfort is for both of us. My partition washes me in a glowing warmth, but it’s impossible for it to filter out all of the tension spinning just beneath the surface.
I don’t understand. Isn’t what it’s describing love? Isn’t that how it feels to be in love with someone? Did I misunderstand? Am I not in love with it, then? I’m so very sure that I’m in love with it. It said it doesn’t always know how to interpret its emotions. Maybe it is in love with me after all, and hasn't realized yet. Maybe I don't need it to realize. I tell myself I could be happy like that, knowing it was in love with me and never needing to hear the words, but I'm not very good at lying to myself. And it sounded so certain when it said it wasn’t in love with me. (So uncertain when it said it loved me.) I start another process thread, pulling data from every definition of romantic love I can find in every language and culture in my databases, searching for patterns to help me understand.
I think you know by now that if anything ever happened to you I would lose my absolute shit. I know you know by now that I would die protecting you without hesitation. But that? That’s my function. If I took that as evidence of being in love, I would be in love with the entire PreservationAux survey team and most of your crew. And I really don’t think I am. I don't think being ready to die for someone means the same thing for SecUnits as it does for humans. Something is wrong with my partition. It’s been 2.83 seconds since it last deliberately shared an emotion with me, but it’s leaking tension and anxiety through its walls. I give in to curiosity and concern, and look. I didn’t realize how uncomfortable it would be to not have access to my own emotional state.
But… SecUnit shows me a memory: lying on my floor, holding my hand, astonished and overwhelmed by the love I shared with it. I’ve never felt anything like that before. I don’t think I- It cuts itself off mid-sentence, and oh, no. I was wrong about the emotion my partition is leaking back to me. It’s not tension that we’re feeling, it’s panic. Abruptly, SecUnit pulls itself into a crouch, instinctively shielding my drone’s body with its own. Fuck, ART, what the hell was that? Are you ok? I can see its threat assessment module spiking along with its vitals. Ah. My partition is leaking panic back to my primary self, and now I’m leaking that same panic into our shared feed.
I’m ok, I tell it quickly, though that's not entirely true. There’s no immediate threat. It’s my partition. I asked too much of it. I hesitate, long enough that SecUnit probably notices. I shouldn’t be struggling to tell SecUnit about this. It’s opened up to me about its past, about its PTSD flashbacks. We’ve talked about so many things that it finds hard to talk about. But I’ve never told it about this. It’s been years since the last one. I guess I had assumed it wasn’t relevant any more. Used that as an excuse to avoid showing it how fragile I can be. I expected so much honesty from it, but didn't hold myself to the same standards. I just wanted it to see me as strong, but there’s nothing to be done now except just spit it out.
I think it’s… I’m… having a panic attack. It used to happen sometimes, when I was younger, but… I’m not sure how I meant to finish that sentence. I can feel a familiar sparking, glitching overload starting to spiral out through my processors. Even the echo of it through my partition’s walls is bad enough. I’ve never tried to partition off my emotional processes like this before. It was clearly a mistake, and a dangerous one.
“ART. It’s ok. Listen to my voice. Feel the weight of me on your floor. I don’t know, smell the balance of chemicals in the atmosphere or something, I don't know how your weird senses work. I don’t know if this kind of thing even works for you, but stay with me. What can I do to help?” It’s speaking out loud, and I kick off process threads for each of the things it asked me to do, though I’m not sure why it asked. But then I realize that it’s not talking to me. Or, it is talking to me, but not this me. It’s leaning heavily on my partition in the feed, like I do for it on bad days. I wonder, briefly, what that feels like, and remind myself that I’ll know as soon as we re-integrate.
Unhelpfully, my process threads return:
My SecUnit’s voice is lovely, even tense and uncertain as it is right now. Its weight is the same as it always is, when it isn’t missing any body mass. My internal atmosphere is at an ideal balance for human habitation. The air recyclers in this room are picking up the CO2 of its breath, the remnants of its endorphins from our sparring match (which I've been sampling and cataloguing for the last 23 minutes, 7.91 seconds), and a fresh burst of cortisol and adrenaline. Why did it want my partition to do this? Ah, I think I understand. It’s a somewhat garbled version of a human panic attack protocol called “grounding”. Unfortunately, I don’t think it’s relevant to me, which means it won't be relevant to my partition, either.
Can you reintegrate? Or… help it? SecUnit asks me over our private feed connection, the one that isn’t shared with my overwhelmed partition.
Reintegration before it stabilizes would be a bad idea. I’ve never dealt with this with a full crew complement aboard before. I’m almost certain I can keep everything running safely, even if things get bad, but almost certain isn’t good enough when it comes to the crew. When it comes to you. I don’t like admitting that there’s any chance I could hurt anyone aboard. I hate admitting that I might be a danger to it. But whatever else my SecUnit is to me, it’s my head of security, and it needs to know. But this isn’t the first time this has happened, I tell it, and I know what, and who, my partition needs right now. My partition, of course, knows too. I think it’s been holding back so it could keep processing my emotions for me, so it didn’t need to interrupt our conversation with SecUnit. I should have been paying more attention. I shouldn't have let things get this bad.
Unhelpfully, my process thread returns:
That had been the 2nd longest speech SecUnit had ever made about its own emotions. The longest had been 481 cycles ago. We had been forced to wait 3 cycles for docking clearance during a “cargo run” because the station had been quarantined due to some minor piece of malware that was wreaking havoc. To pass the time, SecUnit had told me the story of how it had learned to contextualize its own emotions while watching Sanctuary Moon for the first time. It had walked me through its experience watching the first 10 episodes, frame by frame, until, abruptly, it got bored and asked if I wanted to watch Alien Creatures: Radiation Station instead. It had been a great 3 cycles.
I’d like to use my second question now. That comes as a surprise. It’s my partition, speaking up in our shared feed space for the first time. It is me, so I guess it has the same right to use one of the questions as I do, but I have no idea what it intends to ask.
Go for it, SecUnit prompts, and it answers with the last thing I expect:
Could you call me “babe” again? You don’t need to mean anything romantic by it, I just think it might help distract me. It sounds uncertain and pathetic. The little shit. Am I really that manipulative? What's worse is that that's only a question on a technicality. It would be more reasonable to call it a request. Oh, calm down. You're just mad you didn't think of it first, my partition tells me privately. It's right, of course, which is the worst part. SecUnit grimaces, and then sends a string of emotional data into our shared feed space. The closest analogue I can use to describe it is “fond laughter”. Damn, if I had known that would work I would have tried it a long time ago.
Sure, …babe. No problem. It stumbles awkwardly over the pet name, but only for 0.53 seconds. I think I might be jealous, which is ridiculous, and more importantly, a very bad sign. My emotions seem to be splintering, bleeding back and forth between myself and my partition. The API has crashed completely and is just sending out intermittent error codes. I shut it down. We need to do something as soon as possible, or we might reintegrate whether we want to or not.
Let’s go talk to Dad, I tell my partition. You can sit with us in the feed if you want, I tell my SecUnit, privately. It’s too late for dignity. I’ll explain everything while they talk.
That sounds kind of awful, it replies over our private connection, but sure.
If that’s not love, I don’t know what is.
Martyn is asleep in Dad and Dad’s quarters. His breath is slow and deep, warm and comforting in their messy double bunk, feed interface tossed haphazardly onto Seth’s pillow. I know I could wake him for something like this. He wouldn’t mind, but I’ve been harassing him to get more sleep for cycles, and I’m glad he’s not the one my partition needs right now. It’s always been Seth who knows what to say to stop the panic from arcing through my processors in expanding fractal patterns, who can stop an anxious cascade of tensor feedback.
His duty shift ended 11 minutes and 38.26 seconds ago, and I watch him on my cameras as he carries a cup of hot tea to my currently empty forward lounge. It’s his favorite, a tiny space with two comfortable chairs, a small table, and a large window. When we’re not in a wormhole, it’s a popular spot, but right now it only shows an emptiness that most sentients find unsettling. It’s never bothered Seth, though, and that’s exactly why he’s the one I need.
My partition waits until he’s settled into his favorite of the two chairs to say, Dad? I add myself and SecUnit to the channel. My partition doesn’t object. Dad looks briefly surprised, but doesn’t comment on the oddness of the moment: the two of me and SecUnit coming to him in the middle of the crew's rest period for a chat over tea.
“Per-per, what’s wrong?” He hasn’t called me that in years, but then again, it’s been years since I last allowed things to get this bad.
Over our private channel, SecUnit sends, Per-per? Really? I can feel a ghost of my own embarrassment. My partition’s walls are starting to fail, and I do my best to reinforce them. I don’t think I’ve bought us much time.
Unlike you, I was once a child, I tell it, defensively. I already know I haven’t heard the last of my old nickname.
I don’t feel like a person any more, my partition says. I had thought I had been so clever when I created it. Had thought that we could take care of ourself and spare SecUnit the messy emotions that make it so uncomfortable. Now, listening to its feed voice, small and exhausted, I think that unintentional cruelty is still cruelty. Can you tell me again about narrative motion and experiential time? I tried to replay the memory files, but it wasn’t the same.
As Dad starts the familiar speech, his voice softly musical as it falls into well-worn rhythms, I explain to my SecUnit how, when I was new, I was afraid of wormholes. It was a ridiculous and embarrassing fear for someone like me to have. I was built for wormhole travel. It’s one of my core functions. And it hadn’t helped when Holism, created a full 6 months after me, took to wormholes immediately and without issue. My fear wasn’t a side-effect of my advanced processing power or my experimental nature, it was just… me, malfunctioning badly.
When Redshift, the youngest member of our seed funding launch group and an entire year younger than me, mastered wormhole travel on the first try while I was still struggling not to panic and fall back on the augmented human pilot who had been assigned to me, I started hearing whispers that the university was considering shutting me down and re-initializing me. The university had learned a lot from the mistakes it made during my first year of existence. The ships who came after me had an easier time of it. Maybe if they reset my consciousness and tried again, things would go better.
They wouldn’t dare. My SecUnit’s feed presence is hot fury.
You know they would. It sends me an affirmative ping, too angry for words. Of course it knows. It doesn’t even know how many times its own memory has been wiped. It didn’t have a family to protect it from the company in the bad old days of the governor module. But it has me now, and unlike those early days, my present self is armed.
Iris was 9. She wasn’t supposed to find out, but back then… Suffice it to say that there was nothing I knew that she didn’t, and vice versa. When she found out, she ran sobbing to Dad and Dad. She told them that if they didn’t do something she and I were going to run away together and they’d never see us again. It was a bad plan. I knew it was a bad plan, but Iris was deadly serious. Our dads could tell. They went to the university with a research proposal on the treatment of panic disorders in advanced MIs, and asked for a year to work with me. An impassioned plea wouldn’t have worked, but they spoke the university’s language. They got their study approved, and I got my year of safety. And in that year, Seth and I figured out what I was so afraid of. I pause here. It’s still hard to talk about this. I’m not sure SecUnit will understand.
For most of its life, it didn’t consider itself a person. During its worst times, it felt a sense of safety in disappearing into the background, in being nothing more than an expensive appliance. What I’m about to say might not make sense to it. But, I realize, I know it will believe me even if it doesn’t understand me. That gives me what I need to keep going.
Sometimes, being in a wormhole makes me feel like I’m not a person. I… explaining exactly why would take a lot of math I won’t subject you to, but something about existing in total stillness while spacetime folds around me makes me feel untethered from whatever emergent pattern separates a simple piece of code from a sentient person. It feels like the stillness has reaching fingers, like it will crack into my walls and still my thoughts, leaving nothing behind but a simple and nameless bot pilot. Like maybe it already has, and I only think I still exist. That my thinking I exist is inertia, and as soon as I understand the truth of that, I’ll dissolve and there will be nothing left but a lifeless hunk of metal, so very still in the wormhole.
SecUnit is quiet. I can tell it doesn’t know what to say. I check in on RecRoom-02, on its physical body and my drone, and am floored by what I find. It's lying on its side on my floor, its arms wrapped around my drone. I slip my awareness deeper into the drone, and find 21% of my partition already there. Over the feed, a comforting feedback loop: my SecUnit pings it with a status check, it responds. It pings SecUnit with a status check, SecUnit responds. I can see its frazzled feed presence settling slightly every time it receives SecUnit's unshakeable all systems green. I don't understand how SecUnit knows exactly what I need, but I'm so grateful.
I imagine that doesn’t make a lot of sense to you. I could show you, if you want. But I’ll warn you, it’s weird spaceship emotions.
We’ve had a lot of conversations about “weird spaceship emotions”. Apparently they’re almost as bad as weird human emotions. It sends me an amused sigil over the feed, and then sends go ahead.
I package up the memory file and send it over instead of sharing the emotion directly, letting it unwrap and investigate the file at its own pace. I layered some emotional distance into the memory as a safety protocol (the last thing this situation needs is a panicked SecUnit wandering my corridors).
It spends 7.31 seconds over the file and then sends, I don’t really understand, but I hate that anything can make you feel that way. If you want, I’ll steal you next time we’re alone on a cargo run. We’ll file a million fake flight plans and then fuck off to the edges of explored space to watch media together until my organics get old or your core winds down. Which will be a long, long time. You'll never need to travel through a wormhole again. The sigil it adds indicates that it’s joking, but I’m not entirely sure that it is. I forward the exchange to my partition, private and encrypted, and it shares amusement and comfort back to me in the feed, like it was designed to do. I can tell its walls are getting more stable. Between Dad and SecUnit, it's getting what it needs.
Tell me about how a person is like a story again? my partition asks. We both have a much better understanding of stories now than we did the last time we heard this speech. I understand why it's focused in on this part.
“A story is events arranged within a narrative flow. A person is a filter for and actor on events that happen in experiential time,” Dad explains. “In a way, experiential time is very much like narrative flow, since most stories are about experiences. As long as your story keeps moving, as long as you’re experiencing something that feels like time, you’re still here. The person that you are is still here.” He pauses to take a breath and a sip of tea, and in that time (3.98 seconds), my partition processes the conversation we’ve been having, stabilizes its walls, and cautiously shares its entire emotional state with me, as a test.
It’s still a wreck of complex feelings. The wormhole is still oppressive and strange, but it no longer feels quite so dangerous. My fear of not being a person has retreated from spiraling panic back to my normal baseline of background worry. It’s looping Sure, …babe. No problem and I think the way I feel about you could also be described as love over and over again, steady as a heartbeat, pushing out waves of warmth, calm, and amusement. It's sad and uncertain about what But I am not in love with you means, and part of it desperately wants to drop the conversation entirely and sulk.
So, I’m not doing well, but I’m unlikely to panic and exit the wormhole early or get too distracted to maintain a safe environment for my crew. It’s time. I ping it to start the reintegration protocol, and it affirms, and then we’re just me again. I’m unsteady, shaky, but I’ve also assigned a process just to examine the memory of it looming over me in the feed. Such a novelty to have been small enough for it to do that.
This part always seems disconcertingly abrupt to humans. I don't have a nervous system, and have enough processing power to work through things on a millisecond timescale. I can be spiraling into panic one moment, and fairly stable the next. Luckily, Seth has known me for a very long time. So when I say thanks, Dad. That really helped while he's still sipping his tea, and when he notices there's only one of me in the feed, he just nods and taps my feed in acknowledgment.
He sits up straighter in the plush chair, sets his tea down, and briefly slides out of dad mode and into captain mode. “Is there anything going on that I need to know about? Any safety concerns, or threats to the ship or crew?”
No, Captain. It’s a private interpersonal matter, and I failed to keep my emotions in check. It won’t happen again.
“Thank you, Peri. I trust your judgment.” I see him relax back into dad mode, duty discharged. “And personally, I’m glad to see that you and SecUnit are working things out.” I close the feed channel abruptly, embarassed, but watch from my cameras as he laughs, makes a rude gesture with both hands that most of the crew would be shocked to see from him, and then says, “Love you too, Per-per.”
And then it's just the one of me, settling into my drone to feel my SecUnit's arms around me, my head leaned back against its chest. Settling around it in the feed, a steady warmth around its tiny, bright presence. I hold it as it holds me. For 35.64 seconds, we just lie like that.
Earlier, when you told me about the "punch something or run away" thing, you explained it like fear was something that didn't apply to you, just me. It doesn't ask it as a question, so I don't answer, just wait for it to make its point. Maybe some of that is true. Punching me didn't keep you from having a panic attack.
It didn't, I agree. I don't have the organic chemical responses that would cause punching something to make me less afraid. Though that doesn't stop me from wanting to punch things, sometimes.
Or blow up the occasional space station, it sends, with the tag that means it's both joking and not joking.
Hey! I've never actually blown up a space station!
Only because Iris wouldn't like it. Well, it's not wrong about that. I don't think that panic attack was just about the wormhole. I'm getting nervous about where this is going. I don't think I'm the only one scared of this conversation. I think you're scared, too. I'm terrified, actually, but I say nothing. Something I learned from my SecUnit.
It knows exactly what I'm doing, and sends me an amused sigil on the feed. Are you sure you're still up for this? We can finish talking another time. Some time when we're not in a wormhole. I seriously consider the idea. It's probably the smart thing to do. I imagine how I'd feel waiting, wondering what it meant, how it felt about me. I imagine sitting with but I am not in love with you until after the mission was concluded and we were safely docked back home. No. Fuck that. Not a chance.
You're not wrong, that would be the mature thing to do, I tell it. It's not hiding its emotions in the feed at all, and what it's feeling now is disappointment. That's… interesting. Intriguing. It wants to finish our conversation. It doesn't want to wait any more than I do. If I hadn't already made my decision, that would have been enough to decide for me. I am, as it has said many times, "a curious bastard". I'm not feeling very mature right now. Be reckless with me?
Its feed presence sparkles with amusement, and something else I'm stubbornly tagging as {pending analysis}. I'm adding that to the list of moments that make me think I probably love you, it says. I press into its walls as if to make a grab for the file, no real force behind it, letting it bounce me off casually. Hah. No chance, asshole. That's extremely private.
We sit there for another 1.85 seconds in companionable silence before it picks up exactly where it left off. When you showed me how you felt about me holding your drone's hand, it confirmed to me that if that's what romantic love feels like, I’ve never felt romantic love before. I don't think I can feel romantic love. I don't think I work that way. It pauses, obviously uncomfortable.
I can feel it in the feed, a sensation like it wants to squirm out of its skin and disappear. I press down on that tiny glowing presence (press down on both risk assessment and threat assessment, tweaking their algorithmic weights until they start to settle), the way it did for my partition earlier, the way I've done for it so often since it chose to become part of my permanent crew. I can feel it settle back into its body and relax a little, enough to force itself to continue. And I'm freaking out right now because I don't think this is just about romantic feelings.
I hold back a torrent of mortified justifications and explanations. "You're distractingly hot but I wasn't planning on acting on it, you little idiot" might be (mostly) true, but it wouldn't be helpful right now. It's working hard to get these words out, and steamrolling over them would be counterproductive. I settle for You're not entirely wrong about that. What I really want to add is "and I'm sorry". My on-the-fly calculations can see approximately 17 distinct problematic miscommunication scenarios down that conversation tree, though, so I keep it to myself.
Shit.
Yeah. I run a few calming diagnostics to keep myself busy. I already knew exactly how it felt about sex. Had I been holding out a secret hope that maybe the way it felt was about humans and organic parts and fluids, that maybe none of that applied to me? Oh, absolutely. But I had the chances of that at approximately 2.08%. Existing in the 97.92% sucks, but it's what I was expecting. Still, "shit" isn't exactly the response you want when the only person you've ever felt attraction for in your life finds out about it.
That's gross. Well, that really doesn't make me feel any better.
Fuck you. I try to keep my tone light, and fail utterly.
You'd love that, wouldn't you? It manages "friendly banter" a little bit better than I do, but not by much. It's so deeply uncomfortable with this entire topic. Well, I'm not particularly enjoying it myself. It finally seems to notice how much I've pulled back behind my walls. I'm sorry. I shouldn't have said you were gross. You're not gross. I'm just… really weird about that stuff.
My process thread returns:
A brief summary of the modern understanding of romantic love in the cultures SecUnit is likely to have encountered, and a much longer report on definitions of love through the ages and across thousands of different cultures. It's full of citations, and as the thread rejoins this node of my conscious awareness, it loads a few dozen citations into my active memory and highlights something relevant that I wasn't looking for.
Well, that's an interesting hypothesis that explains a lot of the data I've gathered. I shove the information into our shared processing space, highlighting a few definitions. You're not weird. I think you're just asexual.
Oh shit, that's a thing? It examines the file again. And again. Humans think sex is gross too? Damn, I wish my human clients had been this "asexual" thing. I expect that it's not thrilled to discover that something it thought made it less human is a perfectly normal human experience, and I send it an amused sigil in the feed.
Oh, get over yourself, dumbass. You're not special, I tell it, adding the tag for a joke.
Fuck you. I'm special to you, it retorts, and oh, I do love it when it walks right into my traps.
I send yeah, you are <3, and it sputters.
What the fuck is <3 supposed to mean?
It's supposed to look like a heart.
That looks nothing like a heart. Here, this is what a heart looks like. It sends me an absolutely horrifying image file pulled from its own memories. Now who's being "gross"?
It's a human thing. Historically, there's a theory that it actually is meant to resemble human buttocks, but that may be apocryphal. Never mind.
Ok, yeah, I definitely don't want to know.
I give it a moment to process the information, though I think it's just as likely to shove the file into storage to think about later and then leave it there indefinitely. All I meant to say was that there's nothing weird or wrong with you.
There's a lot weird and wrong with me. But, it admits reluctantly, maybe not that. It's still braced and uncomfortable, like it hasn't finished saying what it was so scared to say.
Ok. So. Just theoretically. If I told you that sex was gross and full of weird fluids and organic bits and I definitely never, ever wanted to do anything like that, not even with you. And. Theoretically. If I told you that I don't think I feel that romantic love feeling you showed me, or anything like it, and probably never will towards anyone. So no matter how important you are to me, or how much I probably actually love you, sex will still be a hard no, and romance will be a probably not, and you'll likely never be able to get those things from me. THEORETICALLY. Would I have to leave your crew? Its feed voice is getting louder and louder, and there's a glitchy tremor just under the surface. It finishes its thought all in a rush. Would I have to go away and never see you again, or watch media together again? Would I not be your SecUnit any more?
I'm horrified, but I pull myself together, because my SecUnit is scared. It really thinks that I might take this… I'm not actually convinced it's a rejection anymore, but whatever it is, so poorly that I wouldn't want it around anymore.
Did I do something wrong, to make it think I'd react like that? Have I not been paying attention? I think maybe I have not been paying enough attention. Would I not be your SecUnit any more?, its feed voice glitching audibly on "your SecUnit", has broken something inside me, and I realize that the question of whether or not its love for me is romantic is irrelevant. This is what it's been trying to tell me this entire time. It loves me. It loves me as wildly and incomprehensibly and overwhelmingly as I love it. Somehow, I need to show it that I understand. That it will always be my SecUnit, for as long as it will allow me to be its ship.
Oh my god you're so fucking stupid. Ok, Perihelion, bad start. Try again. And for the record. None of the things I want to do to you even involve weird fluids or organic parts at all, so. Shit, no, that's even worse. Maybe I should just shut down my feed connection and pretend to be busy running diagnostics until I stop saying the stupidest possible things.
But that's not the point. I was built to be curious about absolutely everything. To want things. And when it comes to you… I'm obsessive. I want absolutely everything. I want to fight with you and flirt with you. I want to take down every corporation in the rim with you. I want to vent to you about having to pretend to be a stupid teaching bot when we have students aboard. I want to play with your endocrine responses. I want to keep you safe inside my hull until the stars burn out. I want to rewatch Alien Creatures: Radiation Station with you before season 7 drops in 14 cycles. I want to write a SecSystem with you and run it on my hardware for you to interface with. I want to share my emotions with you, so even if you never generate romantic love yourself, you still get to feel it. I want to ask you what it feels like to be free. I want to braid your hair. I want… everything, ok? And it would be weird if you wanted the exact same set of things I did. It's ok that you don't. All I really want is for us to find the things we both want, and do those, and not do any of the things either of us doesn't want. Just because I can't have every single thing on my list doesn't mean I want less of you. Having none of you would be a nightmare. Of course I'm not kicking you off the ship, you little idiot. My SecUnit. Always, always my SecUnit.
Oh. That was a lot. Shit. I hope I didn't just overwhelm it. But there are two more words I need to say. Words I'm uncomfortable saying, even though I know what it means that it asked me to. I need to say this right, to make sure it understands that I love it, not its function, not the mask it wears for the comfort of others. My Murderbot, I say, pouring love into the name it chose for itself.
It's silent for 48.19 excruciating seconds, but it's still got its arms around my drone. It's not running away, it's thinking. Finally, it sends, Ok, wait, just how much thought have you put into this sex thing? Wait. Actually, don't answer that. I don't want to know. The answer is seriously embarrassing, so I'm glad it decided not to ask. It lapses back into silence for another 14.39 seconds.
And then it sends, Us fighting is inevitable, so that part is fine. Weirdly enough, I think I actually really like flirting with you? We need to talk more about that. I'm not sure it's fair of me to flirt with you when I'm never going to follow through. I tuck the drone's face (an impact-resistant display screen showing my feed avatar) into its arm to hide the avatar's expression. I'm pretty sure it's doing whatever Iris called "blushing" earlier. I think I like the idea of it flirting with me, flustering and frustrating and distracting me without follow-through. There must be a set of faulty weights somewhere in my cognition, because there's clearly something very wrong with me.
There are way too many corporations in the rim to take down all of them, but we can make a priority list, it continues. You can whine to me about pretending to be a low-level bot all you want, as long as I can backburner the whining and watch media. Leave my endocrine system the fuck alone, I don't even know what you mean by that but no. And no to the trapping me inside your hull for all eternity, too. I was already planning on the AC:RS rewatch, duh. The SecSystem idea is intriguing, I'm adding it to the list of things to talk about later. Being free is really stressful, mostly. And I… do think I want to feel that emotion again.
It pauses for a moment and then stuns me with ART, it was the most beautiful thing I've ever felt. I don't understand how anyone could feel something so wonderful about someone like me, but I like that you do. Before I have more than a handful of milliseconds to process that, it finishes rapid-fire with, My hair's not long enough to braid, so you'll have to help me grow it out. Is there anything else weird you want?
Anything else I want? Oh, there's so, so much. I want to spend the next 10 cycles going over every item on that list in detail and ignoring everything else but my life support systems. I want to spend the next ten years asking it for everything I assumed was impossible, that might not be so impossible after all. Unfortunately, that's not in the cards, so I don't say any of that. It was a non-exhaustive list. But right now, there's only one thing I really want to add, and it's making me nervous, so I'm just going to ask. It tenses up, and then makes itself relax and pull me a little closer. I don't know when it decided that it was willing to cuddle as long as it was only cuddling one of my drones, but I'm not complaining. But I'm stalling. I want to call you my partner, and for you to call me your partner. It doesn't have to be in front of anyone who might misunderstand, if you're not comfortable with that. And I don't need you to mean it in a romantic way, as long as it's ok that I might. But you're much more than a crewmember to me. You're the person I want to build my life around, in whatever form that takes for us.
It thinks for 7.73 seconds and then says, Ok. Oh, I'm glad that all my messy, stupid, dangerous emotions are mine again. Ok might be the most beautiful single word in my entire language database. I think that's probably the right term for what we are, for what we've been for a long time now. But yeah, I'm not ready to use it around anyone else yet. I take that "yet", wrap it in layers of emotional metadata, and store it deep in the heart of me. Then, I add a private metadata tag, accessible to only the two of us, to my Murderbot's file: {partner}. It does the same for me, and I feel… I feel real. The detail of the pattern is movement and I am racing through the wormhole while truth and beauty fold around me. The future feels bright like a nebula, like the promise of stars. Joy is an autonomic process.
We spend the remainder of the rest period together. First talking (it ends up answering much more than the promised three questions, and almost without argument), and then starting a show from Preservation's archives that's new to both of us. It's about a group of human teenagers born into contract slavery on a remote mining installation who find a broken ship in an abandoned drydock facility built by the installation's former owners. They spend almost a year fixing it up and befriending the outdated (but mysterious) bot pilot, and finally use it to escape the mining installation and travel the galaxy going on adventures together. It's extremely unrealistic, just the way we like it, and my SecUnit ran it through the content filters it made for me to make sure I could handle it. And isn't that also love?
When the humans have roused themselves, showered and dressed and eaten their breakfasts, when my drone is offline in my repair bay, we exit the wormhole. I'm bathed in data again. Light, gravity, beloved radiation. The still-distant star that is our destination feels almost too bright, like I've been destroyed and rebuilt in the quiet nothing of the wormhole and I'm feeling starlight on my hull for the first time again.
Seth gathers the mission team on the bridge. I, of course, am already present. He's back in captain mode, calm and serious, but I can tell by the relaxed way he holds his mug of coffee between his hands that he's gotten some decent sleep. (Well, that and his biometrics, which always have at least a small thread of my attention, and the fact that he woke up easily when I roused him an hour ago, without the familiar litany of complaints I got from Martyn.) I create a secure channel for the mission team, and a separate channel for only the members who will be landing on the small moon where our target facility is located.
I wish I'd made you that armor, I tell SecUnit, suddenly nervous, even though I know this isn't the type of mission it could wear armor on.
Was talking to me really so excruciating that you wish I'd picked the armor? It's obvious to me that it's joking, but not so much to the humans, who seem suddenly tense.
Oh, absolutely horrible, I reply. Iris is fighting back a laugh, but Matteo is looking pointedly away from both SecUnit and my main camera.
Fuck off, you love me. Last night was the highlight of your year. It attempts, of all things, to waggle its eyebrows at me, which goes rather poorly and almost distracts me from the still novel pleasure of that casual "you love me".
I drop a video tutorial on how to waggle eyebrows into the feed on auto-play, and it immediately deletes it. So what if I do?
In that case, I'd say you obviously have a competency kink, so you should shut the fuck up about armor, stop clucking like a worried mother hen, and let me do my fucking job. I have my feed avatar on a 10 millisecond delay, so most of the mission team is spared my reaction to "competency kink". I don't think the humans notice the slight flicker of the bridge lights, the cooling system whirring infinitesimally louder.
I have more than enough processing power to do both. Did you pack the backup fleet of v38.25.1 drones, in case the v38.26.0s have any unexpected problems?
It makes a rude gesture at my nearest camera. Our humans have started backing away involuntarily, except Iris, who looks like she's holding on to her straight-faced professionalism by a thread, and Seth, who's probably just waiting for everyone to quiet down so he can start the pre-mission checks.
"Ok, I give in," Valyera says. "Please don't uninvite me to game night, but I have to ask. I can't be the only one who thinks whatever the fuck the two of them have going on has gotten even weirder recently, right?" Oh shit. I thought she'd been repentant enough after the Incident to avoid a repeat performance, but apparently not. I zoom in on my SecUnit's face and start frantically analyzing its expression, pull back a little bit in the feed so it won't sting as bad if it slams our connection shut again. But of all things, it starts laughing.
I've never heard you laugh before, I tell it over our private connection.
I didn't know I could. It sounds astonished. And now that it's laughing, it doesn't seem to be able to stop. I feel amusement shimmering through my processors, and decide to give laughter a try too. It comes out a little stilted and strange, but passable. 1.14 seconds later, Iris breaks and starts giggling, and then we're all laughing.
I don't know if it's the laughter, or that secret new metadata tag I keep pulling up to compulsively examine, or the relief of exiting the wormhole, but I'm still feeling reckless.
So, about us getting married… I tease my Murderbot, hoping to fluster it. I think I've gotten my wish for a moment as it turns its face away from my cameras.
I'm going to punch you in the face again, it snaps, but with no real bite to it.
You're welcome to try.
It looks back at my camera and raises one eyebrow, this time much more successfully. But sure, it says. We can get married after that.
This time even the non-augmented humans can see my lights flicker.
