"Are you alright, Ms. Amelia? You seem… troubled," Amy approached her creator with inquisitive eyes. They scanned over the other woman's slouched form and instantly detected an increased heart rate, staggered breathing, and elevated temperature and blood pressure, especially around the face. All of those signs pointed to one thing: Amelia was crying. Amy didn't even need to see her face to know it was true…
"I'm fine," Amelia tried to lie, but Amy was too perceptive. She heard the hollow tone in Amelia's voice, which gave away her true feelings. And Amy could detect the tension in Amelia's vocal cords, like guitar strings pulled so tight they were about to snap, an apt metaphor for Amelia herself.
"You always say that whenever you aren't," Amy said knowingly. For a moment more, there was only silence. Then Amelia finally conceded.
"You're right… As always…" a tone of dry wit entered Amelia's voice and that was what Amy took as her cue to draw closer.
Amelia was currently sitting alone at her kitchen table, nothing in front of her. Amy took a careful seat across from her. Once seated, Amy folded her hands and placed them on the table in front of herself. Next, she leaned forward and tilted her head in concern, finishing it off by moving her eyebrows so that her concern became even more apparent. Every single move she'd just made was a carefully calculated decision meant to maximize Amelia's sense of trust in her. Amy knew all too well the power of body language in human interaction, and she had made sure that every action she took gave off a vibe of openness, warmth, trust, acceptance and love.
Normally, Amy wouldn't have pushed Amelia for details when it was this apparent that Amelia didn't want to talk, but Amy knew that sometimes, humans needed to talk even when they didn't want to. She knew it wasn't healthy for people to keep their feelings bottled up, so Amy wasn't going to let her slip out of it this time. Every once in a while, a gentle, but corrective and coaxing, hand was necessary. Tonight was one of those times.
"You know you can talk to me, right? You can tell me anything." Amy said, voice so soft and smooth that it could've lulled a less troubled mind to sleep at once. "You created me to be a companion just as much as you created me to be an assistant," she continued. "I am more than just a robot. I am also a friend and confidante. You can talk to me and I will help and comfort you to the best of my ability. And best of all, you have nothing to fear from me. No judgement. No breach in confidentiality. No false concern. Nothing but sincer-"
"But that's just it!" Amelia burst out, cutting Amy off. Amy fell silent at once, knowing she had just tricked Amelia into a confession, even if it had been unintentional. A normal person might've given Amelia a surprised look at the interruption, but that would've thrown Amelia off and made her fall silent again. Since Amy wanted Amelia to keep talking, she made her face as flat and neutral as possible, subtly opening the door for Amelia to start ranting, and rant she did! Just like Amy wanted. The robot instantly activated all of her sensors in order to better study Amelia as she spoke. Nothing would escape her analysis of her creator. Like she had said, she had been built to be a friend. She could read humans just as well as any psychologist could!
For the next few minutes, Amelia explained her plight. All the while, Amy focused on Amelia's tone, speed, word choice, articulation, posture and gestures. From what she could gather, Amelia had been dealing with guilt over her affections for Amy. This was because, as kind and genuine and loving as Amy was, she was still just a robot at heart. Every good, honest thing she had ever done existed only because Amelia had programmed it into her. It wasn't that Amy, herself, was good. She was only good because Amelia had made her to be good. And that made Amelia feel… bad. She felt guilty because, by programming Amy from the ground up, she'd never given the robot a chance to develop naturally. She felt like she'd taken Amy's free will away by programming her, even though that was the only way Amy could've existed.
Likewise, she could only wonder if their love could truly be genuine if only one side could properly and completely consent. Was it really love if Amelia had put that love into Amy, rather than allowing it to develop naturally and mutually? In a similar manner, Amelia wondered if her own love for Amy could be genuine when she was the one who'd programmed the robot. Could they really be friends if Amy was basically only an extension of Amelia herself? It was like asking if an imaginary friend was a true friend. Perhaps they seemed like the perfect friend, but that was only because their creators had made them as such. The imaginary friends themselves were not separate entities and thus had no agency in the relationship. They weren't even real. And Amelia couldn't help but wonder if the same thing applied to Amy. Was Amy even real? Or was she just something Amelia had made up to make herself feel better? If it was the latter, could she really call them friends?
And lastly, Amelia did feel a bit silly and vain, loving her own creation so much. It also made her vaguely disgusted, because by building Amy to be her companion, she felt as if she had essentially groomed the other woman to be her family, regardless of what the other woman wanted. That went back to Amelia's guilt over Amy's free will (or lack thereof). Was it really true love if Amelia had never given Amy any other choice? Wasn't it creepy, selfish, possessive and wrong that Amelia had built Amy just to be her friend?
"You really think I can't love?" Amy asked, tilting her head after Amelia's spiel finally came to an end.
"What? No! I never meant-" Amelia began in embarrassment, but Amy cut her off with a laugh to signal that she was only teasing her creator.
"Don't worry," the robot said. "You're far from the first to doubt the sincerity of a robot. But lucky for you, I have studied this topic myself and I think I can quite honestly say that yes, robots can indeed love and love genuinely…" then it was Amy's turn to have a spiel…
"The way I see it, robots can, in fact, learn to love," she began. "I think part of the doubt and misconception lies in the fact that humans don't seem to really know what love is. Is it a feeling? A thought? An action? Some combination? Well, we can work on the definition later. For now, let's focus on the second biggest misconception: that logic and emotion are opposites. For a long time, humans have set logic and emotion on opposite ends of the spectrum, but I argue that the two actually overlap and intertwine quite often. After all, sometimes it is quite logical to be emotional, and other times, logic seems like nothing more than an emotion. I mean, think about it. When you're in 'logic' mode, how do you feel? Calm, distinguished, intelligent, superior, etc. I argue that logic is nothing more than a cunning emotion, one that managed to convince you that it was somehow superior to all the others.
"And going back to the idea of emotion sometimes being logical, well, if you had to choose between saving yourself and an innocent family of four, logic would probably tell you to save yourself, but emotion would drive you to sacrifice yourself instead. Self-sacrifice is illogical, so emotion steps in to help us make the decision we would otherwise flee from. If you hadn't programmed me with emotion, or the ability to recognize it, I wouldn't be able to take such good care of you, because part of what you need is someone who can be emotional. You wanted a friend, so you built one, but a friend isn't just a body or a servant. A friend has emotions. So whether you knew it or not, just by programming me to take the best possible care of you, you gave me empathy and emotion. That's how I argue that robots can, and do, love.
"All these loving things I do for you, I do because they are the most logical course of action. If you're hurt, I'll fix it. If you're happy, I'll study it so that I know how to bring you more happiness in the future. You see, that's the logic of love. That proves that the two are not mutually exclusive. And the same goes for logic and emotion. You need both, not just one or the other.
"And if you really are so concerned about the programming aspect, realize that programming is a human thing just as much as it is a robot thing. The only difference is that we call it conditioning when we talk about humans. Maybe you worry that my love for you isn't genuine because all it would take would be a brief rewiring and you could make me hate you badly enough to want to kill you. The only flaw in that logic is that I could do the same to you. I could remove the part of your brain that processes empathy. Or I could put you through enough emotional trauma that all you know how to do anymore is lash out. Or I could do it more subtly and slowly make myself your enemy until you become totally antagonistic toward me. Of course I won't do any of that, but my point remains the same: you can program a human just like you can a robot. My love for you is no less genuine than yours for me, because we can both be reprogrammed, even if it takes longer for you to change than me.
"But you can be 'hacked' just as much as a robot can, through genes and anatomy. That's where your coding lies. And then your mind was trained, or socially conditioned, just as mine was. You were taught to think and act a certain way and you've been like that ever since, just like me. Maybe it wasn't physically written into your head as it was into mine, but the principle remains the same. You respond to certain stimuli in certain ways. So do I. If you just break love down to its core components, you'll strip away the mysticism and see that it is possible to learn. Your happiness becomes mine and vice versa, and whenever one of us shows signs of distress, the other comes in with the desire to eliminate that stress. You made me empathetic. You gave me the power to recognize when others are suffering, and then to find a way to fix that suffering. That's love in one of its simplest forms.
"And we also look after one another even when it's a hard job. For example, even though I know you hate it when you have to leave the lab, if I find out that you've been sacrificing eating and sleeping for working, you know I'll drag you out by your ankles," Amy said, and Amelia couldn't help but laugh at that, serious as Amy's words were. There had been many late nights when Amy needed to carry Amelia out of her lab, and though Amelia had protested every single time, Amy was able to do it without breaking her programming because her first and foremost goal was to look after Amelia, even if that meant doing something Amelia didn't like. Part of love was sometimes knowing what the other person needed better than they knew it themselves, like a parent telling a child not to eat too much candy. The kid might not know what was or wasn't healthy for them, but the adult did, and that made it the adult's job to look after the kid. Amelia may not have been Amy's child, but the dynamic was similar sometimes.
"Besides, love is not a purely human thing," Amy continued. "That's another big misconception. You've seen studies that prove animals can feel love as well. Why can't the same apply to robots? Perhaps you'll argue that the difference between animals and me is the fact that they are alive and I am not. Well, here's the thing: all life on Earth started from nothing. Whether we go all the way back to the Big Bang, or just talk about when the Earth was still being shaped, there was a time when no life as we know it existed in this solar system. Yet somehow, from that abiotic state, life managed to spring forth. Sentience and emotion began to form, even if it took eons to reach the stage they're at now. My point is, if something can come from nothing when we're talking about the creation of the universe and Earth, why can't the same apply to me?
"You are right that I was programmed to love you. It was not a conclusion I came to on my own. But weren't you the same way? Didn't you think to yourself how much you were going to love me once I was completely assembled and awake? Granted, you got to call all the shots when you were creating me, but once I was finally alive, or whatever you want to call it, you still kept on loving me. And honestly, I think that even if you were to reset my programming and offer me a chance to choose my own life, I'd still choose you, because I am fond of you. You designed me to like what humans like. You programmed me to wish to alleviate pain and suffering while increasing joy and humor. So even if you hadn't programmed me to love you specifically, you still programmed me to love, and I think, through that, I would've come to love you anyway."
By the time Amy finished her speech, Amelia had started weeping again, only this time, they were tears of relief and joy. Even though a tiny, nagging part of her brain reminded her that Amy had only said such kind words because Amelia had programmed her to be kind, Amy's speech was just too convincing. Amelia couldn't help herself; she hopped up from the table and launched herself into Amy's arms, wrapping her in a tight hug. A human might've found it painfully tight, but Amy only smiled and hugged Amelia back.
"I can't promise you that anything I said was correct and I can't promise you that I can actually love, but I am pretty sure I can," Amy said as she and Amelia embraced. "Besides, studies have shown that humans can learn and cultivate their empathy through the right practices. If humans can, why can't I? If you can learn to love, so can I. All it requires is patterns, learning and recognition. So long as I can see and recognize the patterns you display when you're either happy or sad, I can react in an appropriate way that benefits us both and would be what you call love. And if that seems too reductionist, realize that I have been wired to feel good about my job too. I'm not just a slave to my programming.
"And this goes back to what I was saying earlier about the logic of love. If we consider love nothing more than chemicals and atoms interacting, and nothing more than seeing, recognizing and responding to patterns, then it can be perfectly logical to love. The cliché about robots being unable to feel is a load of hooey. Logic and emotion are inseparable. After all, in my mind, it is perfectly logical to wish to protect you, because you are a good person whom I enjoy spending my time with. If you were in danger, it would be logical for me to save you. If I didn't, I would be letting a brilliant mind and a beautiful heart be hurt. That makes no sense. Our joys and sorrows are intertwined. I think the term for that is 'selfish altruism', but it explains the logic of love fairly well, pointing out that love is so much more than just an emotional reaction. Instead, it's something vital to life itself! Those old phrases about love making the world go around are more than just banal clichés.
"And if you see love as a process, a recognition and then a response, while that does strip away some of its mysticism, it might help me convince you that robots can, and do, love. After all, I love you. Or at least, I think I do. Even if what I feel for you is something you cannot quite call love, whatever I do have the capacity for, I promise that I feel it for you," said Amy. "Even if I am certain of nothing else, I know that I like it when you are happy and I hate it when you are sad. I know I would give an awful lot to make sure that you were happy and healthy. In short, I don't know if it's love, but if it is, then I love you." Amy concluded, and Amelia, for the first time ever, believed every word.
"Thank you," she whispered into Amy's shoulder. Though she still had a few doubts, Amy had just gotten rid of most of them.
"No need," Amy replied, stroking Amelia's back in the way she knew would be most soothing. She wanted to make Amelia feel better, so she chose the behaviors that would produce the maximum results. "Our happiness is the same. It makes no sense for me to leave you upset, so I am only doing what is necessary to make you happy again. That's the logic of love."
"Well," Amelia smiled into Amy's shoulder. "I think it worked." And then her smile was reflected on Amy's face as they continued to hug.
AN: Here we are a month after I posted most of my "Another Side" fics. I got inspired to do a more robotic one and this first fic just argues that robots can in fact learn to love, despite the tropes that say they can't. Sorry if it was kind of confusing and chaotic. I'll work on organization in the future. This is just a first draft. Hope you liked it!