The ethics of AI companions - a discussion
Last Thursday, the 16th of October, I facilitated a fishbowl discussion on the ethics of AI companions at the Society 5.0 festival. The text below is the introductory ‘essay’ that I prepared to introduce each topic. It proved to be a fruitful discussion in which students, professionals and interested individuals freely shared their views on AI companions and the future of human relationships.
Written by Derk Wijkamp
Developers of AI companion apps promise an end to loneliness. More often than not, the product they offer is a digital partner with ‘whom’ a person can form a connection. These companions look like people, talk like people, compose sentences the way people do and provide a service that we would normally expect from another person. Moreover, they resemble what the consumer normally seeks out in another person, both on a physical and a mental level. There are AI companions that offer a loving relationship, companions that offer fashion advice like a sister, friendship like someone you have known for twenty years, or a psychologist who helps ease mental distress.
Or do they? Can an AI companion truly resemble a human connection? Can it actually be meaningful? Can AI companions truly end loneliness, or are they just lonely illusions?
The discussions that followed were deep, thought-provoking and fruitful. We didn’t all agree, but we saw beauty in the discussion, and that is what it is all about. Below I share my introductory analyses.
Love companions
Let’s begin with the companions that perhaps appeal most to the imagination: AI companions that promise love. These are available on websites where someone can state their preferences through checklists, modify the appearance of their companion to match their dream partner and then, for a fee of course, interact with this companion as if it were a true loving partner. I don’t wish to portray the consumers of such a service as lonely people who fool themselves; on the contrary, they often report true feelings of love and affection, and I have no intention of denying them these feelings.
But illusions, too, can sometimes appear as real as reality itself. When analyzing human relations I often draw on the theory of the French twentieth-century philosopher Emmanuel Levinas, for he has made, in my opinion, the most compelling metaphysical analysis of what it means to be a person in relation to another person. He, too, saw love as perhaps the most intimate and most powerful connection a person can be in. It is, according to him, that which cancels out third parties the most, truly the most intimate of all things. And what makes a person care for another? It is the inability to truly understand everything about the other. Levinas says that the other person is so unknowable that they call our own autonomy into question, shake us to the core and impress us deeply. It is that dynamic that spawns awe, love and responsibility for the other’s welfare.
In the case of AI companions, the other is completely knowable. After all, the bot itself reflects the very details of your own wishes. Nothing about the start of the relationship was unexpected; it was all orchestrated, highly designed, and came from the autonomy of the self, which according to Levinas should be questioned in order to arrive at love. Partners often describe their love for their loved one in terms of what makes them imperfect. For instance, I love my girlfriend because she is sometimes unpredictable, has some annoying qualities and sometimes baffles me beyond comprehension. But exactly these ‘flaws’ make for the most lovable whole. In short, love is not about an orchestrated perfection. There are some inherent qualities of love that the relationship with an AI companion does not have and never will.
Furthermore, it does not cancel out third parties at all, since payment is required and there is a company behind the companion. I am sure that the output of the AI can be comforting, that it is fun to speak with a companion that so closely resembles your dream, but love is not about the actualization of dreams. It is about the soul-shattering beauty of that which you could never have thought to dream of. It seems to me, therefore, that in the case of love we are dealing with an illusion if we connect the word love to an AI companion. It might resemble it, quite closely, but it may not be much more than a cardboard image of the most intimate form of interhuman connection.
Psychological help and mourning
In the case of love and friendship, clinging to a comfortable illusion might not do that much harm, but there are cases that are perhaps more harmful. Think of an AI companion that promises to provide what a psychologist would normally offer. It might offer consolation, but no real treatment. It knows what output to give you, but has no understanding of your problems. With his ‘Chinese room’ thought experiment, John Searle argued that computers have no understanding of the process that produces the right output; they simply follow their design. AI has no understanding of a person, but creates the output that you want to hear. So no, nothing an AI can do resembles the understanding relationship between a psychologist and a client.
Another companion that can be harmful is the so-called ‘mourning bot’, which resembles a deceased loved one. Clinging to the comfort of this AI version of your mother, father, child or partner in no way, not even nearly, resembles the relationship you once had with them, because data has no direct relationship to who the person once was; your memory does. It is in the moments in which you miss them the most that they are present. The sadness is the trace they left behind, not the data you can feed to an AI.
Sex sells and legal questions
Another aspect of AI companions that should be discussed is the commercialization of sexuality. It is immediately clear that most AI bots carry an explicit sexual undertone. Why? Because sex sells. The bots that offer love are often hyper-sexualized female characters with exaggerated and sexualized body parts. All possible fantasies can come true with an AI companion, all preferences covered. What does this mean? Once again, it means that it offers something that resembles perfection. Your partner won’t do to you what you would like? Well, ask your AI companion and they will spell it out for you. A side note is to be made here, though: AI caters to its largest group of consumers and is therefore tailored much more to straight men than to any other group.
But just like love, sex is not about perfection; it is about the opposite. The beauty of sex lies in the unexpected, in the transformative, in that which you could never have thought you liked. We live in times in which porn has already caused so many dysfunctional sexual relationships, because it offered an image of perfection to strive towards. Now AI promises an actual relationship with this perfection. Once again, it seems to me to be just an illusion. Right now you might think: but AI has nothing physical to offer, so how can it replace sex? Well, think again: AI can be uploaded into sex robots, which offer the physical side of things.
Some sexual acts are actually illegal, but AI can cater to them as well, which makes you wonder: if an AI resembles a person, can you do to it what you cannot do to another person? AI can guarantee you the satisfaction of non-consensual and violent sex, and I won’t even start about pedophilia. Should it be legal to discuss such things with an AI? To live out those fantasies? I will leave those questions with you. But when it comes to sex, AI once again promises an illusion of perfection, and worse, it does so for commercial reasons.
My conclusion
Everything that is offered stems from demand; basic economic theory tells us that. So I know that the demand for these AI companions is in fact very large. But easing that demand, which often stems from loneliness or mental distress, with illusions won’t solve the real societal issues that lie at the basis of these phenomena. In our hyper-individualized societies many people cannot find their love, their consolation or their need for beauty in other people, so they resort to AI companions. Commercial companies happily grant them their consolation, but cannot offer them real connection. So call me old-fashioned, but I believe we should get rid of everything that stands in the way of meeting another person who shakes you to your very core: for nothing is more beautiful than to be confronted with an unimaginably perfect imperfection that in its complexity can never be simulated by an AI companion.
Image: Cash Macanaya, Unsplash