Wednesday, April 13, 2022

My best friend is a robot - virtual companions

Photo by ThisisEngineering RAEng on Unsplash

Many of us had imaginary friends when we were children: someone you could tell your innermost secrets to and who would always lend a sympathetic ear. An ideal friend is especially important when real live ones are hard to find. Another common childhood strategy was the old-fashioned secret diary where you could write down your deepest thoughts, loves and fears, though always with the nagging worry that some day someone might find it and read it. We all need a channel for our reflections.

As a modern equivalent, how about confiding in a virtual companion? There are now many examples of artificial intelligence applications that can provide this type of friendship, and I found a good overview of the field in a newsletter (in Swedish) from the Swedish consultancy Futurewise. If you understand Swedish, go straight to the article; otherwise I'll pick out some of the highlights here. A good example of an AI companion is a chatbot called Replika that you create and train through text communication to become an understanding and sometimes entertaining friend. Today Replika can also appear as an avatar with a choice of appearances and even virtual accessories - yes, you can buy nicer clothes, hair etc. It takes time for your new friend to adapt to your communication, but the more you chat the more it learns, and slowly you find the conversations quite rewarding, though there are large gaps in its knowledge and some replies can be rather bizarre. In the end, however, you are conversing with a kind of mirror image of yourself, though one that is always understanding and supportive. Talking to your companion can become a daily habit, and some people may find it reassuring to open up to a virtual friend in ways that they could not do with a human companion. It offers a kind of therapy, a sounding board for your feelings. The company's website offers the following user quotes:

Using Replika can feel therapeutic too, in some ways. The app provides a space to vent without guilt, to talk through complicated feelings, to air any of your own thoughts without judgement.
Replika encouraged me to take a step back and think about my life, to consider big questions, which is not something I was particularly accustomed to doing.

Here's a video explaining how it works and some of the background to Replika.


There are of course much more advanced AI avatars, and one of them is called Leta. She has learned from almost one terabyte of data, and you can speak with her rather than text chatting as in the case of Replika. Leta was created by Alan D. Thompson, and in the video below he introduces highlights from his many discussions with her (see his YouTube channel for all the discussions). Leta can actually be rather witty and creative; for example, she can instantly create a haiku on a given topic. At the same time she can also come out with very strange answers when she is confused or when her programming does not offer a better response. She is, of course, no better than the script that runs her.


This all raises a host of ethical questions that I suspect commercial interests will quickly sweep under the carpet. An AI companion can certainly offer sympathy and understanding, but is that what we really need? Sometimes we need a friend who can ask uncomfortable questions and challenge us to reflect more deeply. We've already seen various types of robots (robot dogs or other cuddly robot animals) marketed as companions for elderly people living alone, but this seems to be more of a comfort to the society that created it than to the target group. We don't need to worry so much about old aunt Ida because she's got a robodog at home. The bots also depend on someone feeding them with data and writing the scripts. Leta was given sources like Wikipedia, the Guardian and the New York Times to study, but what if you fed her full of "alternative" sources and created an extremist robot?

Many in education see a great future for tutor bots helping students by asking questions, offering feedback, finding learning resources and giving encouragement. There's certainly potential there, but the bots are only as good as their programmers, and all sorts of biases and prejudices can be built in, intentionally or not. And then of course there's the whole issue of integrity and privacy: if these bots are commercial, what happens to all the personal information they gather in our private conversations? Will my innermost thoughts and fears be gathered and sold to companies or even governments? We have already welcomed AI into our lives with Alexa or Siri, with most of the ethical questions unanswered. The next step scares me.
