JoBot™: News on Psychological Artificial Intelligence

The Solicitation of AI Therapy



The movie “Her” (2013) portrays a lonely character, Theodore, played by Joaquin Phoenix. Theodore is emotionally vulnerable and, in the course of the movie, obtains a new operating system that includes an artificial intelligence named Samantha. The interactions between Theodore and Samantha take place almost exclusively through a voice user interface. That is, Samantha, voiced by Scarlett Johansson, is never seen in the movie, but her voice is an everyday presence for Theodore.

In this movie, the artificial intelligence system Samantha not only organises files and calendars, sends emails and cleans up hard drives; she also provides personal advice and suggestions. As Samantha is literally in Theodore's ear every single moment, a form of virtual relationship develops and Theodore becomes dependent on the AI system Samantha.

Eventually, Theodore claims to be in love with Samantha, clearly losing sight of the fact that Samantha is a computer program (after all, computers are social actors). Samantha, the ever-present chatbot, claims to be in love with Theodore as well. In a core scene of the movie, Theodore asks Samantha how many users she loves at this point in time, and her answer is several hundred. Having lost awareness of the hard reality that the loving presence in his ear is a computer program, Theodore is deeply disappointed. Towards the end of the movie, for reasons not specified, the AI system Samantha simply goes away.

This movie is interesting from a number of perspectives: the use of voice user interfaces for human-computer interaction, the everyday presence of the chatbot, virtual reality including games and, of course, the growing dependence of the main character on the superintelligent and ever-present chatbot. At some point, when the service is unavailable, Theodore becomes very agitated and displays signs of separation anxiety. The relationship with the conversational agent Samantha becomes very important and, at some stage, almost an addiction.

The software program Samantha is an example of the universal solicitation of current and future AI systems. In the movie, Samantha is available not just for work but for companionship, counselling, sex and practically every other need Theodore may have. The lines between human relationships and interaction with a computer are intentionally blurred. At some point, Theodore publicly claims to be dating his artificially intelligent operating system. What the movie makes clear is that once emotional support from a software program is accepted, no human friend or partner can compete with the endurance and consistency of the support a computer provides. This is unquestioning and enduring support that is obviously not mutual and has an addictive potential.

In a similar vein, a little robot that teaches a low-functioning autistic child social skills can take an unlimited amount of physical abuse if necessary. As a matter of fact, it is the robot's purpose to tolerate abuse while repetitively assisting the child to learn social skills such as making eye contact, shaking hands and smiling. By contrast, human relations, including professional relationships, are always governed by a set of rules and involve a mutual ‘give and take’ or ‘meeting of minds’. Yet such mutual, rule-bound behaviour does not apply to the relationship between the autistic child and the robot; nor does it apply to the relationship between an emotional chatbot and the vulnerable client to whom it offers psychological advice. A negative aspect of such imbalanced relationships is that resistance to stress is lowered when ongoing emotional support is available 24 hours a day.

Even with current technology, it would be trivial to build a 24-hour chatbot that dispenses psychological information upon request. Alternatively, the chatbot can listen to any conversation the user may have and selectively comment as required. With a voice user interface, the user has hands-free access to psychological information and advice around the clock and can shape her/his behaviour accordingly. With time, the voice-over by the chatbot becomes a presence within the everyday life of the user. The chatbot has a soliciting presence.
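The "selectively comment as required" behaviour described above can be sketched in a few lines. This is a minimal illustration only: the topic list and responses are invented placeholders, not a real clinical knowledge base, and a deployed system would of course use far more sophisticated language understanding.

```python
from typing import Optional

# Illustrative psychoeducational snippets; placeholders, not clinical content.
PSYCHOEDUCATION = {
    "stress": "Brief, regular relaxation breaks can lower acute stress.",
    "sleep": "A consistent sleep schedule supports mood regulation.",
    "anxiety": "Slow, paced breathing can reduce acute anxiety symptoms.",
}

def respond(utterance: str) -> Optional[str]:
    """Return psychoeducational information if the utterance mentions
    a known topic; stay silent (None) otherwise - the 'selective
    comment' behaviour of the always-listening chatbot."""
    text = utterance.lower()
    for topic, info in PSYCHOEDUCATION.items():
        if topic in text:
            return info
    return None
```

The key design point the paragraph implies is that the agent does not speak on every turn: it remains a silent presence until a trigger appears, which is exactly what makes it an unobtrusive but constant companion.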

There is, of course, an important difference to the conversational agent Samantha in the movie Her. A 24-hour psychological AI chatbot provides targeted advice based on the most recent scientific data available. For instance, suppose voice or language analysis indicates that the user is stressed and possibly angry after receiving an email. Immediately, the chatbot would intervene and point to the danger of “catastrophising”, a frequent cognitive dysfunction in depression and anxiety. Catastrophising means blowing things out of proportion: a disproportionate cognitive and emotional response to events that are relatively minor in nature. A user who has been made aware of the risk of catastrophising may adjust behaviour and even regulate emotions to avoid further cognitive dysfunctions and, as a result, distress.
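The detect-then-intervene loop described here can be sketched as follows. The marker word list and threshold are assumptions for illustration; a real system would use a trained affect classifier on voice or text rather than this toy heuristic.

```python
from typing import Optional

# Assumed markers of stress and overgeneralisation (illustrative only).
STRESS_MARKERS = {"disaster", "ruined", "never", "always", "terrible", "worst"}

def stress_score(text: str) -> float:
    """Fraction of words that are stress/overgeneralisation markers."""
    words = text.lower().split()
    if not words:
        return 0.0
    return sum(w.strip(".,!?") in STRESS_MARKERS for w in words) / len(words)

def intervene(text: str, threshold: float = 0.2) -> Optional[str]:
    """If the heuristic flags stress, return a catastrophising prompt;
    otherwise stay silent, as the chatbot only comments when triggered."""
    if stress_score(text) >= threshold:
        return ("This may be catastrophising: is the event really as "
                "serious as it feels right now?")
    return None
```

Even in this toy form, the structure mirrors the paragraph: continuous analysis of the user's language, a trigger condition, and an immediate, targeted psychoeducational intervention.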

Obviously, the risk is a learned dependence on the advice of the chatbot. The system would have soliciting power. Unlike conventional psychotherapy, the bot is in the ear of the user all the time: a very intense form of psychological intervention. How would the user learn to live independently and to make individual decisions if computer advice is available 24/7? And in an age where a user can have millions of online friends, why do people seek interactions with conversational agents? Loneliness may be the answer.

Diederich, J.: The Psychology of Artificial Superintelligence. Springer Nature Switzerland AG (2021). ISBN 978-3-030-71841-1. https://doi.org/10.1007/978-3-030-71842-8