I am a Bot?

Us(c)hi Reiter on Anthropomorphized and Stereotyped Bots

published in the June edition of Versorgerin

We talk to cars, animals, devices, and plants. We humans have always had a tendency to anthropomorphize things and animals. But today the things are responding. Software programs are taking on the form of persons or characters and entering our world through games and social media. We are not always aware of this and interact with them as we would with other humans. When “social bots” are mentioned, this means programs that digitally support us or show up in social networks disguised as humans and actively participate in our communication. They are even said to be responsible for reinforcing certain opinions or tendencies in social media channels. Here the question arises: who is the real author, the bot or its programmers? How autonomous must a bot be for independent action to be attributed to it?

However, the anthropomorphization of algorithms also highlights another problem. Inherent in the design and representation of digital assistance services, social bots, and humanoid robots is the danger of reproducing stereotypical social roles – not least gender roles. The way bots learn from users today does not change prevailing patriarchal structures and role expectations. Instead, there is a danger that we get stuck in a loop.

In the view of the author Laurie Penny,i the question of which names, voices, and responses robots are designed with is not just an abstract academic one. She sees a connection between how fembots are treated in our society and how real women are treated. Many bots that appear on the market, and that do not necessarily need a gender, are furnished with smart female personalities. These include Microsoft’s Cortana, Amazon’s Alexa, and Apple’s Siri.

Microsoft made a clear decision by giving its intelligent software the name and voice of Cortana, a character from the Microsoft video game Halo 2.ii As in many mainstream video games, the female figure Cortana assumes a problematic role here: strong on the one hand, but devoted and beautiful on the other – and dependent on a mastermind. Cortana’s clean blue circuit-board skin cannot really distract from the fact that this figure is effectively naked, while her male colleagues hide in the armor of modern robot-knights. The wet dreams of post-pubescent Silicon Valley programmers?

Even though the Cortana app, unlike its model, remains just a blue, speaking dot with a personality, the functions and possible verbal interactions of this and similar apps often reproduce stereotypes of female subservience. In digital space as well, outmoded ideas of gender roles seem to be gaining the upper hand at the cost of possible diversity.

The research field of artificial intelligence has a problem of its own. In the interview “Inside the surprisingly sexist world of artificial intelligence,”iii Marie des Jardins, professor of computer science at the University of Maryland, says: “Since men have come to dominate AI, research has become very narrowly focused on solving technical problems and not on the big questions.”

Most bots with personality and natural-sounding female voices remind, assist, note, search, behave well, function – and do not speak unless spoken to. They do not say no. However, they also put their “progressive” parent companies in a moral predicament: a common practice in dealing with virtual personal assistants is to verbally challenge them and probe their boundaries.

This is what Leah Fessler, an editor at Quartz magazine, set out to do in order to find out how bots react to sexual harassment and what the ethical implications of their programmed responses are.iv She collected extensive data. The result is alarming: instead of combating abuse, every bot, through its passivity, contributes to reinforcing or even provoking sexist behavior. As she posted on Twitter:

“I spent weeks sexually harassing bots like Siri and Alexa – their responses are, frankly, horrific. It’s time tech giants do something.”v

Companies allow the verbal abuse of (fem)bots by users without limitation and thus help behavioral stereotypes persist and perhaps even intensify. Abusive users may come to consider their actions normal or even acceptable.

Submissive and Subservient

Companies like to argue that female voices are easier to understand, and that this is why they give their digital assistants female voices. Even the historical tradition of women working as telephone operators, connecting calls,vi and thus assuming an important role in the history of technology, is used as an argument for why giving bots a female character is positive. The claim that small speakers cannot reproduce lower-pitched voices as well also serves as a justification for the preferred use of female voices. These are all myths, as Sarah Zhang explains in an article: studies and technical investigations show that none of this is the case.vii

With all the hype around artificial intelligence since it came out of hibernation, one could think that robots and bots are already autonomously acting entities thanks to “deep learning.” Yet algorithms remain heavily dependent on human trainers and designers.

Bots in Fiction

It is interesting that the word “robot” was first used in the Czech science fiction play R.U.R. from 1920.viii It derives from the Czech robota, meaning drudgery or servitude. The robots in the play were not mechanical but human-like, made to carry out work, until, as an oppressed class, they began to rebel against their masters. From antiquity up to modern times, the artificial woman has also embodied the idea of the ideal woman. That is still the case in science fiction up to the present – new films and series hardly deviate from these historical inscriptions of the idea of woman.

In the HBO series Westworld, released in 2016, too, female robots are created, controlled, used, and abused by men. The series is based on Michael Crichton’s eponymous film from 1973. In a theme park that replicates the Wild West, human-like robots (hosts) accommodate the – apparently – most deeply human needs of their guests. What is brutal and abysmal about humans themselves emerges again and again.

In Westworld, experiencing an adventure means doing things that cannot be done without inhibition outside the artificial world. This includes, among other things, sexual violence. The female characters are very different from the female killer-sex-robots of the Austin Powers films, which are meant to be funny yet remain simply tasteless, with their inflated breasts, shooting everything up. The growing self-awareness of the Westworld fembots at least means that the endearing, innocent woman (the figure Dolores) and the competent brothel madam (the figure Maeve) begin to question their creators over the course of the series and start to defend themselves. The series’ stereotypical female roles are opened up by intelligent dialogue and at least prompt reflection.

Fembots in films and TV series often deal with dynamics of power, sexuality, and our relationship to technology. But what about the bots we encounter more and more often in everyday life? The trend clearly points toward machines that are as human and as “feminine” as possible. The real reason Siri, Alexa, Cortana, and Google Home have female voices is that this is meant to make them more profitable and to cater to proven customer preferences.

Apart from the question of what our relationship to machines could look like, and whether it is necessary to anthropomorphize them at all, it would also be satisfying to imagine that every offensive harassment or presumptuous question put to a digital assistant leads to a system crash or, perhaps even better, to fits of rage reinforced with quotations from feminist manifestos.

In any case, this is precisely where a field for experiment and art opens up, which could introduce an important different perspective outside the logic of the market.