Have you ever wondered why the Siris, Cortanas and Alexas of the world all have female names and voices? What is it about personal assistants that makes consumers prefer a reassuring female voice when asking where the nearest pizza place is, cracking a joke or checking the day’s weather?
In the past, I’ve spoken about sex robots and the fact that a big part of the R&D going into AI in the sex industry is about developing realistic sex dolls mainly for male pleasure, which raises ethical questions about the portrayal of women. I have also spoken about what the role of women will be in a seemingly realistic future where much more developed personal AI assistants may have romantic and sexual relationships with their human users, as portrayed in the movie Her, as well as about the dangers of algorithmic bias.
The tech industry’s use of female personal assistants is just another manifestation of the sexism that is inadvertently projected onto smartphone and computer users, and it is a reflection of the sexism that still exists within society.
Ask Siri if she is a man or a woman and she’ll come up with a variety of responses, all while avoiding giving a clear answer. Ask her if she is a feminist and she’ll respond “I find that all voices are created equal, and equally worthy of respect”. Notice that she says voices and not people. However, her voice – by default – is still feminine.
Ask Google Assistant, and the default voice (a feminine one) will say that she tries to be neutral, or that her gender is simply “digital”. If you ask whether she is a feminist, she will not say yes or no; instead, she quotes Margaret Atwood (the author of The Handmaid’s Tale).
“Women’s rights are human rights because women are human. It’s not a hard concept.” – Margaret Atwood, in response to a question asked in an interview with The Irish Times.
Why the mild attitude towards feminism? Why the meandering, wimpy responses to a politically charged question? Why are the female personalities built into our phones and computers designed to dodge these kinds of questions, or to respond humorously to rude or lewd commands? They are pretending to be human, yet they deny their gender.
Tech companies don’t impose a female stereotype only because the men designing these assistants probably all have female secretaries and assistants. Experts have found that robots with gender cues are easier for humans to relate to than those with robotic or neutral voices. In any case, when we speak to another entity in our own language, we assign it a gender as a natural way of connecting with other “living” things (or at least things that seem to be living, and which are therefore generally either male or female). Tech companies also have marketing research showing that consumers simply expect their digital secretary to have a female voice, so the default voice preprogrammed into our phones and computers is female (though some software companies have begun to provide male voice alternatives).
But why does it matter that AI personal assistants are always female? Many of you might be thinking that it’s only natural, since human secretaries and assistants working in offices and on customer service hotlines are mostly women. Isn’t it just a reflection of how society is? Women may simply have calmer, more soothing voices, carry out organizational tasks more effectively and, in the general mindset, be easier to deal with than men. Besides, what is more female than subservience?
In comes IBM Watson. Most of you have probably heard of him. Watson is a super-smart question-answering system. In 2011, Watson competed on Jeopardy! against legendary champions Brad Rutter and Ken Jennings, and won the first place prize of $1 million.
Since that wasn’t enough, and in order to give Watson commercial value, the good people at IBM decided to send him to… medical school!
Watson now makes utilization management decisions in lung cancer treatment at Memorial Sloan Kettering Cancer Center in New York City, effectively telling about 90% of the nurses who use him (who are overwhelmingly female) what to do.
Of course Watson got to appear on major TV game shows and then go on to treat cancer: he’s been gendered as male. Alexa, Cortana and Siri? They’re better off sitting on our phones and computers as personal assistants, sorting our email, making our appointments and graciously absorbing the sexual harassment “jokingly” inflicted on them by their users.
However, Watson is not the only male AI. There is also Ross, the robot lawyer (who, though marketed as a lawyer, carries out the lowly functions of a paralegal), and Einstein (again, a man), Salesforce’s Business Intelligence AI that helps companies make important business decisions. Salesforce, why not Marie Curie, Hedy Lamarr, Dorothy Hodgkin or Jane Goodall? Well, silly, everyone knows who Einstein is and how smart he was!
This raises an unsettling question: is the technology we are making and consuming unintentionally reproducing narrow-minded, stereotypical and sexist prejudices? And are enough of us even conscious of what is going on?
Kriti Sharma, a 31-year-old AI expert and data ethics activist, thinks that this tendency is unacceptable. She designed a gender-neutral business finance chatbot called Pegg, challenging the idea that robots have to assume some kind of gender.
Most tech companies are creating female voice assistants and male super robots because this sells better. But it doesn’t necessarily have to be like that. Sharma argues that robots don’t need to pretend to be human; they just need to be effective and useful. Gender is not a necessity for the robots of the future. After all, gender is a social construct.
But even if it were, why not take a courageous step forward and revolutionize gender roles as applied to robots? Why not mix it up a little, cause some confusion, and awaken consumer curiosity to the possibility of being served by a man and seeking advice from a woman? I think most consumers would actually react positively to an unexpected twist in the plot.
Agree? Disagree? Please leave your comments!