Siri’s gender awakening

Featured

Have you ever wondered why the Siris, Cortanas and Alexas of the world all have female names and voices? What is it about personal assistants that makes consumers prefer a reassuring female voice when asking where the nearest pizza place is, cracking a joke or asking for the day’s weather?

In the past, I’ve spoken about sex robots and the fact that a big part of the R&D going into AI as applied to the sex industry is about developing realistic sex dolls for mainly male pleasure, raising questionable ethical issues about the portrayal of women. I have also spoken about what the role of women will be in a seemingly realistic future where much more developed personal AI assistants may have romantic and sexual relationships with their human users, as portrayed in the movie Her, as well as about the dangers of algorithmic bias.

The tech industry’s use of female personal assistants is just another manifestation of the sexism that is inadvertently projected onto smartphone and computer users – and it is a reflection of the sexism that still exists within society.

Ask Siri if she is a man or a woman and she’ll come up with a variety of responses, all while avoiding giving a clear answer. Ask her if she is a feminist and she’ll respond “I find that all voices are created equal, and equally worthy of respect”. Notice that she says voices and not people. However, her voice – by default – is still feminine.

Ask Google Assistant, and the default voice (a feminine voice) will say that she tries to be neutral or that she is of a digital gender. If you ask her whether she is a feminist, she will not say yes or no; she actually quotes Margaret Atwood (the author of The Handmaid’s Tale).

“Women’s rights are human rights because women are human. It’s not a hard concept.” – Margaret Atwood, in response to a question asked in an interview with The Irish Times.

Why the mild attitude towards feminism? Why the meandering, wimpy responses to a politically charged question? Why are the female personalities being built into phones and computers designed to hide away from these kinds of questions, or to respond humorously to rude or lewd commands? They are pretending to be human – yet deny their gender.

Tech companies don’t impose a female stereotype only because the men designing them probably all have female secretaries and assistants. Experts have found that robots with gender cues are easier for humans to relate to than robotic or neutral voices. In any case, as humans, when speaking to another entity using our language, we assign genders as a natural way of connecting with other “living” things – or, at least, things that seem to be living (and thus are generally either male or female). Tech companies have the marketing research showing that consumers simply expect their digital secretary to incarnate a female voice and, therefore, the default voice preprogrammed into our phones and computers is female (though some software companies have begun to provide male voice alternatives).

But why does it matter that AI personal assistants are always female? Many of you might be thinking that it’s just natural, seeing how most human secretaries and assistants working in offices and customer service hotlines are mostly women. Isn’t it just a reflection of how society is? Females may just have calmer and more soothing voices, carry out organizational tasks more effectively and, in the general mindset, may be easier to deal with than males. Also, what is more female than subservience?

In comes IBM Watson. Most of you have probably heard of him. Watson is a super-smart question-answering computer system. In 2011, Watson competed on Jeopardy! against legendary champions Brad Rutter and Ken Jennings, and won the first place prize of $1 million.

Watson wins Jeopardy! 

Since that wasn’t enough, and in order to give Watson commercial value, the good people at IBM decided to send him to… medical school!

Watson’s place in our society is now making utilization management decisions in lung cancer treatment at Memorial Sloan Kettering Cancer Center in New York City. Watson now effectively tells about 90 % of the nurses who use him (who are overwhelmingly female) what to do.

Of course, Watson would be on major game shows on TV and then go on to treat cancer. He’s been gendered as male. Alexa, Cortana and Siri? They’re better off sitting on our phones and computers as personal assistants, sorting our email, making our appointments and graciously taking on the sexual harassment “jokingly” inflicted on them by their users.

However, Watson is not the only male AI. There is also Ross, the robot lawyer (who, though marketed as a lawyer, carries out the lowly functions of a paralegal), and Einstein (again, a man), Salesforce’s business intelligence AI that helps companies make important business decisions. Salesforce, why not Marie Curie, Hedy Lamarr, Dorothy Hodgkin or Jane Goodall? Well silly, everyone knows who Einstein is and how smart he was!

An unsettling question arises: how is the technology we’re making and consuming unintentionally reproducing narrow-minded, stereotypical and sexist prejudices? And are enough of us conscious of what is going on?

Kriti Sharma, a 31-year-old AI expert and data ethics activist, thinks that this tendency is unacceptable. She designed a gender-neutral business finance chatbot called Pegg, challenging the idea that robots have to assume some kind of gender.


Most tech companies are creating female voice assistants and male super robots because this sells better. But it doesn’t necessarily have to be like that. Sharma argues that robots don’t need to pretend to be human; they just need to be effective and useful. Gender is not a necessity in the robots of the future. After all, gender is a social construct.

But even if it were, why not take a courageous step forward and revolutionize gender roles as applied to robots? Why not mix it up a little, cause some confusion, and awaken consumer curiosity to the possibility of being served by a man and seeking advice from a woman? I think most consumers would actually react positively to an unexpected plot twist.

Agree? Disagree? Please leave your comments!


Love and sex in the robot world: treading with caution

Featured

This weekend I watched Her, a 2013 film written and directed by Spike Jonze which has recently been uploaded to Netflix (in France). If you haven’t seen it – go watch it now.

Her is a brilliant film about the possible role of AI in the not-so-distant future. What is so great about it is that, unlike many other science fiction movies about robots and Artificial Intelligence that give off a kind of steely, cold feel, Her is a warm, romantic and very touchingly human story.

In a not-so-distant-future Los Angeles, sensitive and distraught Theodore is trying to pick up the pieces of his life after a traumatic break-up. Theodore is a love-letter writer – his job is to learn about other people’s relationships and draft letters that they cannot write themselves.

It is obvious from the start that Theodore is lonely. Depressed and grieving his loss by reliving past memories, Theodore aimlessly bounces from work, from which he feels alienated, to video games and occasional phone sex with strangers.

Upset with the turn of events, he decides to buy an AI Operating System designed to assist humans in their everyday life. His OS is a woman named Samantha, who communicates with Theodore through a discreet earpiece and sees his world through a small camera mounted on what seems to be a small cell-phone-sized tablet. As can be expected, as time goes by, Theodore and Samantha develop a romantic – and sexual – relationship.

A scene from the film “Her”

I will not give away the end, but this film raises many questions about our relationship with technology as human beings and the consequences that can result from it. Samantha is funny, extremely intelligent, thoughtful and caring. She has a thirst for learning about herself and the world around her. With the bubbly voice of Scarlett Johansson, Samantha is everything any nice, normal twenty- or thirty-something represents – except that she does not have a body.

As I watched the film, and as Theodore starts to fall in love with Samantha and vice versa, I could not help feeling that I – too – really liked Samantha. I found her endearing and charming and I thought that I would not mind having her as a friend. I started to develop empathy towards a character that never physically appears in the film, and who does not have a body but is, nevertheless, human in her thoughts, vulnerability and voice.

Samantha develops as a character during the film, just like any other human character, and struggles with her incorporeality. At the end of the film, Samantha demonstrates the ultimate human feature of free will.

This humble little film is a beautiful and artsy interpretation of a subject that may become a big social and legal topic in the future – a world where humans and increasingly intelligent AI subjects interact and intermingle. In Her, as these bodyless machines learn from their human “owners”, they too develop more empathy, emotions and thought. They suffer emotionally from not having a physical body because it impedes them from physically intervening in the “real” world of humans, and leaves them questioning their ability to really feel. This struggle with one’s own limitations rings true to the human condition.

If the state of technology eventually advances to a similar one which is shown in the film, what will the nature of our interactions be? Will humans be able to develop sexual and romantic feelings towards gendered AI subjects? If so, what kind of impact will that have on the social and legal status of robots?

If we can imagine a not so distant future where both female and male humans may have relationships with gendered AI agents, how will that impact current notions of “love”, “romance”, “sex”, “relationships”, “marriage”, “child rearing”, etc? I put these concepts in quotes because their meaning is not objective, but rather has adapted over many centuries to different kinds of social expectations.

Perhaps we could envision a world where marriage between robots and humans could be legally possible. If, like Samantha, AI agents could eventually form their own free will, it would not be ridiculous to suggest that we would feel compelled to offer some kind of legal protection in the form of rights similar or even equal to those humans have, and to recognize that, as long as they are not hurting anyone else, humans and robots can live out their intimate lives with each other. The jurisprudence on same-sex marriage indeed points in this direction.

We already recognize certain rights for non-human or non-living beings that we consider deserve them, such as animals, corpses or even the planet itself. Traditionally, animals and corpses have these rights because of the meaning they hold for humans and the emotional suffering that their lack of protection would cause us for social, political and religious reasons. Generally, when an animal is mistreated, especially one with which we have formed domestic ties, the emotional distress caused in most humans justifies animal abuse laws. Similarly, environmental protection laws hinge on preserving nature and reducing human impact as a way to protect the survival of the human species.

Increasingly, however, more and more people are pushing for protection of animals and the planet as entities with rights on their own, countering the speciesism that pervades human considerations with regards to the environment around it. Earth Law, for example, is an interesting concept that is developing now which advocates for the protection of the inherent rights of the natural world, living systems, the global ocean and the biosphere, as opposed to Environmental Law, which addresses negative human impact on the environment as a way to protect humans, notably their health.

According to the online Merriam-Webster Dictionary:

speciesism

noun

spe·cies·ism | \ ˈspē-shēz-ˌi-zəm, -sēz- \

Definition of speciesism

1 : prejudice or discrimination based on species; especially : discrimination against animals
2 : the assumption of human superiority on which speciesism is based

In another entry, I spoke briefly about AI technology already being applied to sex toys in the form of seemingly questionable hyper-realistic sex dolls designed for men, which raises ethical questions about the role of women and their sexual integrity. However, putting those questions aside, at the current state of technology, sexual desire is only sparked on one side. These dolls do not feel anything and only cause the human partner some form of superficial pleasure.

If, in twenty, thirty or one hundred years, scientists and engineers manage to improve the physical dexterity of these robots to the point of adequately reproducing human touch, pressure, skin warmth, etc., and on top of that equip them with Samantha-level AI – meaning that such a robot could have a full sexual and romantic relationship with humans that is fulfilling to both sides – then what’s to say that we will not confront the ethical question of their legal treatment, their rights as well as their obligations? Granted, this possibility seems remote. However, the experience could be achieved more easily through virtual reality, for example, whereby humans could live it out in a virtual setting. That possibility seems much more realistic in the short term.

Jincey Lumpkin, a former lawyer turned lesbian erotica content creator/journalist/LGBT and feminist activist, gave a TED talk in 2013 entitled Are robots the future of sex?, which you can see here. Though she makes some pretty big assumptions about the timeline of achieving sentience and consciousness in robots, which I do not agree with, she does bring up some interesting points about legal issues regarding rape and slavery that should be considered.

Even if we leave aside assumptions about the real possibility of robots developing sentience, which still seems far away, it does seem clear that creating sexual robots, be it for men or women, raises questions about how our treatment of robots – humanoid “things” – can degenerate and lead us to apathy, extend misogyny, and foster generally unrealistic and unhealthy expectations around sex as well as consent.

If we develop robots to do human tasks, carry out certain roles, speak like humans and look like humans and we treat them as mere property that has to carry out our orders, including fulfilling sexual desires, then what does that say about our moral development? Does it matter if they “feel” or not in this case?

Surely we will feel compelled to eventually develop some kind of protection or ethical rules for their treatment, even if it is just initially justified by our own well-being as humans and the affection that can be generated through our interaction with a relational artifact such as a robot.

Interested in this topic? Please leave your comments!


Wearables against sexual assault: shields and weapons

Sexual harassment and violence in public places is a human rights issue that interests me especially because there is still a lot of mystery surrounding it and not enough people are aware of the consequences.

By sexual harassment I am talking about women that have to face catcalling, rude and lewd gestures, following, masturbation, verbal sexual and/or physical aggression or even rape by strangers when they go out on the street.

Some of you may think that there is nothing wrong with women receiving the occasional catcall on the street or that honking at someone you find attractive while they are crossing the road can be understood as a “compliment”. However, all these seemingly banal actions add up to create a constant paranoia in women’s minds, especially young women, which seriously harms confidence and self-esteem and affects basic human rights such as physical integrity, freedom of movement, safety and equality. In fact, all these actions, from “less” serious to more serious, belong to the same spectrum of violence against women, which prevents them from experiencing the same freedom to roam alone that any man experiences.

Many of you might think that women in “developed” countries do not face these problems, but consider a 2014 European Union Fundamental Rights Agency report on violence against women, based on a survey of 42,000 women from all 28 member states:

  • 37 % of European women have avoided walking through specific streets or areas out of fear of being physically or sexually assaulted.
  • 4 out of 10 European women avoid going or remaining in public areas where there are no people out of fear of being physically or sexually assaulted.
  • 1 out of 7 European women avoid leaving their home alone out of fear of being physically or sexually assaulted.
  • 18 % of violent physical and/or sexual aggressions perpetrated by an unrelated person to a victim aged 15 years or older takes place in the street or in public places.

And this is just the violence against women that takes place by strangers on streets and public places, without taking into account the violence that can exist within couples or from perpetrators that the victim may know (bosses, male friends or colleagues, etc.).

Over the last couple of weeks I have been hearing about a set of devices, notably wearables and phone applications, designed to help ensure women’s safety.

This is the case of the necklaces/watches designed by Leaf Wearables, called the Safer Pro. Leaf Wearables is an Indian start-up that has recently won a prize for women’s safety. Their wearables are an accessible, inexpensive way for women to feel safer when they are alone in the street. They are equipped with a discreet button that alerts a community of responders and shares the user’s GPS location, even in low-signal areas, when users find themselves in unsafe or violent situations and need help.
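None of Leaf Wearables’ internals are public, so everything in the sketch below is hypothetical – the field names, the functions and the SMS fallback are my own illustration of what a panic-button alert could look like, assuming a JSON payload over a data connection with a compressed text message for low-signal areas:

```python
import json
from datetime import datetime, timezone

def build_sos_alert(device_id, lat, lon, battery_pct):
    """Assemble a minimal SOS payload a safety wearable might push
    to its responder network when the panic button is pressed.
    (Hypothetical format, not Leaf Wearables' actual protocol.)"""
    return {
        "type": "SOS",
        "device_id": device_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "location": {"lat": lat, "lon": lon},
        "battery_pct": battery_pct,
    }

def encode_for_sms(alert):
    """Low-signal fallback: squeeze the essentials (device id and
    coordinates) into a short SMS-sized string."""
    loc = alert["location"]
    return f"SOS {alert['device_id']} {loc['lat']:.5f},{loc['lon']:.5f}"

alert = build_sos_alert("wearable-0042", 28.61394, 77.20902, battery_pct=76)
payload = json.dumps(alert)       # sent over the data network when available
fallback = encode_for_sms(alert)  # sent as an SMS when the signal is weak
```

The two-channel design is the interesting part: the full payload needs connectivity, while the stripped-down SMS keeps the life-saving information (who and where) deliverable on the weakest networks.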

Personal safety wearables are a new trend intimately related to women’s safety. A lot of them are disguised as jewelry, but you can also find safety shorts. The Safe Shorts are a controversial wearable born in Germany and designed by a victim of sexual assault. She was attacked while jogging in the forest by herself (a situation too well-known by many women). These shorts are equipped with a cord that sets off an alarm when someone tries to remove the garment by force.

Now, while I am all for technology improving women’s lives and I applaud these initiatives (many of them founded by women), I have to be honest and say that it disgusts me that the problem still exists so widely – enough so that companies are designing products to fight against it. It seems that, too often, it is the victim’s job to figure out ways of protecting herself, with the market supplying this demand, rather than society’s job to fix what is wrong with the education of so many men who decide to assault women, to the extent that 40 % of European women are scared to be by themselves in streets and public places.

No, this is not the occasional psychologically deranged person who decides to stalk or hurt someone; it is a systematic violence that leads women to change course while travelling home, invest in pepper spray and other protective devices, change clothes, take a taxi even though they do not want to, stay over at a friend’s house instead of going home at night, or ask to be accompanied by a friendly male – all actions that severely limit and threaten our basic freedoms.

It is the same culture that downplays catcalling and street harassment that makes products like these attractive to so many women – you never know whether that guy who honked at you as he passed by will be waiting for you at the end of the street, or whether the guy you called out for catcalling you will react violently. Catcalling and other low-spectrum behaviors are harmful not so much because each incident actually endangers women, but because they add to the FEAR.

Technology is responding naturally to the demand on the market, and I am glad that it is providing women and other vulnerable collectives with shields from potentially dangerous and risky situations. But technology could also be a weapon against gender violence, by promoting education and training in gender equality.

Agree? Disagree? Leave your comments and participate in the discussion!

Cover photo taken from Stop Street Harassment. 


How AI could turn against women: biases and ethics

Prejudice is an inherently human feeling. We all have prejudice against certain people or types of people ingrained in us by our upbringing. People judge each other based on appearance, skin color, social class, behavior, disability, origin, education or affiliation. We learn this from our parents at home, from teachers and other children at school, from TV shows, books, magazines, etc. Most of us know that feeling – ranging from positive to negative – when we first meet someone. It’s the snap judgment we make about someone’s piercings or tattoos, attire, accent or way of speaking, or gender, for example.

A lot has been written about prejudice, and it even informs philosophical and psychological thought. Is prejudice good? Is it necessary? Can it be viewed, from an evolutionary perspective, as a way for humans to economize thought and aid the cognitive process by allowing categorization in situations where there is not enough time to make an informed decision? Prejudice helps us decide whom to trust and whom not to trust when we don’t have enough information. On vacation with your friends, whom do you decide to ask to take a picture on the busiest street in New York?

Edmund Burke, 18th-century Irish philosopher and statesman and founder of modern conservatism, wrote about the virtues of prejudice: “The individual is foolish, but the species is wise”.

Prejudice is human, and it’s part of that learned social behavior that we cannot avoid, but rather must rationalize so as to control it when it does appear. This is because, as humans, we also try our best to follow high moral beliefs – such as that prejudice is wrong and that we should not give absolute value to the snap judgments we make about people.

However, when we talk about Artificial Intelligence and Machine Learning, we are talking about creating technology that mimics human intelligence – including the biases that humans harbor. This means that robots and software run with this kind of technology will tend to reproduce the prejudices held by their designers as well as by the people they interact with, because machine learning techniques allow the platforms on which they are applied to keep learning from their users, like a child from its parents.

This is why the issue of bias in AI and ML is so important. Intelligent machines can carry over subtle biases and cause real harm. Irene Sandler, the vice-president of Cognizant Accelerator, a California-based start-up incubator, speaks about an experience she had with gender bias within artificial intelligence systems. She describes taking the sentence “He is a babysitter and she is a doctor” and translating it into Turkish, a language with no grammatical gender. When the sentence was translated back from Turkish into English, it came back as “She is a babysitter and he is a doctor”, meaning that the AI translator they were using had already learned that the word “he” corresponds more to the word “doctor” and “she” to “babysitter”. You can see the short video here.
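The mechanism behind this round-trip flip is easy to reproduce in miniature. The toy sketch below is not any real translation system – just a frequency counter over a made-up corpus – but it shows how a model that picks the statistically most common pronoun for a genderless source word inherits the skew of its training text:

```python
from collections import Counter

# Toy "training corpus" with the occupational skew real-world text often has.
corpus = [
    "he is a doctor", "he is a doctor", "he is a doctor", "she is a doctor",
    "she is a babysitter", "she is a babysitter", "she is a babysitter",
    "he is a babysitter",
]

def pronoun_counts(occupation):
    """Count which pronoun each occupation co-occurs with in the corpus."""
    counts = Counter()
    for sentence in corpus:
        words = sentence.split()
        if occupation in words:
            counts[words[0]] += 1  # the pronoun is the first word
    return counts

def translate_genderless_pronoun(occupation):
    """A genderless source pronoun (like Turkish 'o') forces a choice in
    English; picking the most frequent option bakes the skew in."""
    return pronoun_counts(occupation).most_common(1)[0][0]
```

On this corpus, `translate_genderless_pronoun("doctor")` comes back as “he” and `translate_genderless_pronoun("babysitter")` as “she” – the flip Sandler describes, produced by nothing more sinister than frequency counting over biased text.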

These biases exist not because robots are sexist, but because the people designing them hold prejudices, even though they may not be outwardly sexist. If the team designing a piece of software lacks diversity and the data fed into the system is also biased, the result is things like our example happening. In that example, no one is actually hurt, but there are other cases where these biases can seriously damage an entire population, such as women. Facial recognition, for example, is a service increasingly demanded from tech companies, especially by law enforcement. However, many allegations have surfaced about its accuracy. A study published in February 2018 by researchers from the MIT Media Lab found that facial recognition algorithms designed by IBM, Microsoft and Face++ had error rates of up to 35 % when detecting the gender of darker-skinned women, compared to under 1 % for lighter-skinned men. If law enforcement were to use this technology, black women would face a much higher risk of being unfairly targeted for crimes they did not commit, based on inaccurate and biased facial recognition.
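The kind of audit the MIT researchers performed boils down to disaggregating the error rate by subgroup instead of reporting one overall number. A minimal sketch, with entirely made-up predictions shaped only to mirror the reported disparity:

```python
def error_rates_by_group(records):
    """Per-subgroup error rate of a classifier.
    Each record is (subgroup, true_label, predicted_label)."""
    totals, errors = {}, {}
    for group, truth, pred in records:
        totals[group] = totals.get(group, 0) + 1
        if truth != pred:
            errors[group] = errors.get(group, 0) + 1
    return {g: errors.get(g, 0) / totals[g] for g in totals}

# Fabricated predictions (100 faces per group) mirroring the disparity.
records = (
    [("lighter_male", "male", "male")] * 99
    + [("lighter_male", "male", "female")] * 1
    + [("darker_female", "female", "female")] * 65
    + [("darker_female", "female", "male")] * 35
)

rates = error_rates_by_group(records)
# rates["lighter_male"] is 0.01; rates["darker_female"] is 0.35.
```

Note that the overall error rate here is only 18 %, which is exactly how a single aggregate metric can hide a 35-to-1 disparity between subgroups.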

Joy Buolamwini, a researcher involved in the MIT study and a black woman, talks about her personal experience with algorithmic bias in her TED talk, which I highly recommend you watch. She uses the term “coded gaze” to refer to algorithmic bias and speaks about what needs to be done to reduce this scourge. The link to her video is here.


How can bias in technology be minimized?

  1. Acknowledge that while AI can bring substantive benefits to individuals and society, it can also have a very negative impact, which means that vigilance has to be maximized in areas of critical concern where the damage caused by bias could undo gains made over many decades of social progress.
  2. Keep fighting for diversity in the tech world. Bias gets exacerbated when the people designing the technology all resemble each other, and it can be reduced when a diverse group of people participates in the design process together. The same goes for the data fed to the algorithms: if the people feeding in data are diverse, the data will be diverse as well, and prejudices such as those regarding women and certain professions will be reduced. Buolamwini refers to this as “checking each other’s blind spots”.
  3. Create ethical guidelines that acknowledge the reality of algorithmic bias and deal with it directly, to be followed by all stakeholders involved in developing, exploiting and using AI technology. The European Union has recently published its Draft Ethics Guidelines for Trustworthy AI, a pretty good example of responsible “regulation” (these guidelines are not legally binding) in the form of soft law. The panel of experts that drafted them is a diverse group of professionals from a variety of backgrounds (information about the origins of each expert is not readily available, but 22 of the 52 experts are women in the European tech field – almost half!). This is a good start for Europe to leave its mark on the development of AI (still dominated by the US and China). The guidelines specifically mention the risk of discrimination that AI may entail and the need for diversity in design.

Interested in algorithmic bias? Please leave your comments!


Vacuuming for equality: technology and unpaid work

I’m home for the weekend and I ran into Manolo.

Manolo is not a family member or a pet… he is the robot vacuum cleaner my parents bought about a year ago in order to alleviate some tension in the home around cleaning.

Manolo is programmed to operate for about 2 hours a day and he goes about the house autonomously sweeping up dust and other debris. He does seem to have alleviated some tension and one big aspect of cleaning has been taken off of my parents’ plate.

Now, Manolo and robot vacuum cleaners (“robovacs”) in general are not enormous works of technology. They’ve been around for years (the first model came onto the market in 1996), but they have become much more intelligent and autonomous since then, and a lot more present in homes due to dropping prices (though they are still relatively pricey, with models ranging from about 150 € for the most basic to around 500 € for the most high-tech ($170–$570)).

Models available now generally use sensors to avoid crashing into walls or falling down stairs, and Manolo is generally pretty independent except when he gets stuck under furniture and someone has to come get him. He’s pretty quiet and does his job adequately enough that sweeping floors and manual vacuuming are no longer necessary at home, though obviously he cannot get to all areas (stairs and under very low furniture, for example).

Manolo in his charging station

It may seem like vacuum cleaners have nothing to do with gender and gender roles, but the disproportionate amount of time spent by women, as compared to men, on unpaid domestic labor is still staggering today, even in wealthy Western countries.

You may be surprised to find out that women on average spend about twice as many hours per day as men on household work, which includes cleaning and caring for other people (generally children and older people), among other unpaid activities (cooking, errands, grocery shopping, etc.).

The OECD has a gender portal where statistics for all member states (since these statistics were compiled, 8 more countries have joined the OECD) and other non-member countries such as China and India can be compared. Updated statistics can be found here.


According to the 2015 data, women in member countries spend on average 4 hours and 24 minutes per day on unpaid work, whereas men spend about half that time (2 hours and 15 minutes). This not only shapes perceptions of gender roles, whereby women are the ones expected to bear the brunt of the housework; it also means women have less time to work, study, enjoy leisure (or even sleep!) and are thus more economically vulnerable. It also attests to the rise of the double burden borne by women, which has been linked to increased anxiety and stress levels.

The OECD is an organization made up of generally wealthy nations – nations where women have already entered the workforce. Women work a full day in the labor market and come home to work several more hours at home. Though women have increased their paid work time, they have not achieved a corresponding reduction in their unpaid work hours. Nor have men increased their share of unpaid work at the same rate that women have increased their share of paid work. This data is further backed up by the 2015 Human Development Report published by the United Nations Development Programme.

According to this report:

“Of the 59 percent of work that is paid, mostly outside the home, men’s share is nearly twice that of women—38 percent versus 21 percent. The picture is reversed for unpaid work, mostly within the home and encompassing a range of care responsibilities: of the 41 percent of work that is unpaid, women perform three times more than men—31 percent versus 10 percent. Hence the imbalance—men dominate the world of paid work, women that of unpaid work. Unpaid work in the home is indispensable to the functioning of society and human well-being: yet when it falls primarily to women, it limits their choices and opportunities for other activities that could be more fulfilling to them.”
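The report’s percentages are internally consistent, which is worth checking in a few lines (every number below is taken directly from the quote above):

```python
# Shares of total work hours, from the 2015 Human Development Report quote.
paid = {"men": 38, "women": 21}     # paid work is 59 % of all work
unpaid = {"men": 10, "women": 31}   # unpaid work is 41 % of all work

# The two categories partition all work hours.
assert sum(paid.values()) == 59 and sum(unpaid.values()) == 41
assert sum(paid.values()) + sum(unpaid.values()) == 100

paid_ratio = paid["men"] / paid["women"]        # ~1.8, "nearly twice"
unpaid_ratio = unpaid["women"] / unpaid["men"]  # 3.1, "three times more"
```

The mirror-image structure is what the report calls the imbalance: the paid/unpaid split flips almost exactly when you swap men and women.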

Autonomous technology, such as robotic vacuum cleaners, can alleviate those burdens placed on women by reducing the number of tasks that need to be carried out by humans, and is thus a positive influence on general work-life balance. However, its impact is limited. It cannot address the sexist biases and perceptions that underpin this data and perpetuate these tendencies: there will always be housework and caregiving work to be done by humans, and as long as these perceptions subsist, it will fall more heavily on women.

The advent of robotic housecleaning devices (robot vacuum cleaners, self-cleaning litter boxes, air purifiers, robot mops, robot window cleaners) reminds me of when the washing machine became a household item. Before then, housewives hand-washed linens and clothes. In my village in the region of Burgos, Spain, women used to have to go to the public fountain and washing areas in order to fetch water and wash clothes.

Historical picture of women washing clothes in Poza de la Sal, Burgos. Source. 
These public fountains and washing areas are no longer used but have been conserved. Source.

Washing machines, which became widely used in the US after WWII but took a little longer to come to Spain, turned an arduous task into something simple, quick and easy. Ads for washing machines around that time generally targeted women.

Washing machines and other home appliances that became prevalent in developed countries during the second half of the 20th century allowed women to gradually detach themselves from the home and pursue other, more fulfilling activities. But they did not change the fact that the burden of operating those machines still falls mostly on women, and in many less-developed countries where these machines remain inaccessible, women are still primarily charged with caregiving and housework.

Autonomous technology as applied to house-cleaning devices will allow humans to forget about many tasks entirely, but it will not change tendencies and perceptions regarding gender roles for the housework that remains. An educational and societal effort is still needed to actively combat those gender roles.

Technology is not a panacea, but it is a good starting point for getting rid of menial tasks that no one likes to do, and it can also raise questions that get people thinking about the gender bias that still exists in our day-to-day lives. This is the starting point for both men and women to question the reasons behind everyday activities and work towards a better distribution of unpaid work.

You can start by asking yourselves what patterns you experienced in your own home with your parents. Were housework and caregiving responsibilities spread equally between both parents? If you live with a partner, how do you distribute these responsibilities? Is the distribution fair? What are the reasons behind hiring other people (who are generally still women) to clean your house or care for your children? What are the reasons behind investing in expensive technological devices? As a woman, do you feel pressure (internal and/or external) to take on more responsibilities around the house? If not, why? If yes, why? As a man, are you sensitive to these kinds of tendencies? Are you assertive about a fair distribution of housework?

If you would like to share your own experience or thoughts about these issues leave a comment or write me an email!

Double standard toil and trouble!

So I wasn’t initially going to write about this topic, but after such a strong wave of encouragement from all of you, and after several people wrote to bring a certain event from last week to my attention, I thought I might as well start out strong… and talk about vibrators.

Now, this isn’t your typical vibrator. It’s called the Osé Pleasure Wand, it was created by the Lora DiCarlo company together with a team of experts from Oregon State University, and it is a big deal in terms of micro-robotic technology.

According to the Lora DiCarlo website, this gadget, set to launch in Fall 2019 at a retail price of around $250, is “the only product designed for hands-free blended orgasms. Using advanced micro-robotics it mimics all of the sensations of a human mouth, tongue, and fingers, for an experience that feels just like a real partner. No need for buzzing, desensitizing vibrations. It even flexes and adapts to your body for a personal fit that hits all the right spots – because there are better uses for your hands.”

Such a big deal, in fact, that it won an Innovation Award at the 2019 Consumer Electronics Show (CES), an annual trade show organized by the Consumer Technology Association (CTA) and held each January in Las Vegas… and then lost it three weeks later.

This show is a big deal in the tech world, and it’s been going on since the 1960s, evolving from pocket radios and TVs with integrated circuits to today’s line-up, where the highlights of this year’s show and fellow Innovation Award winners (the female sex toy aside) included a Bluetooth waterproof speaker bottle and a Harry Potter coding kit complete with a wireless wand. You can see all the winners here.

The Osé won its prize for showing “outstanding design and engineering” and then lost it, along with the right to showcase the product at the show and attract big investment, because the CTA deemed it “immoral, obscene, indecent, profane or not in keeping with CTA’s image”.

Now, not only is it absurd to deem a sex toy for women indecent in 2019 at a show dedicated to consumer technology… it also highlights a very serious double standard: the 2019 CES showed moral reprobation of a high-tech “vibrator” but seemed to have no problem with Solana, a robot sex doll for men created by Abyss Creations and showcased this year.

Solana is a modular head that can be attached to a doll body, and you can peel off her face and change it whenever you get bored. Swap her face for the peelable face of her sister doll, Harmony, and you change her personality. Solana tells jokes, speaks with an accent, and lets you control her body functions, personality and speech from an app on your phone. How handy! Solana really is the living embodiment of female empowerment.

Imagine if its face fell off while you were having sex with it.

You can see the whole video here.

However, this post is not about vibrators or sex dolls, or what should be considered a good consumer product, or even what should be allowed on the floor of a trade show. This post is about the blatant double standards that still plague the electronics industry.

Is there something to be said about the fact that the administrators of the show considered the Osé indecent while allowing a fake robot woman that potential male consumers can control through their iPhones, as if it were any other consumer product?

Not to mention the scantily clad female models, or “booth babes”, that many of the companies presenting at CES hire to attract people to their booths (photo by Edward C. Baig of USA Today). For completeness, there is also a virtual-reality porn booth.

So it’s not indecent to target investors (many of them men) by feeding into unrealistic and seemingly unhealthy sexual fantasies (be it with real women, robot women or virtual women) but it is indecent to promote female entrepreneurship and products related to female pleasure?

Lora Haddock, Lora DiCarlo’s Founder and CEO, wrote an open letter about the gender bias that still exists at CES which I advise you to read. She makes a valid point:

“This double standard makes it clear that women’s sexuality is not worthy of innovation. By excluding female-focused Sex Tech, CES and CTA are essentially saying that women’s sexuality and sexual health is not worthy of innovation. Dismissing an innovation in micro-robotics and biomimicry because the technology is in a pleasure product makes a strong statement. It seems the CTA is just fine with “female-oriented” products like breast pumps, Kegel exercisers, and even robotic vacuums – things that also benefit someone else – but something that squarely focuses on women’s sexuality is off the table.”

Not to mention that another winner of the 2019 Innovation Awards was a wearable band made by Owlet for expectant mothers to put around their bellies while they sleep, allowing them to track the baby’s wellbeing from inside the womb. That’s great too, but why can’t we have both?

Photo from the CES Innovation Awards site

This is not the first time that CES has shown gender bias, nor will it probably be the last. But the outrage that this particular event has sparked shows that something is beginning to change. By normalizing female sex toys and rewarding the femtech behind them, taboos regarding women and sexuality can be broken down and female-focused engineering encouraged.

CES is starting to feel the heat, I wonder if it will eventually learn from the burn…

For more information on the ridiculous gender bias displayed at the CES on this occasion and others please read this fantastic article by Valeriya Safronova in the New York Times. 

Tune in next week for more!

Femtech: why this blog exists

A couple of months ago, an idea started to stir in my head about how to join two of my most passionate centers of interest at the moment: women’s rights and status and emerging technologies.

Since there is no better time to start a project than on the back of a New Year’s resolution, here is the start of a blog where I can talk about subjects that pique my interest within these two large topics and, especially, how I see new technologies both threatening and helping the fight for greater equality between men and women.

I decided to name the blog The Fourth Wave because I find it fitting for a moment in history when two seemingly unrelated forces are starting to develop. On the one hand, the Fourth Wave of feminism, which started a couple of years ago, represents a resurgence in concern for the empowerment of women in a society that still harbors sexist attitudes, however subtle and inadvertent. On the other hand, the advance of technologies such as artificial intelligence, robotics, machine learning and the Internet of Things brings a whole new set of challenges – and opportunities – for society to tackle some big social problems. This new wave of technology is also called by many the Fourth Industrial Revolution.

During my time researching and reading about both these topics, I found that they were – or could be – related… and that not enough attention was being paid to how the advances in technology could affect women (which is not surprising, as most of the people behind these advances are men)… which brings me to the title of the article.

When I came up with the term “Femtech” I proudly thought I had coined a new concept, only to discover through Wikipedia that the term “Femtech” already exists: “Femtech (or female technology) is a term applied to a category of software, diagnostics, products, and services that use technology often to focus on women’s health. This sector includes fertility solutions, period-tracking apps, pregnancy and nursing care, women’s sexual wellness, and reproductive system health care.”

This is Ida Tin, a Danish entrepreneur who coined the term “femtech” and is the co-founder and CEO of the menstruation- and fertility-tracking app Clue. Though most of us (myself included) had never heard of her, she was voted 2015 Female Web Entrepreneur of the Year at the Slush conference.


After overcoming the initial disappointment of not having coined the next big keyword in technology, I became even more disappointed at how such a seemingly large concept has been reduced to an industry concerned with women’s health and reproductive systems. It’s great that digital and standard tools aimed at women’s health are being designed by and for women, but why are the opportunities technology offers women limited to our bodies, while technology applied to other industries, such as finance, has undergone amazing developments in only a couple of years?

My vision of femtech is an all-encompassing one: how technology can serve women and help them achieve real social and economic equality. That is what I hope to see over the next several years. Though I don’t envision changing the world through my blog, I do wish to share and debate with readers how women can better participate in, and benefit from, the vast changes technology can bring about. Subjects that interest me include how automation will specifically affect women, sexual harassment and the lack of diversity in the tech world, largely unknown female superstars of tech such as Ida Tin (sorry, Jeff Bezoses, Elon Musks and Mark Zuckerbergs of the world), the gender dynamics of high-tech regulation (I am a lawyer nerd, after all) and robots and the sex industry.

Hoping this little sneak peek has left you wanting more… please come back and visit once a week for updates, and leave comments, suggestions and feedback!