Are AI-powered voice assistants sexist?
The U.N. is worried about gender bias in AI systems
Humans and AI systems are constantly working together. However, this collaboration presents a new challenge for society, because human bias is seeping into the data. Some AI tools contain biased data, from racial to gender to ideological biases. One example is the sexism we can see in digital assistants.
Subtly, prejudices are also present in the technological products we use every day. In the case of voice assistants, for instance, they are feminine and project the image of a young woman with a submissive personality.
From their names Alexa (Amazon), Aura (Movistar), Siri (Apple) or Cortana (Microsoft), with the possible exception of the more neutral Google Home, to their female-sounding voices, everything suggests that a servile woman is attending to our requests.
What the U.N study reveals
At first, this trend may have gone unnoticed by many people, but its possible implications have drawn growing attention. According to UNESCO, the UN's agency for Education, Science and Culture, in a report produced by EQUALS, these voice assistants favor discrimination and perpetuate sexist gender stereotypes.
These assistants also respond to sexist insults in a polite, passive and even cordial manner. In this way, assistants from Alexa to Siri normalize and reinforce the subjugation and servitude of women.
Associating virtual assistants with female roles carries many risks: it helps keep alive the gap that exists between men and women, and it helps preserve certain harmful gender roles.
Image from EQUALS, 2019.
Why are the majority of virtual assistants women?
Firstly, it is understood that both men and women prefer to interact with female voices. While the male voice tends to be associated with authority, the female voice is perceived as more helpful.
Secondly, as revealed in the EQUALS report, fewer women than men work in the digital technology sector. As a result, the technical teams that develop AI tools are composed mostly of men, and women are under-represented in these teams.
Thirdly, the fact that virtual assistants are women by default reflects the discriminatory gender biases that still exist in our society, and proves how these biases can find their way into artificial intelligence solutions.
Image from EQUALS, 2019.
“Women represent less than one third of the total workforce and an even smaller proportion of employees in leadership roles”.
Will societal equality increase the number of women in the digital sector?
Despite expectations, recent studies show that increasing gender equality in society does not necessarily lead to greater participation of women in the digital sector, or to more women studying ICT in higher education.
So, the presence of digital gender gaps and biases points to a pressing need to encourage women to pursue ICT in higher education or to learn new digital skills, regardless of the level of equality existing in society.
Actions to avoid sexism in AI
Finally, we can improve this situation and avoid future biases by taking action. The EQUALS report presents up to 12 recommendations to prevent AI-based virtual assistants from continuing to perpetuate gender stereotypes. Some of the measures are to incorporate more women into the programming world, to stop making the feminine voice the default, and to continue investigating the prejudices that virtual assistants can reflect, so they can be eliminated rapidly.
Artificial Intelligence Solutions
If you are looking to implement an AI project at your company, at Koukio Solutions we can help. Our team of experts can help you develop your idea and turn it into reality.
Contact us to get professional advice on how AI can be implemented in your business.
Make the most of your investment.