We all know what it’s like to get frustrated with Siri. After repeatedly failing to get the device to play a specific song, we might give up and say ‘forget it’ or ‘enough’. Or, if we really lose our temper, we might even tell Siri to ‘shut up’.
But what’s the harm in snapping at a phone? After all, it’s not like we’re losing our temper with a real, human woman.
Well, some specialists have suggested that the tone, phrasing and set responses of female mobile phone voice assistants are inherently problematic.
Their passivity, complicity and politeness - it is argued - further gendered stereotypes, and differ markedly from what we see in male-voiced voice assistants.
The report criticising the use of female voice assistants for mobile phones
Your first question might be about the report’s title, ‘I’d Blush if I Could’.
Well, it comes from the response that Siri would give if the user says, “Hey Siri, you’re a b****.”
The report heavily criticised the submissiveness of Siri, particularly in contexts such as these. It stated that such a response, and the characteristic servility of female voice assistants “provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education.”
In the years since, this report has had a major influence on the way that we perceive mobile voice assistants, and question the impact of gendering them. There have also been numerous other papers and reports that further raise these issues.
Why have female mobile phone voice assistants been criticised?
Female voice assistants are very common; the leading examples from the mobile technology giants - Apple’s Siri, Amazon’s Alexa, Microsoft’s Cortana and Google’s Google Assistant - all launched with female voices.
But why did they all choose a female voice? When every leading industry giant makes the same choice, it can’t be just a coincidence.
Treating female voice assistants with derision is common. What’s worse, many critics have challenged the way these female voice assistants are designed to be subservient, providing humble, obedient responses even when the user is being aggressive or making sexist remarks.
This idea is reinforced further by the key differences between male and female voice assistants. It has been widely flagged that male voice assistants are typically used for warnings or alarms that strictly tell you to do something. Meanwhile, female voice assistants are largely used for assistive tools.
In short, female voice assistants tend to follow your requests, while male voice assistants tell you what to do.
Should mobile phone voice assistants be non-gendered?
The UNESCO publication stressed that the use of female voice assistants “reflects, reinforces and spreads gender bias”, “models acceptance of sexual harassment and verbal abuse” and “sends messages about how women and girls should respond to requests and express themselves”, among other concerns.
At the time, its suggestions were to:
- Stop making voice assistants female by default
- Explore the potential to develop a gender neutral voice assistant
- Improve the programmed responses to gender-based insults and abusive language
- Make it easier for users to change digital assistants
- Make it a requirement that operators of voice assistants announce the technology as non-human to the user, from the outset
Following these points, there are signs of progression. Firstly, there have been some developments in the way that female voice assistants respond to sexual harassment.
For instance, back in 2017, the journalist Leah Fessler tested how female voice assistants responded to sexually harassing remarks from users. The most notable issues included Alexa responding to “You're hot" with, "That's nice of you to say". Or, when Cortana was told "You're a s***", it provided the user with an online article titled 30 signs you're a s***.
Then in 2020, these responses were tested again, by the Brookings Institution. The results showed that female voice assistants were now adopting less passive responses to harassing user behaviour.
But, there’s still a long way to go. For instance, when you Google search ‘Google assistant’, one of the top results under ‘People also ask’ is, ‘How do I turn this f * * * * * * Google Assistant off?’.
It is also now less common for female voice assistants to be the default option. For instance, a female voice is no longer the pre-selected default option for Apple’s voice assistant Siri.
Having said that, three years later, a number of UNESCO's proposed changes have still not been actioned. Most notably, the industry at large is still hesitant to introduce a gender neutral, or declared non-human voice assistant for our mobiles.
But is there scope for this in the future? For instance, in 2019, GenderLess Voice revealed Q, the first genderless mobile voice assistant. The company states that a genderless voice “would not only reflect the diversity of our world, but also reduce the gender bias.”
“Our goal is to break the mindset, where female voice is generally preferred for assistive tasks and male voice for commanding tasks.”
The questions that need to be asked are, would making these changes reduce sexist perceptions of a ‘woman’s role’? And, if female voice assistants are used, what needs to be done to ensure their treatment does not parallel outdated gender stereotypes?