Siri, Alexa and Google Assistant all work to reinforce the stereotype that women are “obliging, docile and eager-to-please helpers” who are “available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’”, states a recently released UNESCO report.
These gadgets, now ubiquitous in homes around Australia and the world, are frequently referred to as “she” or “her” and, once summoned, will always respond.
They’re also engineered not to push back against abuse, which reaffirms the idea that women are “subservient and tolerant of poor treatment,” the report states.
Despite several adjustments to voice assistant technology (Amazon’s Alexa, for instance, offers a choice of accents), the voices on offer are all female.
The report argues that in a world awash with rapid digital change, the teams in charge of AI technology have a responsibility to ensure it is gender-balanced, even if that challenges the status quo.
But one of the most critical concerns is the stark shortage of women in these roles.
Today, women make up just 12 percent of AI researchers and just 6 percent of software developers, and they are 13 times less likely than men to file an ICT (information, communication and technology) patent.
For as long as women are left out of the industry, gender biases like these will continue to be coded into technology products. The report stresses the importance of engaging girls in tech from an early age, so that more women are in the “room” to help engineer products like digital voice assistants and put the kibosh on sexism.