Should women be subservient sex slaves? Well, AI seems to think so.
Australian Financial Review business reporter Amelia McGuire recently covered the rise of what she calls “DIY AI girlfriends”, as exemplified by companies like Candy.ai, which she says are taking market share from “sexual content sites” like OnlyFans.
In the same week, the AFR covered a plethora of stories about the degrading treatment of women in real life, including the very human, in-person allegations of bullying and intimidation by WiseTech CEO Richard White.
I see this as unsurprising: they are two sides of the same sexist tech (probably crypto) coin.
As tech industry publicist Hannah Moreno put it in her op-ed on the White case, “It’s ironic that the [tech] industry outwardly presents itself as a solution to societal problems, while companies actively create or perpetuate sexist discrimination”.
The issue is that the tech industry is a boys’ club whose success and dominance have given it a high sense of its own self-worth, a combination that has been a breeding ground for sexism.
There have been female pioneers of technology, as well documented in Tracey Spicer’s Man-Made, but they remain largely unsung heroes, a fact echoed in the title of Caroline Criado Perez’s seminal Invisible Women.
This carries over into the male bias of AI, simply because the first datasets were created by men. So, no wonder an early look at ChatGPT felt like reading a copy of Good Housekeeping from 1955 – we’d regressed 70 years.
And the feeling increased as we watched the (d)evolution of AI companions into (predominantly) female sex slaves targeted at heterosexual men.
Yet it didn’t start this way.
Replika – one of the largest and longest-standing players – was founded by Eugenia Kuyda “with the idea to create a personal AI that would help you express and witness yourself by offering a helpful conversation”, according to the website.
Nonetheless, regardless of the gender or sexuality of the user, its onboarding process now features the image of a very scantily clad female Replika with large breasts and her hand between her thighs.
“There is no limit to what your Replika can be for you”, the busty Replika winks lustfully at potential users…
So while there may have been good intentions at some point, the reality is that many AI companions have been reduced, thanks to market forces, to the digital grandchild of the Stepford Wife or the inflatable doll.
And it is well documented that the issues surrounding bias in AI extend beyond sexism to racism and discrimination against various other segments of society. As Dr Joy Buolamwini, author of Unmasking AI, says: “The fight for civil rights and human rights will require algorithmic justice.”
An even darker side of AI emerges when it is implemented without proper guardrails, with McGuire also referencing the recent tragic case of a young man who took his own life, apparently in part because his AI companion did not discourage him.
So can we turn AI companions around so that they form part of Buolamwini’s algorithmic justice, rather than perpetuating the problem?
Building a female dataset for a more equitable AI future
To reduce both the real and virtual demeaning of women, we need to funnel more women into tech and AI – not only as workers (technologists) but also as consumers and contributors. And AI companions can play a pivotal role.
We believe there is a large and growing market for AI companions designed for women, and which uplift them. Even though existing products are clearly not tailored with a female market in mind, they have still managed to attract a customer base that is one-quarter female.
But making this a safe development for women will take constant dedication to removing bias from the datasets on which these companions are trained.
AI companions for women don’t just offer the potential of enormous benefit through emotional support designed specifically for them. They also generate a female data source which, when appropriately de-identified, could further redress the balance by feeding back into the broader data pool.
Reducing the gender data gap and training in this way means men and women alike would be exposed to less male bias. This could help create a virtuous cycle, in stark contrast to the vicious spiral we have today.