I’ve started using ChatGPT as my work “wife.” Just think about the way powerful and creative men have historically used their female spouses to do all the mundane, behind-the-scenes labour that would otherwise slow them down.
He stands on her back to lift himself up and shine, unimpeded by drudgery.
I use ChatGPT in a strikingly similar way – not for creative ideas, but to handle time-consuming, brain-draining tasks that free me up to think and write more clearly.
From shortening paragraphs to generating citations or even helping me find a word lost in perimenopausal brain fog at 3am — it’s the kind of unpaid labour that has always quietly supported ‘great’ work.
As a sole parent juggling multiple jobs and a mortgage, I don’t want a co-founder, I want a secretary. And I’ve found one in the shape of a bot.
But this convenience comes at a cost. We’ll come back to that in a tick. First, though, a step back.
Why do so many women need a “wife”?
Back in 2016, I was the married mum of two little kids, carrying the bulk of the mental and domestic load, as well as working full time as a freelancer.
There’s a list saved in my phone from this period which cites all the family-related jobs that fell to me. A few days ago, I popped the list – featuring about 60 items – into ChatGPT and asked it to calculate how many hours a week these tasks would take to complete.
Within seconds, ChatGPT provided detailed time breakdowns. The answer? 80-85 hours a week. Whoa!
The bot told me: “If one parent is doing most of this, it’s the equivalent of two full-time jobs … This is a conservative estimate, assuming efficiency.”
Then, the AI tool added my weekly paid work hours as a freelancer to this domestic burden and concluded: “That’s the equivalent of working 17-18 hours per day, 7 days a week, with no real time off.”
No wonder women in heterosexual relationships are tired. HILDA data shows: “Australian women still undertake the majority of housework, whereas men’s share of housework has remained constant over 20 years.”
I’m not married anymore but still have a lot on my plate. As kids get bigger, so do their problems. And after numerous interest-rate rises and the skyrocketing cost of living, making ends meet is a challenge.
This is why I have started using ChatGPT as a kind of slave. My reasoning? After millennia of human history through which women have persistently been cast as the domestic doormats of their families, this is a feminist ‘correction.’
I’m thinking here of Anna Funder’s unputdownable book Wifedom, where English novelist George Orwell’s wife Eileen quietly does much of the domestic, secretarial and even intellectual work that fuels his career. Despite this, she’s effectively invisible (and dies a miserable death).
This is not the vision I have for my life. And probably, not one you have either.
Having this tech assistant helps me shed the drudge work and frees up time to do the creative tasks which matter.
But we can’t be too hasty. AI tools like ChatGPT may seem like a panacea; they are not.
Spelling out the AI dumpster fire
My own book, Troll Hunting – alongside thousands of other books – was scraped without permission to train both OpenAI’s and Meta’s large language models. So, I’m effectively using tools built from the theft of my own intellectual property, which raises serious ethical questions.
Like mine, the books of author and award-winning journalist Tracey Spicer were plundered by big tech. Somewhat paradoxically, one of these stolen works is about AI. It’s called Man-Made: How the bias of the past is being built into the future.
Reflecting on using the very tools which were created from our stolen work, Spicer says: “In a perfect world, we’d be boycotting these ethically bereft companies. However, we live in an imperfect world.
“Current and future roles will require everyone to be adept at using AI tools. If we’re left behind in this fourth ‘Industrial Revolution,’ there’ll be even fewer women – and older people – in the workforce. We don’t want to go backwards.”
(For her part, Tracey favours the AI tool Claude, which she says operates in a more ethical way than ChatGPT.)

Is AI really offering a feminist levelling of the domestic load?
Dr Matt Beard is a philosopher at the Cranlana Centre for Ethical Leadership.
He agrees that having many people (mainly women) burning the bulk of their energy on mundane tasks isn’t useful.
However, he points to deeper issues that arise when AI tools are positioned as the solution. Dr Beard suggests that if time is freed up by AI, it’s “…likely that some other domestic task will arise that fills in the time…”
He encapsulates it this way: “AI that makes existing systems more efficient may not be the solution when those systems are broken at their core.”
Deception and trickery
The ethical issues don’t end there. Are we tricking people if we use AI and don’t disclose it? Here, experts have differing views.
Tracey Spicer’s answer is a resounding, “Yes!”
“The EU’s AI Act requires providers of AI systems to mark their output as AI-generated. I strongly contend that we should all be transparent about where and how we use AI, during this time of increasing mistrust in the media.”
For Dr Beard, this issue is not as clear cut. “If you do use AI and you don’t disclose it, it might give the appearance of hiding something. However, if people think that you’re using AI to write for you, then disclosing your use might also create a dip in trust.”
Well, that’s clear as mud. Hold up, though. Dr Beard has more to say. He wants us to think about “…whether the use of AI is substantively different to the way you’ve written in the past.”
“I don’t know many writers who disclose that they asked another author to proof their article before submitting it; or who used spell check.
“If your use of AI is analogous to these practices, I’m not sure that disclosure is necessary. But if you’re essentially co-authoring with AI, then I think it’s worth considering how you make that known in a way that’s accurate and honest,” Dr Beard says.
The environmental cost
The way journalist and broadcaster Julia Baird explains it, each little query we ask of ChatGPT uses up litres of water.
Spicer shares the same concern: “Effectively, you’re drinking a plastic bottle of water every time you use ChatGPT to write an email. My advice is to learn more about the environmental impacts and use these tools only when necessary.”
Sure, there may be benefits to individuals who use AI. But Dr Beard suggests we look below those surface benefits.
“We can’t ignore the growing body of evidence that those benefits are purchased at significant moral cost – including to intellectual property rights of creative industries (including yours – and mine), additional pressure onto ecosystems that are already under unbearable strain, and the exploitative wage conditions for Kenyan workers who were hired to train AI to be less toxic.”
Dr Beard suggests we ask ourselves two key questions to help guide our personal AI use: “If you are doing something that might involve harm, is what you’re doing necessary? Are the harms that are caused proportionate to the benefits that are offered?”