The study was commissioned by UniBank to deepen understanding of how artificial intelligence (AI) influences the likelihood of women being hired for finance industry roles.
The findings revealed that gender bias enters the recruitment process at several points: the main causes include gender-skewed datasets, correlation-driven judgements in algorithms, and bias in human decision-making.
The thirty-five-page report, titled ‘Ethical Implications of AI bias as a Result of Workforce Gender Imbalance’, found there was “something distinct about the men’s CVs that made our panel rank them higher, beyond experience, qualification and education.”
How did the study work?
Forty Masters, PhD and other postgraduate students with experience in hiring acted as recruiters. They were given real-life CVs from applicants for jobs at a bank.
The applicants were applying for three different roles: data analyst, which is male-dominated; finance officer, which is gender-balanced; and recruitment officer, which is female-dominated.
Half of the recruiters were given the CVs with the applicants’ genders reversed; the other half were given CVs with genders unchanged. The recruiters then had to rank the candidates and list the top and bottom three CVs for each position. The recruiters told the researchers they judged the applicants on relevant experience, education and keywords such as “bachelor’s” and “human resources”.
What were the results?
The researchers found that, on average, the recruiters ranked female candidates four places lower than male candidates for the finance officer role. Female candidates were also placed two and a half places lower than male candidates for the data analyst position, despite the CVs being otherwise identical.
In fact, male candidates were ranked higher for all three positions, the clearest evidence of unintentional bias in human decision-making. There was minimal difference between male and female recruiters; both were likelier to rank men’s CVs higher than women’s.
Mike Lanzing, UniBank’s General Manager, said in a statement, “As the use of artificial intelligence becomes more common, it’s important that we understand how our existing biases are feeding into supposedly impartial models.”
“We need to take care that we are not reversing decades of progress toward women’s financial independence and security by baking in old attitudes about the sort of work women are suited to,” Lanzing said. “UniBank is committed to ensuring that women have an equal stake in the future and these findings are a reminder that we have to take active steps to achieve that goal.”
The researchers used a regression model to control for variations in the CVs, showing that a candidate’s gender was likely the most critical factor in determining their likelihood of getting a job.
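The report does not publish its model specification; as an illustration only, a control-variable regression of this kind can be sketched with ordinary least squares on simulated data (every variable, coefficient and data point below is hypothetical, not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical CV features; the study's real variables are not published.
experience = rng.integers(0, 15, n).astype(float)   # years of experience
education = rng.integers(1, 4, n).astype(float)     # 1=bachelor's ... 3=PhD
is_female = rng.integers(0, 2, n).astype(float)     # gender indicator

# Simulated ranking score with a built-in 4-point penalty for women.
score = 2.0 * experience + 3.0 * education - 4.0 * is_female + rng.normal(0.0, 1.0, n)

# Ordinary least squares: score ~ intercept + experience + education + gender.
X = np.column_stack([np.ones(n), experience, education, is_female])
coef, *_ = np.linalg.lstsq(X, score, rcond=None)

# With experience and education controlled for, the gender coefficient
# isolates the ranking penalty attached to gender itself.
print(f"gender coefficient after controls: {coef[3]:.2f}")
```

Because the other CV features are held constant in the model, a non-zero gender coefficient mirrors the kind of result the researchers describe: the penalty cannot be explained by experience or education.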
Associate Professor Leah Ruppanner, a researcher from the University of Melbourne, said the research was timely: “…it gives us basis to accurately explore how CVs are judged by human panels.”
“Computers don’t ask why,” she told the ABC. “The onus is on us to understand the subconscious bias behind job hiring decisions, before we start embedding these problematic preferences into artificial intelligence algorithms.”
The research also found that machine ranking algorithms did not share the subconscious bias exhibited by humans; instead, they offered more impartial, independent evaluations of the candidates.
Despite the study’s small sample, the ranking algorithm detected that the men had more relevant experience than the women, though the women matched the keyword requirements more closely. Associate Professor Ruppanner believes these small differences have a big impact on the algorithm.
“We know that women have less experience because they take time [off work] for care giving, and that algorithm is going to bump men up and women down based on experience,” she said. “The algorithm reinforces and amplifies unconscious gender bias in recruiting.”
Associate Professor Ruppanner added that algorithms must be used carefully to avoid skewed results, and that these problems are exacerbated when large recruiting websites do not disclose how their algorithms work.
“The algorithm isn’t thinking about experience, it’s just finding associations,” she said. “You have to say to it, ‘Don’t penalise women for parental leave.’ It has to be coded in.”
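Neither Ruppanner nor the report describes how such a rule would actually be implemented; one minimal, hypothetical way of “coding it in” is to credit documented parental leave back into the experience feature before a ranker ever sees it (the `Candidate` fields and numbers below are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    years_experience: float
    years_parental_leave: float  # documented career break


def effective_experience(c: Candidate) -> float:
    """Credit parental leave back so a ranker does not penalise the gap."""
    return c.years_experience + c.years_parental_leave


a = Candidate("A", 8.0, 0.0)
b = Candidate("B", 6.0, 2.0)  # same career span, two years of leave
print(effective_experience(a), effective_experience(b))  # 8.0 8.0
```

Under this adjustment, two candidates with the same career span rank identically on experience regardless of time taken for caregiving, which is the behaviour Ruppanner argues has to be made explicit.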
A spokesperson for SEEK, one of Australia’s largest recruiting websites, said its sustainability report addresses the issue of bias in recruitment. The report describes how artificial intelligence can bring both risks and benefits to the recruitment process.
“Potential exists for artificial intelligence to detect and embed discriminatory bias in human behaviour,” the report notes. “Conversely, there is opportunity to remove explicit bias signals in data to generate more equitable outcomes. One example of this is the removal of names (which can often infer someone’s ethnicity or gender) from resumes before they are used in models generating artificial intelligence.”
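SEEK’s report does not detail how this name-removal step works in its systems; a minimal sketch of the idea (the function, name list and sample text below are all illustrative) might look like:

```python
import re


def redact_names(resume_text: str, known_names: list[str]) -> str:
    """Strip applicant names, which can imply gender or ethnicity,
    before the text reaches a ranking model."""
    for name in known_names:
        resume_text = re.sub(re.escape(name), "[REDACTED]",
                             resume_text, flags=re.IGNORECASE)
    return resume_text


resume = "Maria Gonzalez has 6 years of data-analysis experience. Maria led the BI team."
print(redact_names(resume, ["Maria Gonzalez", "Maria"]))
```

A production system would need more than exact matching (nicknames, initials, names embedded in email addresses), but the principle is the one the report describes: remove explicit gender signals from the data before any model is trained on it.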
The report recommended four key ways to reduce bias in these processes: training programs for human-resources professionals; regular audits of hiring by gender across all positions to identify roles that are vulnerable to gender discrimination; transparent hiring algorithms designed to reduce gender bias; and quota systems to ensure women are not excluded from male-dominated or gender-balanced professions because of hiring biases.
“This means there is something distinct about the men’s resumes that made our panel rank them higher, beyond experience, qualification and education,” the researchers explained. “This forms the most alarming dimension of gender bias, as we are not capturing what gives men the edge in these positions.”