A new report has found that women, already disproportionately affected by job losses during the pandemic, face further disadvantage from job hiring algorithms and human bias.
The research from the University of Melbourne, commissioned by UniBank, has found that subconscious gender biases greatly influence decisions by human recruiters and are exacerbated in algorithmic sorting models for CVs.
Commissioned to better understand how artificial intelligence (AI) affects the chances of women being hired into finance industry roles, the study discovered that bias enters the recruitment process at a number of different points. The research report explains that gender-skewed datasets, correlational bias judgements in algorithms and human decision-making are key drivers of amplified bias.
Alarmingly, the human recruiting panels, made up of Master's and PhD students with experience in hiring, demonstrated the strongest examples of unintentional bias, consistently preferring CVs with male names over female equivalents.
Despite stating a desire to rank CVs by education, experience and keyword matches to job descriptions, the panel on average ranked women 4 places lower than men for a finance officer role and 2.5 places lower for a data analyst position, even when the substance of the CV was otherwise identical.
The report states that “there was something distinct about the men’s CVs that made our panel rank them higher, beyond experience, qualification and education.”
UniBank General Manager, Mike Lanzing, said, “As the use of artificial intelligence becomes more common, it’s important that we understand how our existing biases are feeding into supposedly impartial models.”
“We need to take care that we are not reversing decades of progress toward women’s financial independence and security by baking in old attitudes about the sort of work women are suited to. UniBank is committed to ensuring that women have an equal stake in the future and these findings are a reminder that we have to take active steps to achieve that goal,” Mr Lanzing said.
Researchers used regression analysis to control for differences between CVs and demonstrate that a candidate's gender was likely the most important factor in determining their ranking.
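The idea behind that analysis can be illustrated with a small sketch. The data below is entirely synthetic and the variable names (`experience`, `education`, `is_woman`) are illustrative assumptions, not the study's actual dataset or model; the point is only to show how a regression can isolate a gender effect once merit factors are controlled for.

```python
import numpy as np

# Hypothetical illustration (not the study's data): synthetic CV scores in
# which identical merit weighting applies to everyone, plus a penalty
# applied only to CVs bearing women's names.
rng = np.random.default_rng(0)
n = 200
experience = rng.integers(1, 15, n)   # years of experience
education = rng.integers(1, 4, n)     # 1 = Bachelor's, 2 = Master's, 3 = PhD
is_woman = rng.integers(0, 2, n)      # 1 if the CV carries a woman's name

score = 2.0 * experience + 3.0 * education - 4.0 * is_woman + rng.normal(0, 1, n)

# Ordinary least squares: score ~ intercept + experience + education + gender.
# Controlling for experience and education, the gender coefficient captures
# the penalty attached to women's names alone.
X = np.column_stack([np.ones(n), experience, education, is_woman])
coef, *_ = np.linalg.lstsq(X, score, rcond=None)

print(f"gender coefficient: {coef[3]:.2f}")  # strongly negative: women ranked lower
```

A large negative gender coefficient, after merit variables are accounted for, is the regression-based evidence of bias the researchers describe.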
Interestingly, the research found that basic machine ranking algorithms did not share this bias, offering fairer, independent appraisals of the candidates. A basic algorithm was also worse at predicting how the human panel would rank CVs with women's names, again indicating a subconscious human gender bias unrelated to expected metrics of success.
More sophisticated AIs enhance their models using publicly available material about the qualities of successful candidates, creating a "black box" that operates without transparency or human oversight. Because such a model learns from past, biased decisions, any amount of initial bias will be amplified through this feedback cycle.
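That feedback cycle can be sketched with a toy simulation. Everything here is a hypothetical assumption rather than the report's model: a small starting penalty against women's CVs, a hiring round that keeps the top half of candidates, and a crude rule that each new "generation" of the model learns a larger penalty because women are under-represented among past hires.

```python
import random

random.seed(1)

def select(candidates, bias):
    """Rank by merit minus a gender penalty; keep the top half."""
    ranked = sorted(
        candidates,
        key=lambda c: c["merit"] - bias * c["is_woman"],
        reverse=True,
    )
    return ranked[: len(ranked) // 2]

bias = 0.1  # small initial human bias baked into the training data
for generation in range(5):
    pool = [{"merit": random.random(), "is_woman": g % 2} for g in range(1000)]
    hired = select(pool, bias)
    women_share = sum(c["is_woman"] for c in hired) / len(hired)
    print(f"generation {generation}: share of women hired = {women_share:.2f}")
    # The next model infers "successful candidates look like past hires",
    # so under-representation feeds back as a larger learned penalty
    # (the 1.5 amplification factor is an arbitrary illustrative choice).
    bias *= 1.5
```

Even though candidate merit is identical across genders in every round, the share of women hired falls generation after generation, which is the amplification dynamic the report warns about.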
University of Melbourne researcher, Associate Professor Leah Ruppanner said, “The timing of this research is important as it gives us a basis to accurately explore how CVs are judged by human panels. Computers don’t ask why. The onus is on us to understand the subconscious bias behind job hiring decisions before we start embedding these problematic preferences into artificial intelligence algorithms.”
The report suggested a number of measures that could reduce bias in these processes, including training programs for human resources professionals and the creation of transparent hiring algorithms designed to reduce gender bias.