Cambridge University report shows how factors like lighting can misrepresent personality traits.

October 20, 2022

2 Min Read

AI-powered human resources tools that claim to eliminate bias can actually contribute to discrimination, researchers have discovered.

Scientists from Cambridge University’s Centre for Gender Studies built an AI tool modeled on the 'technosolutionist' software marketed to HR teams for analyzing job candidates. Such systems monitor a candidate’s speech patterns, vocabulary and facial expressions to score so-called ‘culture fit’ and personality traits.

The researchers’ Personality Machine showed how small changes to background, clothing, facial expressions and lighting could generate very different personality reports.

“All too often, the hiring process is oblique and confusing. We want to give people a visceral demonstration of the sorts of judgements that are now being made about them automatically,” said Euan Ong, a student developer who worked on the study.

They found the tools amounted to “automated pseudoscience,” akin to physiognomy or phrenology, practices that lead to dangerous stereotypes. The results also showed that the pool of job candidates became more homogeneous, not more diverse.

The datasets used to build these algorithms are often based on earlier hiring data, which can lead to new hires who look like the existing workforce. And since AI recruitment tools are proprietary, it’s unclear how they’re developed.

Candidate pool became less diverse

The researchers said companies use AI to evaluate candidate videos by analyzing regions of the face. Scoring is based on the ‘Big Five’ personality traits: agreeableness, openness, conscientiousness, neuroticism and extroversion.

“These tools are trained to predict personality based on common patterns in images of people they’ve previously seen, and often end up finding spurious correlations between personality and apparently unrelated properties of the image, like brightness,” Ong said. 
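
To make that point concrete, here is a minimal, hypothetical sketch, not the researchers’ Personality Machine and not any vendor’s code, of how such a spurious correlation can arise: a toy regression model trained on crude image features, including overall brightness, ends up giving the same face different ‘extroversion’ scores under different lighting. The feature choices, synthetic data and scikit-learn model are all illustrative assumptions.

```python
# Illustrative sketch only: a toy "personality" model trained on image
# statistics can latch onto overall brightness, so re-lighting the same
# face shifts its predicted trait score.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

def image_features(img):
    """Naive features a hypothetical screening tool might compute:
    overall brightness plus coarse averages of four image regions."""
    h, w = img.shape
    regions = [img[:h // 2, :w // 2], img[:h // 2, w // 2:],
               img[h // 2:, :w // 2], img[h // 2:, w // 2:]]
    return np.array([img.mean()] + [r.mean() for r in regions])

# Synthetic "training set": brighter photos happen to co-occur with higher
# labelled extroversion -- a spurious correlation in the data, not a signal.
features, labels = [], []
for _ in range(200):
    brightness = rng.uniform(0.2, 0.9)
    img = np.clip(rng.normal(brightness, 0.05, size=(64, 64)), 0, 1)
    features.append(image_features(img))
    labels.append(5 * brightness + rng.normal(0, 0.3))  # fake 0-5 trait score

model = Ridge().fit(np.array(features), np.array(labels))

# Same "candidate", two lighting setups: the score moves with the lamp.
face = np.clip(rng.normal(0.5, 0.05, size=(64, 64)), 0, 1)
dim, bright = face * 0.6, np.clip(face * 1.4, 0, 1)
print("extroversion, dim lighting:   ", model.predict([image_features(dim)])[0])
print("extroversion, bright lighting:", model.predict([image_features(bright)])[0])
```

The toy model never sees anything about personality, only pixel statistics, so whatever it “predicts” is a property of the image rather than of the person, which is the core of the researchers’ critique.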

As HR departments turn to AI to handle high volumes of job candidates, and as the pandemic drives companies to look to technology to solve labor shortages, the scientists wanted to see how well these AI tools actually work.

“This trend was already in place as the pandemic began, and the accelerated shift to online working caused by COVID-19 is likely to see greater deployment of AI tools by HR departments in future,” said Dr. Kerry Mackereth, the study’s co-author.

Scientists cautioned hiring executives not to rely on AI to diversify their workforce. The paper was published in Philosophy & Technology.

AI-powered HR tools like the ones examined by the Cambridge University researchers would be considered high-risk under the EU’s prospective laws on AI. Those developing AI-based recruitment tools would be required to conduct risk assessments, include “appropriate” human oversight measures, maintain high levels of security and use quality datasets.
