Will AI Remove Hiring Bias?
AI solutions come with high expectations—and concerns—about the technology's ability to end hiring discrimination
If you've been following the latest hiring trends, you may have noticed that many recruiters are turning to artificial intelligence (AI) tools to tackle discrimination in hiring, and expectations for success are high.
However, HR technology analysts and even executives at companies offering AI solutions caution that a totally bias-free hiring process may be difficult to achieve.
How difficult? Amazon, the world's largest online retailer, found out the hard way. In 2015, the company discovered that a recruiting system it was building with machine-learning algorithms had begun to downgrade certain resumes that included words such as "women's club." By contrast, the system favored resumes containing verbs more common among male candidates, such as "executed" and "captured." The recruiting system was never rolled out companywide.
According to Sarah Brennan, CEO and principal strategist at Accelir, a Milwaukee-based HR technology consulting firm, the problem with Amazon's recruiting technology was that it was loaded with past resumes gleaned from a 10-year period in which men applied and were hired more often than women.
"How it was built meant 'men' equals 'good,' " Brennan said. "It was not a clean algorithm to learn from. Amazon tried building this internally using biased data to start with, instead of partnering with a recruiting technology [company] that specializes in and has run validation studies on their programming to avoid the system simply repeating bad habits."
AI: A Diversity Solution?
A study from McKinsey & Company found that companies that have a diverse workforce financially outperform companies that don't.
Discrimination in hiring, however, is proving difficult to reverse. A recent study found that hiring discrimination against black workers in the U.S. has not declined in the last quarter century.
A 2016 National Bureau of Economic Research study showed that while women made up more than 50 percent of white-collar workers, they represented only 4.6 percent of executives.
According to Madeline Laurano, co-founder of Aptitude Research Partners, a Boston-based research and advisory firm, AI's ability to successfully eradicate bias in hiring will depend on whether:
- Companies use the right AI solutions consistently throughout the organization and the entire recruitment process.
- Organizations move toward educating recruiting teams and hiring managers on how to use AI solutions to drive diversity goals.
A recent survey from management consulting firm Korn Ferry found that of the 770 talent acquisition professionals polled, 63 percent said AI has changed the way recruiting is done at their organization, and 69 percent said using AI as a sourcing tool finds higher-quality candidates.

One of the vendors creating those AI tools is Pymetrics, a New York City-based company that builds machine-learning algorithms. Pymetrics' algorithm-building process includes having clients' top performers complete games that test for traits such as short-term memory, planning and responses to job-related tasks.
For employers that use it, Pymetrics is the first step in the application process. If candidates are applying for a senior sales manager job, for example, they'll be invited to play the neuroscience games, and their results will be analyzed by bias-tested algorithms built from the client's top performers in that role. Pymetrics then provides a recommendation to the employer about the applicant's predicted fit for the role.
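For readers curious about the mechanics, here is a minimal sketch of the general approach the article describes: fit a model on game-derived trait scores from a client's top performers, then score a new applicant against it. Pymetrics' actual features, models and training data are proprietary, so every name and number below is purely illustrative.

```python
# Illustrative only: not Pymetrics' actual method or data.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical game-derived trait scores (0-1 scale).
# Columns: short-term memory, planning, task response.
employee_traits = np.array([
    [0.91, 0.85, 0.78],  # top performer in the role
    [0.88, 0.90, 0.83],  # top performer in the role
    [0.45, 0.52, 0.40],  # baseline employee
    [0.50, 0.41, 0.55],  # baseline employee
])
is_top_performer = np.array([1, 1, 0, 0])

# Fit a simple classifier that separates top performers from the baseline.
model = LogisticRegression().fit(employee_traits, is_top_performer)

# Score a new applicant's game results against the learned profile.
applicant = np.array([[0.86, 0.79, 0.81]])
fit_score = model.predict_proba(applicant)[0, 1]
print(f"Predicted fit for the role: {fit_score:.2f}")
```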
To prevent machine-learning algorithms from introducing bias, the company pretests all algorithms to ensure they do not favor any gender or ethnicity, said Priyanka Jain, product lead at the company. However, she noted that bias can still enter later in the hiring process, such as during interviews, when interviewers may introduce biases of their own.
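One widely used pretest for screening tools of this kind, and a plausible stand-in for the audit Jain describes, is the "four-fifths rule" from the federal Uniform Guidelines on Employee Selection Procedures: the selection rate for any group should be at least 80 percent of the rate for the most-selected group. The sketch below, using hypothetical rates, shows how such a check might be computed.

```python
# Illustrative adverse-impact check using the four-fifths rule.
def adverse_impact_check(selection_rates: dict) -> dict:
    """Return, per group, whether its selection rate is at least
    80 percent of the highest group's rate."""
    best_rate = max(selection_rates.values())
    return {group: rate / best_rate >= 0.8
            for group, rate in selection_rates.items()}

# Hypothetical selection rates from a pre-deployment test pool.
rates = {"women": 0.34, "men": 0.38}
print(adverse_impact_check(rates))  # {'women': True, 'men': True}
```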
Seattle-based Textio uses AI to help clients search for a more diverse pool of applicants. The company provides an augmented writing platform that suggests bias-neutral language for job posts and recruiters' e-mails to job candidates.
Textio's machine-learning algorithm is based on nearly 400 million talent acquisition documents, such as job postings, from which the tool finds linguistic patterns and then suggests words that avoid alienating certain groups.
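As a toy illustration of pattern-based language feedback, the sketch below flags terms from small, hypothetical lists of gender-coded words in a draft job posting. Textio's real platform learns patterns from hundreds of millions of documents rather than relying on fixed word lists.

```python
# Illustrative only: tiny hand-picked word lists, not Textio's learned model.
import re

MASCULINE_CODED = {"ninja", "dominant", "aggressive", "rockstar"}
FEMININE_CODED = {"supportive", "nurturing", "collaborative"}

def flag_coded_words(posting: str) -> dict:
    """Return any gender-coded words found in the posting text."""
    words = set(re.findall(r"[a-z']+", posting.lower()))
    return {
        "masculine-coded": sorted(words & MASCULINE_CODED),
        "feminine-coded": sorted(words & FEMININE_CODED),
    }

draft = "We need an aggressive sales ninja to grow the market."
print(flag_coded_words(draft))
# {'masculine-coded': ['aggressive', 'ninja'], 'feminine-coded': []}
```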
"Textio finds the pattern that works to get engagement from different groups of people," said Kieran Snyder, Textio co-founder and CEO. "The platform gives you feedback in real time that tells you what words you might like to change and what words you might want to use to get responses from the people that you are looking for."
Yet Snyder warns that AI has its limits.
"Textio is not going to help people who are consciously biased," she said. "If you are actually a bigoted person, there is no software in the world that is going to make you not bigoted."
Online real estate company Zillow estimates that since it began using Textio's augmented writing platform two years ago, it has seen an 11 percent increase in the number of female applicants, a 3 percent increase in the number of female employees and a 6 percent increase in the number of women in technical roles.
Inclusivity starts at the beginning of the candidate experience, when an applicant reads a Zillow job posting, said Annie Rihn, vice president of recruiting at the company. "The language that we choose can either make or break our connection with a job applicant. People will self-select into a process or self-select out of a process based on the language we choose in a job description," she said.
Using AI to Reduce Human Bias
San Francisco-based Entelo uses AI in its Unbiased Sourcing Mode, a tool that anonymizes job candidates. The tool redacts names, photos, gender, schools, graduation dates and other information that may lead to a preference for or against a candidate.
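A minimal sketch of what that kind of anonymization might look like appears below. Entelo's actual redaction logic is not public; the candidate name, school and year pattern here are hypothetical examples, and a production system would need far more robust entity detection.

```python
# Illustrative resume redaction, not Entelo's actual implementation.
import re

# Fields a parser might extract for a given candidate (hypothetical values).
CANDIDATE_NAME = "Jane Doe"
SCHOOL = "Stanford University"
YEAR_PATTERN = r"\b(19|20)\d{2}\b"  # matches graduation years like 2012

def redact(resume_text: str) -> str:
    """Replace identifying details with neutral placeholders."""
    text = resume_text.replace(CANDIDATE_NAME, "[CANDIDATE]")
    text = text.replace(SCHOOL, "[SCHOOL]")
    return re.sub(YEAR_PATTERN, "[YEAR]", text)

resume = "Jane Doe graduated from Stanford University in 2012."
print(redact(resume))
# [CANDIDATE] graduated from [SCHOOL] in [YEAR].
```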
Entelo's chief marketing officer, Mike Trigg, said that at its highest level, AI holds incredible promise to reduce overt, implicit or unconscious discrimination in the hiring process. Yet Trigg worries—and recent research supports his concern—that machine learning can replicate many of the same kinds of bias present in human decision-making.
"If the machine is learning from a biased recruiter, that has the potential to reinforce bias rather than remove bias from the hiring process," Trigg said. "You run the risk that it is going to mimic the biases that individual recruiter may have."
DK Bartley, senior vice president and head of diversity and inclusion at Dentsu Aegis Network, a multinational media and digital marketing communications company, said the successful use of AI in hiring depends on whether the tool is built to generate fair and balanced results, as well as how recruiters and hiring managers analyze the data and act on the insights generated by the system.
Bartley noted that AI tools can help the recruiting process only if hiring managers and recruiters evaluate the data to find out, for example, how many candidates they have, who was selected and why, who interviewed the candidates and what the outcome was.
"If, at the end of the day, you utilize AI and you did not end up hiring that diverse candidate, that's OK because you have diligently followed the process," he said. "You've interviewed the right pool of candidates, and for you, the recruiter, the perception is that this was the best candidate. The system will support your decision by showing that all the other candidates did not have all the skill sets you require. At the end of the day, AI holds everybody accountable."
Nicole Lewis is a freelance journalist based in Miami. She covers business, technology and public policy.