It’s important to realize that everyone carries implicit biases, and these biases impact recruiting and hiring.
That’s the message two inclusion and diversity experts from VallotKarp Consulting in New York City delivered Nov. 5 to attendees at SHRM INCLUSION 2024 outside Denver.
Consultants Brittany Boone, an industrial/organizational psychologist, and Diego Carvajal, an attorney, presented on how bias shows up in each phase of the recruiting and hiring process, including writing job posts, sourcing talent, screening resumes, interviewing, and making final decisions.
Implicit biases include the subconscious feelings, attitudes, prejudices, and stereotypes an individual develops from prior influences and imprints over the course of a lifetime.
“Bias is your own internal algorithm, based on assumptions and associations,” Carvajal said. “Implicit bias does not mean being racist. Everyone has biases. The question is, how do you recognize it, and what do you do about it?”
Those making employment decisions should especially be aware of their unconscious biases because they play such a critical role in who gets hired at an organization, Boone explained.
“And even though we strive to be fair and unbiased, we all still have these biases,” she said. “That doesn’t mean you are a bad person.”
Writing Job Posts
Job ads and job descriptions are the first impression an applicant has of an employer’s hiring process. Job posts should be clear and precise in accurately describing the skills and qualifications needed for the role, Carvajal explained, while also being inclusive of all potential applicants.
“Think about the message you want to send and how accurately you are describing the job to attract the best talent,” he said. “Are you creating the right message that aligns with your company culture and values?”
Studies show that certain words in job ads can read as coded, making a post more appealing to one group, such as men or women, than to others.
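One way teams operationalize this kind of review is a simple wording audit. The sketch below flags potentially coded terms in a job post; the word lists here are illustrative placeholders invented for this example, not a vetted lexicon, and a real audit would draw on researched lists of gendered wording.

```python
# Sketch of a job-post wording audit. MASCULINE_CODED and FEMININE_CODED
# are hypothetical example lists, not an authoritative lexicon.
MASCULINE_CODED = {"aggressive", "dominant", "rockstar", "ninja", "competitive"}
FEMININE_CODED = {"nurturing", "supportive", "collaborative", "interpersonal"}

def flag_coded_words(post_text: str) -> dict[str, list[str]]:
    """Return coded words found in the post, grouped by category."""
    # Normalize: lowercase and strip surrounding punctuation from each word.
    words = {w.strip(".,;:!?()").lower() for w in post_text.split()}
    return {
        "masculine-coded": sorted(words & MASCULINE_CODED),
        "feminine-coded": sorted(words & FEMININE_CODED),
    }

print(flag_coded_words("We want an aggressive, competitive rockstar engineer."))
# → {'masculine-coded': ['aggressive', 'competitive', 'rockstar'], 'feminine-coded': []}
```

A flagged word is a prompt to reconsider phrasing, not an automatic rewrite; context still matters.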
Sourcing Talent
Are you looking for candidates in all the right places?
“Clients will tell us that women or people of color don’t apply when they post their jobs,” Boone said. “We ask them where they are posting their jobs, and it’s the same old places. How intentional are you being to bring in the diversity you want to see?”
Carvajal added that as more recruiters use AI-based sourcing technology, more groups of people could remain unseen. “It’s human nature to return to the channels you are used to,” he said. “Technology can help, but only if the scope is large and varied, targeting different demographic groups.”
Both experts advised employers to build talent pipelines before roles open up. And don’t overly rely on referral networks.
“Look into specialized industry associations and job fairs that include underrepresented candidates, and insist that recruiters deliver a diverse set of candidates,” Carvajal said.
Screening Resumes
Some studies have shown that identical resumes receive fewer callbacks when the name at the top sounds Black than when it sounds white, even at companies that profess to value diversity.
“Sometimes there is a disconnect between what we are saying and what we are doing, and we have to put some safeguards in place to prevent this,” Boone said. “Acknowledge first impressions and then challenge them.”
She shared some questions to consider:
- Do items on a resume make you react strongly?
- Do items on a resume elicit positive feelings?
- How similar is the candidate’s experience to yours?
“This step of holding yourself accountable and being intentional must be practiced—it will take some time,” Boone said.
Technology can be a solution for unbiased resume screening, but users must be mindful of algorithmic bias baked into the tool, she said: “Make sure AI decisions can be explained, and explanations are provided to recruiters.”
Interviewing
The interviewing stage is the most fraught with subjectivity and potential bias, Carvajal said, because this is where more free-flowing conversations happen.
He advised employers to standardize interview processes, making sure that each candidate is asked the same questions and evaluated on the same criteria. Also, avoid personal questions, have diverse interview panels, and assess interviews using competency-based scoring.
“You want to make sure that every candidate is going through the same process,” Carvajal said. It’s also important to be aware of the associations that may influence who you hire when making a final decision, including recognizing affinity bias, confirmation bias, likeability bias, and the “halo/horn effect” based on how someone dresses or looks.
Interviewers should also guard against false predictors of success, Boone said, such as a candidate who is well-dressed or physically attractive, maintains eye contact, or delivers a strong handshake.
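The competency-based scoring Carvajal describes can be as simple as a shared weighted rubric applied identically to every candidate. Below is a minimal sketch; the competency names, weights, and 1-to-5 scale are hypothetical examples, not a rubric from the presenters.

```python
# Sketch of competency-based interview scoring on a hypothetical 1-5 scale.
# Competencies and weights are illustrative, not prescribed.
COMPETENCIES = {"problem_solving": 0.4, "communication": 0.3, "domain_knowledge": 0.3}

def score_candidate(ratings: dict[str, int]) -> float:
    """Weighted average over the same competencies for every candidate."""
    missing = COMPETENCIES.keys() - ratings.keys()
    if missing:
        # Refuse partial scores: every candidate is evaluated on every criterion.
        raise ValueError(f"Candidate must be rated on: {sorted(missing)}")
    return sum(ratings[c] * w for c, w in COMPETENCIES.items())

score = score_candidate(
    {"problem_solving": 4, "communication": 3, "domain_knowledge": 5}
)
```

Because every candidate is scored against the same criteria with the same weights, a "gut feeling" has to be expressed as a rating on a named competency, where it can be examined.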
Both speakers recommended employers practice thinking about “cultural add” versus “cultural fit” when evaluating candidates.
“‘Fit’ may be a code word for bias—it’s based on a gut feeling and is linked to comfort,” Carvajal said. “Consider cultural add instead, which should be able to be quantified and highlights skills and dimensions that your team is missing.”
Boone said that before a final hiring decision is made, all evaluators should finish rating candidates before anyone reveals a rating, with the most senior person always going last so as not to influence other people's decisions.