Hiring assessments can be risky for employers, especially when they screen out applicants in protected classes and create a disparate impact. Upstate Niagara Cooperative learned this the hard way, agreeing to pay $1.35 million to settle a suit brought by a female job applicant. At issue was a hiring exam that required applicants to lift a 50-pound crate, a screen that led the company to hire 155 men and only five women between 2008 and 2014.
Other assessments, like the popular Minnesota Multiphasic Personality Inventory (MMPI), can also be risky if they're not designed for the population being assessed (the MMPI is a clinical diagnostic tool but is often used to screen job applicants). What other risks are out there, and how can employers ensure they're not running afoul of anti-discrimination laws?
Ensure the Test Has Predictive Validity
"The potential that these tools could have a discriminatory impact on certain protected classes of individuals has been the major concern related to assessment tools," said Peter Cassat, a labor and employment attorney with Culhane Meadows in Washington, D.C. This has become increasingly true as the use of technology—including the use of artificial intelligence (AI)—has grown. AI is not free from bias and may perpetrate biases that exist in the hiring process.
It's important for employers to ensure at the outset that any assessment they use has predictive validity, meaning that it measures knowledge, skills and abilities directly relevant to the job they're trying to fill and actually predicts how candidates will perform in it.
Jamie Winter is vice president of talent acquisition at APTMetrics, an HR consulting firm in Darien, Conn. "Does the assessment predict job performance and turnover in new hires in the short term and other metrics in the long term—customer satisfaction, sales, revenue, etc.?" Winter asked. "The risk here is wasting time and effort in using a tool that doesn't work, resulting in lower-quality candidates making it to the next step in the hiring process or being hired."
Organizational psychologist Katy Caselli said, "Most companies get in trouble using selection assessments off the shelf … or even a purchased math or reading test." Caselli, founder and president of workforce development firm Building Giants in Petaluma, Calif., has worked extensively on preparing validated selection techniques for hiring.
Often, companies simply accept the test creator's word that the test is proven valid. "What they may not know," Caselli said, "is that they also need to do a validity study on their own group of test takers and gather evidence that the test predicts good performance, and that it does not discriminate against people from different genders, races, ages, etc."
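As a rough illustration of what an in-house validity check involves, the sketch below correlates applicants' assessment scores with their later performance ratings. The data, sample size and tooling are hypothetical; a real validation study should be designed and interpreted by an I/O psychologist on a much larger sample.

    # Hypothetical in-house check: do assessment scores track later job performance?
    # All scores and ratings below are invented for illustration only.
    from statistics import correlation  # Pearson correlation; Python 3.10+

    assessment_scores = [62, 71, 55, 80, 68, 90, 74, 59, 85, 66]              # pre-hire test scores
    performance_ratings = [3.1, 3.4, 2.8, 4.2, 3.3, 4.5, 3.6, 2.9, 4.0, 3.2]  # later manager ratings

    r = correlation(assessment_scores, performance_ratings)
    print(f"Pearson r between test score and later performance: {r:.2f}")

    # A meaningfully positive correlation, evaluated on a large enough sample,
    # is evidence of predictive validity; a near-zero correlation suggests the
    # test is screening candidates on something unrelated to the job.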
Culhane Meadows' Cassat said employers have to thoroughly understand the tools they use and not rely on vendors' assurances. The employer, after all, is on the hook if the tool proves to be discriminatory in any way.
"From the employer's perspective, the fact that [an assessment] is through a technology that is supposed to be neutral is not a defense as to whether a hiring practice has a discriminatory impact," he said. The likelihood of a tool's being considered discriminatory is higher if what the tool measures has nothing to do with the performance of the job duties.
Ensure Assessments Are Bias-Free
Employers must also remember, as they test the validity of their assessments, to ensure that tests are free from bias, Winter, of APTMetrics, said. "There are legal risks, with associated financial repercussions, with using assessments that adversely impact protected classes," he said. There is also potential for negative media coverage of a company's hiring practices, which can damage employer brand.
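One widely used screen for adverse impact, drawn from the EEOC's Uniform Guidelines on Employee Selection Procedures, is the four-fifths rule: compare each group's selection rate with the highest group's rate. It is a rule of thumb, not a legal conclusion on its own. The sketch below uses invented applicant counts purely to show the arithmetic.

    # Hypothetical four-fifths-rule screen for adverse impact at one selection step.
    # Applicant and pass counts are invented; real analyses need adequate sample
    # sizes and often statistical significance testing as well.
    groups = {
        "men": {"applicants": 200, "passed": 120},
        "women": {"applicants": 150, "passed": 45},
    }

    selection_rates = {g: d["passed"] / d["applicants"] for g, d in groups.items()}
    highest_rate = max(selection_rates.values())

    for group, rate in selection_rates.items():
        impact_ratio = rate / highest_rate
        status = ("flag for adverse-impact review" if impact_ratio < 0.8
                  else "within the four-fifths guideline")
        print(f"{group}: selection rate {rate:.0%}, impact ratio {impact_ratio:.2f} -> {status}")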
Orin Davis is an industrial and organizational (I/O) psychologist and the principal consultant with the Quality of Life Laboratory in New York City. He consults with companies on hiring strategies, culture, innovation and employee well-being. "The main trick … is to ensure that whatever assessments you use are directly related to the day-to-day tasks that people will be doing on the job," he said.
In the Upstate Niagara Cooperative case, for instance, can an employee do the job without being able to lift a 50-pound crate? "If so, it's an inappropriate test," Davis said.
"Employers should assume that any personality test of any kind—including AI facial analysis—is a liability, regardless of its supposed predictive validity," he added. "Situational judgment tests can be helpful if they are well-designed, but they usually aren't." Well-designed assessments, he said, need to be custom-made for the company and evaluated carefully by a trained I/O psychologist.
Cassat also suggested considering any potential discriminatory impact an assessment may have. Candidates with a disability, or facing other barriers, may be particularly at risk, he said, "for example, people who might have English as a second language, or who have some type of hearing or speech impairment." How might these and other factors affect candidates' ability to perform well on the assessments you're using?
Cassat pointed to employers' insurance policies as another important area of focus. He said employers should think through the agreements they have with providers of recruiting tools and platforms, and make sure their policies are up to date with their current practices to help minimize risk.
Yes, assessments can play an important role in offering evidence that a candidate has the potential to be effective in a specific role. It's important, though, for HR leaders and hiring managers to be aware of the potential for risk, and to use these assessments only when they are confident—and can demonstrate—that the tests actually measure competencies associated with performing the requirements of the job.
Lin Grensing-Pophal is a freelance writer in Chippewa Falls, Wis.