A host of existing laws already regulate artificial intelligence, so HR professionals need to coordinate compliance across all of them. Don’t just rely on vendors, said Anthony May, an attorney with Brown, Goldstein & Levy in Baltimore. Instead, ask vendors questions to determine whether their tools comply with applicable laws.
Just because an employer has satisfied one law regulating AI doesn’t mean it’s complying with another, he cautioned at the SHRM Talent Conference & Expo 2024 taking place this week in Las Vegas.
For example, a New York City law requires employers that use automated employment decision tools to audit them for potential race and gender bias, publish the audit results on their websites, and notify employees and job candidates that such tools are being used.
The law does not prohibit bias based on disability, marital status, national origin and other factors that are unlawful bases for employment decisions under federal law, May said.
Instead, preventing those types of discrimination via AI in the U.S. falls under three primary laws, he said: Title VII of the Civil Rights Act of 1964, the Americans with Disabilities Act (ADA) and the Age Discrimination in Employment Act (ADEA). Title VII prohibits discrimination based on race, color, religion, sex—including sexual orientation and gender identity—and national origin.
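Under the New York City law, the required bias audits center on computing selection rates and impact ratios for candidates by categories such as race/ethnicity and sex. The snippet below is a minimal, illustrative sketch of that kind of impact-ratio calculation in Python, assuming a simple applicant dataset; the column names, sample data and the 0.8 flag (a nod to the EEOC's long-standing "four-fifths" rule of thumb) are assumptions for the example, not the law's prescribed methodology.

# Illustrative sketch only: a simplified selection-rate and impact-ratio check
# of the kind a bias audit might include. Column names, sample data and the
# 0.8 threshold (the EEOC's "four-fifths" rule of thumb) are assumptions.
import pandas as pd

def impact_ratios(df: pd.DataFrame, group_col: str, selected_col: str) -> pd.DataFrame:
    """Compute each group's selection rate and its ratio to the highest group's rate."""
    rates = df.groupby(group_col)[selected_col].mean().rename("selection_rate")
    out = rates.to_frame()
    out["impact_ratio"] = out["selection_rate"] / out["selection_rate"].max()
    out["flagged"] = out["impact_ratio"] < 0.8  # flag groups below the four-fifths benchmark
    return out

# Hypothetical screening results: 1 = advanced by the tool, 0 = rejected.
applicants = pd.DataFrame({
    "sex": ["F", "F", "M", "M", "F", "M", "F", "M"],
    "selected": [1, 0, 1, 1, 0, 1, 1, 1],
})
print(impact_ratios(applicants, "sex", "selected"))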
“AI is new, but the same laws apply in a different context,” he said. When biased historical data is fed into an algorithm to predict whether a job candidate will succeed, biased outcomes will result, he cautioned.
AI presents a “black box dilemma,” May said. With “millions of different data points,” it can be impossible to know how a tool reaches its decisions, he said.
Sometimes, discrimination from using AI is more obvious. As AI is used to replace in-person interviews, facial recognition technology may inadvertently screen out someone with cerebral palsy, for example. Employers need to offer a reasonable accommodation at the start of the application process, May said.
Keep state laws in mind, too, he added. Maryland, for example, requires employers to notify applicants if they’ve been recorded or if data was used to make hiring decisions, May said.
Enforcement Actions
The U.S. Equal Employment Opportunity Commission (EEOC) “is coming after you” if AI is used in a discriminatory manner, May said.
He highlighted a settlement last September in which iTutorGroup agreed to pay $365,000 to resolve a discriminatory hiring suit. The company allegedly programmed its tutor application software to automatically reject female applicants ages 55 or older and male applicants ages 60 or older, according to the agency. As a result, the company rejected more than 200 qualified U.S.-based applicants because of their ages, the EEOC said.
“Even companies doing business abroad will face serious consequences if they discriminate against U.S.-based employees,” said EEOC trial attorney Daniel Seltzer in New York City.
The EEOC will monitor the company’s compliance for at least five years if the company resumes hiring in the U.S. “You do not want to be in their position,” May said, noting that this was one of the EEOC’s first AI-related lawsuits.
In a separate case, Mobley v. Workday, a Black disabled applicant over 40 years old applied for 100 jobs using Workday’s platform and heard back from none of the hiring employers, May said. Alleging violations of Title VII, the ADA and ADEA, he sued Workday, which May described as a software developer behind an algorithm that businesses use to screen out candidates.
Initially, the claims were dismissed “without prejudice,” meaning they could be filed again. Workday had argued it wasn’t an employer or employment agency, and the court agreed that the plaintiff had failed to show the company was either. The plaintiff also had not pleaded allegations sufficient to support an inference of intentional discrimination, disparate impact or causation, the court ruled.
However, the plaintiff has refiled the case, and the EEOC filed a friend-of-the-court brief on April 9 calling for the case to move forward.
“First, Mobley has plausibly alleged that Workday operates as an employment agency because it purportedly engages to a significant degree in screening and referral activities that have long been associated with traditional employment agencies,” the EEOC stated. “Second, Mobley has plausibly alleged that Workday is an indirect employer because it purportedly exercises significant control over his and other applicants’ access to employment opportunities with Workday’s employer-clients. Third, and finally, Mobley has plausibly alleged that Workday is an agent of employers because employers have purportedly delegated authority to Workday to make at least some hiring decisions.”
Nonetheless, the EEOC took no position on the accuracy of the allegations in the case.
“We believe this lawsuit is without merit and deny the allegations,” said a Workday spokesperson. “As a leading provider of finance and HR software, Workday is a technology company—not an employment agency. We are focused on delivering products designed to be configured and used by customers to best support their needs. We do not have oversight or control of our customers’ job application processes, and likewise, our customers do not delegate control to us in regards to their hiring processes.”
Question Vendors
May said the Mobley case shows employers should ask vendors how their algorithms are created and what they are doing.
“Thoroughly vet vendors,” he said. “Hold their feet to the fire.” Do they conduct routine audits? If there are discriminatory outcomes, why are those happening?
Employers may be on the hook if vendors are discriminating, he cautioned.
For vendors that do conduct audits, ask whether they are looking for more than just race or sex bias, such as disability bias, May added.
“Ask for their track record,” he said. How many contracts have they had, and was any bias found? If they’re not open and honest with you, don’t use them, May said.