New York City’s law regulating the use of automated employment decision tools (AEDTs) in hiring and promotions entered its enforcement phase July 5 after months of delays.
The first-of-its-kind law, which requires employers to audit their HR technology systems for bias and publish the results or face fines, took effect Jan. 1, but enforcement was delayed while clarifications to the regulations were ironed out.
Material changes were made to the law at each rulemaking stage, keeping employers and other interested parties guessing at what the final result might look like. The New York City Department of Consumer and Worker Protection released a set of frequently asked questions (FAQs) to accompany the July 5 enforcement date, further clarifying some of the law's provisions.
“Following this law has been a journey and there will likely be more changes in the future,” said Roy Wang, an AI expert and general counsel at Eightfold AI, a talent intelligence platform in Santa Clara, Calif. “The legislation has over time become pretty clear, which is helpful. But it’s the first of other city and state legislation and I hope that those jurisdictions will look to this New York City law as a model and not reinvent the wheel.”
An inconsistent patchwork of laws is always problematic for employers operating across locales. The New York City law comes amid a nationwide push to regulate increasingly powerful automation and AI technology at work. The Equal Employment Opportunity Commission and a handful of states and Washington, D.C., are all weighing their own rules covering AI bias in hiring.
The New York City law is a powerful sign of government catching up to emerging technology before it wreaks havoc on the workforce, said Jonathan Kestenbaum, managing director of technology strategy and partnerships at AMS, a recruitment solution provider and advisory firm. “It’s a significant step forward in the ongoing fight against discrimination and bias in the workplace,” he said.
“To be sure, AI has impacted corporate hiring in mostly positive ways. It has facilitated the more mind-numbing aspects of hiring, such as filtering through thousands of resumes, and removed unintended bias from hiring processes. But left unchecked, AI can also perpetuate unintended biases, violating both local and existing federal laws.”
Who and What Is Covered?
The FAQs clarify that the law applies to employers and employment agencies only when the job is located in New York City.
“If you are an employer, then the geographic analysis is simple, and driven by the location of the job, not that of the employer,” said Niloy Ray, an attorney in the Minneapolis office of Littler. “If the job is performed at an office or other corporate location outside NYC, the law does not apply. If the job is performed at an assigned corporate location in NYC, even in a partial or hybrid manner, then the law applies.”
And if the job is performed remotely, the law does not apply unless the remote position is tied to a New York City location, he added.
Ray said that the provision covering employment agencies is less clear. “The FAQs suggest that the law applies to all jobs—even jobs performed fully outside NYC—if the hiring is done by an employment agency located in NYC,” he said. “We expect that such is not the intent of the agency, and that the agency intends to regulate agency-based hiring only where the positions being hired for are at least partially located in NYC or are fully remote but attached to a NYC brick-and-mortar office.”
As for what is covered, the law defines AEDTs as “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation” that is used to “substantially assist or replace discretionary decision making for making employment decisions.”
If employers or employment agencies use an AEDT to substantially help them assess or screen candidates at any point in the hiring or promotion process, they must comply with the law’s requirements before using the technology.
AEDTs include tools used for screening, interviewing, assessing and scoring potential hires and employees for promotion. Covered technologies include those that use algorithms to analyze resumes, chatbots that conduct interviews and assessment platforms that evaluate job seekers on skillsets, traits or aptitude.
Ray noted an important distinction covered in the FAQs—the law only applies when the technology is directed toward actual jobseekers or employees, and not before someone has applied for a job. That means that employers can use unaudited technology to source candidates, scan resume databases and conduct outreach to potential candidates.
“The FAQ establishes that using an AEDT to search through an existing database of nonemployee resumes or other collection of potential-applicant data, and/or merely encouraging those identified as prime candidates to apply for the position at hand, does not activate the requirements of the law,” he said.
Bias Audits
The law requires that the bias audit, which assesses the tool's potential disparate impact based on sex, race and ethnicity, be conducted by a third party with no financial or other vested interest in the employer. That requirement has given rise to a cottage industry of AI consulting firms and third-party auditors ready to assist New York City employers.
“Over the past six months, we have been receiving an increasing number of requests for audits to ensure compliance with the New York City bias audit law,” said Adriano Koshiyama, co-founder of Holistic AI, an AI risk management and auditing platform based in London. “Without a doubt, this law is an important step towards making tools safer and fairer for prospective and current employees by increasing transparency and accountability. With the increasing usage of AI in employment, this law also plays an important role as a major safeguard to New York City residents.”
The FAQs state that compliance responsibility rests with employers, and not vendors. An AEDT vendor is not responsible for conducting a bias audit of its tool. Employers must ensure that a bias audit was performed on the AEDT within one year of using it.
Ray pointed out a few aspects of the audit process that were confirmed in the FAQs:
- The law stops short of requiring corporate analysis and response to the bias audit. “The bias audit results are not intended to spur any specific subsequent actions on the part of the business,” he said.
- A bias audit need not be specific to a job or job class—rather, a bias audit spanning multiple types of positions would suffice.
- If there is a gap or insufficiency in demographic data for candidates, businesses may be able to rely on test data instead.
Wang said that Eightfold did a lot of internal work as an AEDT vendor as the law moved through its various stages, including interviewing independent auditing firms to partner with. Two auditors were selected, and multiple audits conducted, including the latest one in June. “We published the audit so the public can see it and hopefully it can help our customers,” Wang said.
Employers and employment agencies must publish a summary of the results of the most recent bias audit along with the date the technology was first used.
The summary of results must include:
- The date of the most recent bias audit of the AEDT.
- The source and explanation of the data used to conduct the bias audit.
- The number of individuals the AEDT assessed that fall within an unknown category.
- The number of applicants or candidates, the selection or scoring rates, as applicable, and the impact ratios for all categories.
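The selection rates and impact ratios in the published summary follow a straightforward calculation: a category's selection rate is the share of its applicants who were selected, and its impact ratio is that rate divided by the selection rate of the most-selected category. The sketch below illustrates the arithmetic with hypothetical category labels and counts (the real audit would use a tool's actual historical or test data).

```python
# Hypothetical historical data: applicants and selections per category.
# The category labels and numbers here are illustrative only.
data = {
    "Category A": {"applicants": 500, "selected": 120},
    "Category B": {"applicants": 450, "selected": 90},
}

# Selection rate = number selected / number of applicants, per category
rates = {cat: d["selected"] / d["applicants"] for cat, d in data.items()}

# Impact ratio = category's selection rate divided by the selection rate
# of the most-selected category (so the top category's ratio is 1.0)
highest = max(rates.values())
impact = {cat: rate / highest for cat, rate in rates.items()}

for cat in data:
    print(f"{cat}: selection rate {rates[cat]:.3f}, impact ratio {impact[cat]:.3f}")
```

An impact ratio well below 1.0 for a category signals that the tool selects that group at a substantially lower rate than the most-selected group, which is the disparity the published summary is meant to surface.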
Employers and employment agencies can only rely on a bias audit for one year from the date it was conducted before a new audit is required.
Providing Notice
Employers and employment agencies must notify employees and job candidates who are residents of New York City, at least 10 business days before using an AEDT, that the tool is in use and which job qualifications or characteristics it will assess.
As an alternative, notice to job seekers can be posted on the employment section of an organization’s website and notice to employees can be included in a written policy.
Ray said that as the recent history of the New York City law illustrated, “regulating AI-driven employment activity is neither straightforward nor easily done in the abstract. Instead, much of the nuance to this and other similar legislative efforts will be teased out only as businesses begin efforts to comply.”
And positively for employers, the New York City Department of Consumer and Worker Protection “declared firmly its intent to collaborate with, rather than penalize, businesses working in good faith to meet the requirements of the law,” Ray said.