New York City issued its final rule regulating the use of automated employment decision tools (AEDTs) in hiring and announced that enforcement of the law will begin July 5, 2023.
The law took effect on Jan. 1, 2023, after rules were proposed in September 2022 and revised in December 2022 following a public hearing. The final rule adopts the December proposal with modest amendments and clarifications to key definitions in the law and to what a bias audit must include to satisfy the law's requirements.
"The AEDT law restricts the use of automated employment decision tools and artificial intelligence by employers and employment agencies by requiring that such tools be subjected to bias audits and requiring employers and employment agencies to notify employees and job candidates that such tools are being used to evaluate them," said Simone Francis, an attorney with Ogletree Deakins in New York City and St. Thomas, Virgin Islands.
"For employers that utilize automated employment decision tools in a manner that comes within the scope of this law, the issuance of final rules provides some long-awaited clarity about their obligations related to obtaining a bias audit and providing information about the results of the audit," she said. "The final rules include a number of changes to earlier versions, including expanding the scope of 'machine learning, statistical modeling, data analytics, or artificial intelligence,' modifying bias audit standards, and clarifying information that must be disclosed."
Violations of the law can subject an employer to civil penalties of $500 to $1,500 per violation, with each day of noncompliant use counting as a separate violation.
Defining AEDTs
Employers are still trying to understand the scope of the law and exactly what is covered, said Nathaniel Glasser, an attorney in the Washington, D.C., office of Epstein Becker Green and co-leader of the firm's AI practice group. The law applies to tools that, "in layman's terms, are technologies that are used to make hiring or promotion decisions," he said.
Depending on how the tools are used, these could include resume screeners, video or text interview software, and cognitive or neurological assessments, he added.
The law defines AEDTs as "any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation" that is used to "substantially assist or replace discretionary decision making for making employment decisions that impact natural persons."
For example, an automated process that screens resumes and schedules interviews based on the screening results would fall under the law's requirements. On the other hand, an automated process that simply transfers applicant information from resumes to a spreadsheet, without otherwise scoring or ranking the applicants, would not be subject to the law's requirements.
"Sourcing tools are not considered AEDTs because the law defines 'candidates for employment' as individuals who have applied for specific positions, whereas sourcing tools are used to identify potential candidates for employment," Glasser said. "The law clearly applies to technologies used as the determinative factor to decide whether to move candidates along in the hiring or promotion process, as well as to technologies whose outputs are clearly given more weight than other factors in making such decisions. But there remain questions as to whether the law applies to technologies used early in the decision-making process that may be given different weights depending on the individual hiring manager or recruiter interpreting the output."
Conducting Bias Audits
Under the law, a bias audit must be conducted "no more than one year prior" to the use of an AEDT by employers or employment agencies.
"A bias audit is defined as 'an impartial evaluation by an independent auditor' to assess the tool's potential disparate impact on sex, race and ethnicity," Francis said. "The employer or employment agency must also post a summary of the results of the most recent bias audit on its website."
The final rules clarify the required calculations for a bias audit, as well as the provisions that address the use of historical and test data.
"The bias audit must use data from the employer's or employment agency's own historical use of the AEDT," Glasser said. "Where, however, this historical data is insufficient—either because the employer or employment agency has never used the AEDT, or because its historical data is not statistically significant to conduct a bias audit—the independent auditor may use historical data of other employers or employment agencies."
Test data can be used as an alternative, but in that case, the audit must explain why historical data could not be used.
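The final rule's core calculations can be illustrated with a short sketch. The rule defines a category's selection rate as the share of applicants in that category the tool moved forward, and its impact ratio as that rate divided by the rate of the most selected category. The function names and the example counts below are invented for illustration; they are not part of the rule.

```python
# Hedged sketch of the selection-rate and impact-ratio calculations described
# in the final rule. Category labels and applicant counts are hypothetical.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants in a category that the AEDT selected."""
    return selected / applicants

def impact_ratios(counts: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Divide each category's selection rate by the highest category rate."""
    rates = {cat: selection_rate(sel, total) for cat, (sel, total) in counts.items()}
    top = max(rates.values())
    return {cat: rate / top for cat, rate in rates.items()}

# Invented example data: (selected, total applicants) per category.
data = {"category_a": (50, 100), "category_b": (30, 100)}
print(impact_ratios(data))  # category_a -> 1.0, category_b -> 0.6
```

An impact ratio well below 1.0 for a category is the kind of disparity the audit is meant to surface, though the law itself does not set a numerical threshold.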
Francis said the final rule puts an end to any lingering uncertainty about who can perform the bias audit. Anyone involved in using, developing or distributing the technology is disqualified from performing the audit.
"Ultimately, the responsibility lies with employers covered by the law to conduct a bias audit, as liability runs to employers for failing to comply with the law," Glasser said. "The final rule clarifies that an independent auditor cannot have financial or employment ties to a creator or user of the tool; thus, the bias audit cannot be performed by the vendor. However, nothing in the law prohibits the vendor from hiring an independent auditor to produce a bias audit on behalf of an employer."
Disclosing Audit Results
Before using an AEDT, employers and employment agencies must publicly disclose the date of the technology's most recent bias audit and a summary of the results, Francis said. The following information must be included in the summary:
- The source and explanation of the data used to conduct the bias audit.
- The number of applicants or candidates; the selection or scoring rates, as applicable; and the impact ratios for all categories.
- The distribution date for the AEDT.
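The rule specifies what the published summary must contain but not a particular format. One hypothetical way to organize the required fields, with all names, dates, and figures invented for illustration:

```python
# Hypothetical structure for a published bias-audit summary. The field names
# and all values below are invented; the rule mandates the contents, not a format.
audit_summary = {
    "bias_audit_date": "2023-06-01",          # date of the most recent bias audit
    "data_source": "Employer's historical applicant data from the 2022 hiring cycle",
    "aedt_distribution_date": "2021-09-15",   # when the tool was first distributed
    "categories": {
        "category_a": {"applicants": 100, "selection_rate": 0.50, "impact_ratio": 1.0},
        "category_b": {"applicants": 100, "selection_rate": 0.30, "impact_ratio": 0.6},
    },
}
print(audit_summary["categories"]["category_b"]["impact_ratio"])  # 0.6
```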
"The final version of the rules continues to specify that the notice requirements may be met with an active hyperlink to a website that must be clearly identified as a link to the results of the bias audit," Francis said. "Additionally, the summary must [remain] posted [for] at least six months after [the] latest use of the AEDT for an employment decision."
Providing Notice
The provisions around required notices to candidates and employees are unchanged from the December 2022 proposed rule. "The rules specify that notice to candidates may be provided via the website, in a job posting, or by mail at least 10 business days before use of an AEDT," Francis said. "Notice to employees being considered for promotion may be provided in a policy or procedure that is distributed at least 10 business days before use of an AEDT."
Glasser said that in preparation for the July 5 enforcement date, employers using AEDTs should consider analyzing and determining whether their use of the technology triggers a need to comply with the law.
"If using an AEDT in a manner requiring compliance, [employers should] identify and gather data to conduct a bias audit, retain an independent auditor to conduct the audit, consider where and how to publish the results of the audit, and ensure compliance with all notice requirements," he said.
Francis added that employers not subject to the law "may wish to continue to monitor developments, because the use of artificial intelligence to aid employment decisions continues to garner significant regulatory attention, and it is likely that additional laws will be proposed and enacted which require employers to have a sound understanding of these technologies, their uses and the outcomes associated with those uses."