A new law amending New York City’s administrative code penalizing employers for bias in artificial intelligence (AI) hiring tools takes effect in January 2023. Because the city has not provided guidance, employers might not know how to prepare. By consulting an experienced employment law attorney, businesses can avoid being penalized for bias in their hiring processes.
Is There Bias In Automated Employment Decision Tools?
The new anti-AI bias law applies to any New York City employer or employment agency that uses an automated employment decision tool to screen a candidate or employee. An “automated employment decision tool” is defined as:
“any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making…that impact natural persons.”
Excluded are tools that do not (1) automate, support, substantially assist or replace discretionary decision-making processes and (2) materially impact natural persons, such as:
- Junk email filters
- Antivirus software
- Data sets
- Other data compilations
The law requires employers to conduct an independent audit of automated tools, make the results publicly available on their websites, and disclose the data collected by the AI tool, either directly to the public or by responding to an inquiry.
Employers can be penalized up to $500 for a first violation and up to $1,500 for each subsequent violation, with each day of continued noncompliance counting as a separate violation, so fines can accumulate quickly.
The law does not provide a private right of action, meaning that workers cannot sue if they believe they’ve suffered AI hiring bias. However, there is a potential for federal class actions if the city finds a tool an employer used was discriminatory.
Anti-AI Bias Law Lacks Guidance
At this juncture, the city has not issued guidance on complying with the anti-AI bias law. In addition, the law does not define the term “independent auditor” or specify who should perform the audit. The city’s Office of the Corporation Counsel will enforce the law, but how it will do so is unclear; most likely, the city will learn of a problematic AI decision-making process through a complaint.
In the meantime, employers can look to the Equal Employment Opportunity Commission’s technical assistance guidance covering artificial intelligence hiring tools. The guidance directs employers to assess their AI tools for bias against disabled workers but does not impose additional requirements or penalties. The EEOC’s guidance also includes questions to ask vendors and advice on providing reasonable accommodations, particularly to disabled job applicants or employees. Finally, employers should consult technical experts to determine how their automated employment decision tools work, as well as employment law attorneys who can identify the potential for discrimination complaints.