EEOC Strategic Enforcement Plan Homes in on AI Hiring Tools

By Douglas Lipsky

While media attention focused on ChatGPT, the next big thing in artificial intelligence, the Equal Employment Opportunity Commission (EEOC) recently announced its intent to regulate the use of AI systems in employment decisions. To ensure your company’s AI hiring practices don’t run afoul of EEOC requirements, consult an experienced employment law attorney.

The Backdrop

The EEOC published its draft Strategic Enforcement Plan (“SEP”) for 2023-2027 on January 10, 2023, outlining the agency’s upcoming enforcement priorities. Among these priorities is eliminating barriers to recruitment and hiring with an emphasis on:

  • The use of automated systems, including AI or machine learning, to target job advertisements, recruit applicants, or make or assist in hiring decisions where such systems intentionally exclude or adversely impact protected groups
  • Screening tools that disproportionately impact workers based on their protected status, including those facilitated by AI or other automated systems

In particular, the EEOC intends to focus on practices and policies that discriminate based on race, ethnicity, religion, age, gender, pregnancy, disability, and sexual identity. Although the agency has yet to release the final SEP, the employment watchdog has been examining the use of AI in recruitment and hiring for several years. 

Despite the potential cost benefits of AI screening tools, the concern is that automated candidate sourcing may lead to discriminatory hiring practices. The agency intends to enforce anti-discrimination laws whether a human or an AI system commits the alleged violation, and the miscues in the ChatGPT beta release highlight the potential for human bias to carry over into artificial intelligence.

In short, algorithms used by employers to assess candidates may lack diversity and, therefore, reinforce institutional bias, resulting in discriminatory hiring practices. While the EEOC intends to focus on discriminatory AI hiring tools, the draft SEP does not include specific guidance on using artificial intelligence in employment decisions.

Why This Matters

Whether the EEOC’s focus on AI hiring tools will result in enforcement actions remains unclear. In any event, many employers in New York City and across the country currently rely on artificial intelligence in their hiring processes.

As we have previously written (here), NYC employers must adhere to a city council ordinance that became effective on January 1, 2023, requiring businesses that use AI hiring tools to notify job candidates of the use of such tools. Ultimately, artificial intelligence is poised to dramatically impact the contemporary workplace and society, whether or not humans are ready. The best way for employees and employers to understand their rights and obligations in this evolving landscape is to consult an experienced employment lawyer.

About the Author
Douglas Lipsky is a co-founding partner of Lipsky Lowe LLP. He has extensive experience in all areas of employment law, including discrimination, sexual harassment, hostile work environment, retaliation, wrongful discharge, breach of contract, unpaid overtime, and unpaid tips. He also represents clients in complex wage and hour claims, including collective actions under the federal Fair Labor Standards Act and class actions under the laws of many different states. If you have questions about this article, contact Douglas today.