In May 2022, the U.S. Equal Employment Opportunity Commission (EEOC) issued technical guidance on how an employer’s use of AI tools to assist in making employment decisions could violate the Americans with Disabilities Act (ADA). If you are disabled and suspect that an employer or its technology has discriminated against you, it is essential to consult with an experienced employment lawyer.
The ADA and the Use of AI Tools to Assess Job Applicants and Employees
The EEOC guidance is part of the agency’s “Initiative on Artificial Intelligence and Algorithmic Fairness,” which we have written about previously. The guidance focuses on three technological tools:
- Software – Information technology programs or procedures that instruct computers on how to perform given tasks or functions. In an employment context, such software can include automatic resume-screening software, hiring software, chatbots for hiring and workflow, as well as video interviewing, employee monitoring, and worker management software.
- Algorithms – An algorithm is a series of instructions that a computer follows to accomplish some end. The EEOC focuses on algorithms built into human resources software or applications to allow employers to process data to evaluate, rate, and make other decisions about job applicants and employees at various stages of employment, including hiring, performance evaluation, promotion, and termination.
- Artificial Intelligence (AI) – Some employers use AI, which relies on the computer’s own analysis of data to determine which criteria to use, to evaluate, rate, and make other decisions about job applicants and employees. AI includes machine learning, computer vision, natural language processing and understanding, intelligent decision support systems, and autonomous systems.
Employers may rely on different types of software that incorporate algorithmic decision-making at several stages of the employment process, such as:
- Resume scanners that rank applications based on certain keywords
- Employee monitoring software that rates employees based on keystrokes and other factors
- “Virtual assistants” or “chatbots” that ask job applicants about their qualifications and reject those who do not meet predefined requirements
- Video interviewing software that evaluates candidates based on their facial expressions and speech patterns
- Testing software that scores applicants or employees regarding their personalities, aptitudes, cognitive skills, or perceived “cultural fit” based on their performance on a game or a more traditional test
The guidance notes that each of these types of software may include AI.
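To make the first bullet concrete, here is a purely hypothetical sketch of how a keyword-based resume scanner might rank applicants; the keywords, weights, and function names are illustrative assumptions, not any vendor’s actual implementation:

```python
# Hypothetical illustration of keyword-based resume ranking.
# KEYWORDS and its weights are invented for this example.
KEYWORDS = {"python": 3, "sql": 2, "leadership": 1}

def score_resume(text: str) -> int:
    """Score a resume by summing the weights of keywords it contains."""
    words = text.lower().split()
    return sum(weight for kw, weight in KEYWORDS.items() if kw in words)

def rank_applicants(resumes: dict[str, str]) -> list[str]:
    """Return applicant names ordered from highest score to lowest."""
    return sorted(resumes, key=lambda name: score_resume(resumes[name]),
                  reverse=True)
```

A tool like this rewards exact keyword matches and nothing else, which illustrates the ADA concern discussed below: an applicant whose resume describes equivalent experience in different words, perhaps because of a disability-related employment history, could be ranked lower or screened out entirely.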
How an Employer’s Use of Algorithmic Decision-Making Tools Could Violate the ADA
According to the EEOC, an employer’s use of algorithmic decision-making tools could violate the ADA by:
- Not providing a reasonable accommodation that a job applicant or employee needs to be rated fairly and accurately by the algorithm
- Relying on an algorithmic decision-making tool that intentionally or unintentionally screens out a job applicant or employee with a disability, even though that individual could meet the criteria for the job opportunity with a reasonable accommodation
- Adopting an algorithmic decision-making tool for use with its job applicants or employees that violates the ADA’s restrictions on disability-related inquiries and medical examinations
In short, liability under the ADA can arise from any one of these practices, alone or in combination.
More and more businesses are relying on computer-based tools to assist in hiring workers, monitoring worker performance, determining pay or promotions, and establishing the terms and conditions of employment. The best way to ensure your business does not run afoul of the ADA and other equal employment opportunity laws is to contact a tech-savvy employment law attorney.