EEOC Issues Guidance on the Use of Artificial Intelligence Tools

Reposted from the Labor & Employment Law Navigator Blog

Since at least 1978, when the U.S. Equal Employment Opportunity Commission (“EEOC”) issued guidance on hiring tools, employers have known that they must carefully analyze any testing procedures they use to screen applicants and current employees, in order to ensure that those procedures are properly validated and do not discriminate against individuals in protected classes under the disparate treatment or disparate impact theories of discrimination. On May 12, 2022, the EEOC for the first time issued guidance, in a question-and-answer format, to employers, employees, and applicants on the use of artificial intelligence tools.

Artificial intelligence (“AI”) tools are often used by employers to assist them, and save them time, in making decisions regarding hiring new employees, monitoring work performance, and determining wages, promotions, and other terms and conditions of employment. These tools typically rely on software that uses algorithms to aid the decision-making process. The EEOC’s concern is that AI tools used by employers may have a disparate impact on individuals with disabilities, both applicants and current employees, and therefore violate Title I of the Americans with Disabilities Act (“ADA”). The technical assistance guidance published by the EEOC gives employers practical tips on how to comply with the ADA, and informs applicants and employees of steps they may take if their rights have been violated.

Examples of some of the AI tools that concern the EEOC are: “resume scanners that prioritize applications using certain keywords; employee monitoring software that rates employees on the basis of their keystrokes or other factors; “virtual assistants” or “chatbots” that ask job candidates about their qualifications and reject those who do not meet pre-defined requirements; video interviewing software that evaluates candidates based on their facial expressions and speech patterns; and testing software that provides “job fit” scores for applicants or employees regarding their personalities, aptitudes, cognitive skills, or perceived “cultural fit” based on their performance on a game or on a more traditional test.”

While the technical assistance guidance was issued by the EEOC, it was prepared in conjunction with the U.S. Department of Justice (“DOJ”), as the EEOC enforces disability discrimination laws for the private sector and federal employees, whereas the DOJ enforces such laws for state and local government employees. The guidance comes after disability advocates have for years complained of discrimination via employer testing programs and clamored for action with respect to the use of AI. In this regard, it is estimated that more than 80% of employers use some form of automated tool to screen candidates for hire.

The guidance explains the meaning of AI, as well as “software and algorithms,” and how they relate to one another when used in the workplace. It then discusses the basics of the ADA and reasonable accommodation, as well as algorithmic decision-making tools that screen out qualified individuals with disabilities or that violate the prohibitions on disability-related inquiries and medical examinations. It finishes by providing “Promising Practices” for employers, job applicants, and employees. In particular, the technical assistance guidance directs employers to be critical of any AI tools they may use, and it includes questions that employers should ask vendors of AI tools. The EEOC emphasizes that any AI tool should focus on measuring the abilities or qualifications actually needed for a job, regardless of whether a reasonable accommodation is needed to perform the essential job functions.
