EEOC Releases Guidance on How Employer Software and Artificial Intelligence Can Discriminate Against People with Disabilities | Sheppard Mullin Richter & Hampton LLP

[co-author: Wolfram Ott]*

On May 12, 2022, the Equal Employment Opportunity Commission (“EEOC”) released guidance discussing how the Americans with Disabilities Act (“ADA”) applies to employers that use software, algorithms, and artificial intelligence in hiring and other employment decisions. Produced as part of the EEOC’s Initiative on Artificial Intelligence and Algorithmic Fairness, launched in October 2021, the latest guidance reflects the agency’s goal of ensuring that employers using technology in hiring and employment decisions comply with federal civil rights laws. Notably, the guidance was released just days after the EEOC filed a lawsuit against a software company alleging age discrimination, potentially signaling similar actions related to the use of artificial intelligence in the employment context. Below are some key points about the new guidance.

Scope and Definitions

The guidance covers a wide range of technologies commonly used by employers, including software, algorithms, and artificial intelligence:

  • Software: Refers to information technology programs or procedures that provide instructions to a computer on how to perform a particular task or function. Examples of software used in hiring and employment decisions include automatic resume-screening software, hiring software, and video-interviewing software.
  • Algorithms: A set of instructions that a computer can follow to accomplish some end. Human resources software and applications use algorithms to allow employers to process data in order to rank, assess, score, and make other decisions about applicants and employees.
  • Artificial Intelligence (“AI”): Congress has defined “AI” as a “machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations or decisions influencing real or virtual environments.” In the employment context, the use of AI typically means that the developer relies in part on the computer’s own analysis of data to determine which criteria to use when making employment decisions. AI may include machine learning, computer vision, natural language processing and understanding, intelligent decision support systems, and autonomous systems.

Employers may use tools that combine these technologies. For example, an employer might use resume-screening software that incorporates an algorithm created by human design, or one supplemented by AI-driven data analysis.

Ways Algorithmic Decision Tools Can Violate the ADA

The guidance examines the three most common ways an employer’s use of algorithmic decision-making tools could violate the ADA:

  • When an employer fails to provide job applicants and employees with disabilities the “reasonable accommodations” necessary for an assessment tool to rate them fairly and accurately. The guidance makes clear that when an employer uses software, AI, or algorithmic tools to assess applicants or employees, the ADA requires reasonable accommodations for individuals whose disability makes the assessment more difficult or causes them to be rated less favorably. For example, the EEOC explains that a job applicant with limited manual dexterity may report that they would have difficulty taking a knowledge test that requires a manual input device such as a keyboard or trackpad. In that case, the employer should provide an accessible version of the test (for example, one in which the applicant can provide answers orally rather than manually) as a reasonable accommodation, unless doing so would cause undue hardship.
  • When technology “screens out” people with disabilities, whether intentionally or not. “Screening out” is unlawful when a person who would otherwise be able to perform the essential functions of a job loses the opportunity because a disability prevents them from completing an assessment, or lowers their performance on it. This can happen even when an assessment claims to be “bias-free.” For example, suppose a chatbot is programmed with an algorithm that rejects all candidates who, during their “conversation” with the chatbot, indicate significant gaps in their employment history. If a particular candidate’s gap was caused by a disability (for example, because the person had to stop working to undergo treatment), the chatbot may effectively screen out that person because of their disability.
  • When the assessment includes “disability-related inquiries” or functions as an unlawful “medical examination.” Any question likely to elicit information about whether, or to what extent, an employee or applicant has a disability, whether asked directly or indirectly, is a “disability-related inquiry.” Assessments that seek information about a person’s physical or mental impairments or health may also constitute a “medical examination.”

Employer Responsibility for Vendor Technology

Importantly, the EEOC guidance states that employers are generally responsible for the discriminatory effects of software used in the hiring process, even when the software is administered by a third party on the employer’s behalf.

Best Practices for Employers

The EEOC also offered “promising practices” for employers seeking to ensure ADA compliance. These recommendations provide useful suggestions on how employers can protect themselves against claims of disability discrimination. They include:

  • Inform candidates or employees of how an assessment will be conducted, let them know that accommodations are available if needed, and explain the process for requesting such accommodations;
  • Explain what traits an assessment is designed to measure and how that measurement will be made;
  • Develop alternative methods of assessing employees and candidates where standard tools may disadvantage people with disabilities;
  • Require any vendor or third party that conducts assessments on the employer’s behalf to forward all accommodation requests, or require the third party to provide reasonable accommodations as the ADA requires;
  • Ensure that technologies used to assess employees and applicants are designed to be accessible to people with a wide range of disabilities;
  • Ensure that assessments only measure the traits, skills, abilities, or qualities needed for the position, and measure those qualities directly; and
  • Confirm that software, applications, algorithms, artificial intelligence, and other assessment systems do not pose questions about disability, physical or mental impairments, or health, unless those questions relate to requests for reasonable accommodation.

Key Points to Remember

May marked the EEOC’s first new AI-related developments since the launch of its Initiative on Artificial Intelligence and Algorithmic Fairness. The new guidance provides much-needed insight into how the EEOC will apply the ADA to the use of AI going forward. Employers should follow the “promising practices” provided to ensure compliance and avoid potential liability.

Because this area of law is still developing, we will continue to monitor it and provide updates as new information becomes available.

*Wolfram Ott is a summer associate in the Labor and Employment group and contributed to this article.