Two US Government Agencies Warn about Hiring Technology that Discriminates against Disabled Applicants


This is an article about technology and new US government guidance about that technology. Companies and other organizations often use tech when hiring new employees. This tech may help a company review resumes. Or it may analyze video interviews and offer information about a job candidate. This type of technology can make it harder for disabled people to get hired. For example, technology that analyzes a person’s voice may make it harder for a person with a speech impairment to get hired. This may happen even when the person is qualified for the job. Two parts of the US Government wrote about this kind of tech. They both explain how to avoid discrimination against disabled people with these types of hiring tech. One resource is from the US Department of Justice (DOJ). One resource is from the Equal Employment Opportunity Commission (EEOC).

Article updated

This article has been updated since it was first published on May 29, 2022. The most recent update was added on January 1, 2023. Read the updates for this article.

[Image: a robot hand choosing a person out of many on a touchscreen]

The unemployment rate for people with disabilities in the United States is unacceptably high. As the US Bureau of Labor Statistics stated in a February 2022 informational release: “Across all educational attainment groups, unemployment rates for persons with a disability were higher than those for persons without a disability.”

Two new resources from the US federal government address one type of barrier to the employment of disabled people: algorithmic and artificial intelligence (AI) hiring tools that discriminate.

On May 12, 2022, the United States Department of Justice issued Algorithms, Artificial Intelligence, and Disability Discrimination in Hiring. The resource describes how certain technology used by organizations during the hiring process can limit disabled people’s ability to obtain employment.

On the same day, the US Equal Employment Opportunity Commission (EEOC) issued a related resource titled The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees.

Two different federal agencies issued resources on this topic because, as the DOJ explains:

The Department of Justice enforces disability discrimination laws with respect to state and local government employers. The Equal Employment Opportunity Commission (EEOC) enforces disability discrimination laws with respect to employers in the private sector and the federal government. The obligation to avoid disability discrimination in employment applies to both public and private employers.
DOJ May 2022 AI hiring guidance

These resources underscore the importance of having federal government agencies committed to disability rights. The guidance from the DOJ and the EEOC should be carefully reviewed by anyone responsible for hiring (talent acquisition), Human Resources (HR), Diversity, Equity and Inclusion (DEI), technology purchases, accessibility, accommodations at work, and marketing and communication around job openings.

Types of technology addressed in the government guidance

The EEOC guidance starts with definitions of software, algorithms, and artificial intelligence (AI). It then describes the wide range of technology embodying these terms. According to the EEOC resource, technologies that may discriminate against applicants with disabilities include:

  • resume scanners that prioritize applications using certain keywords;
  • employee monitoring software that rates employees on the basis of their keystrokes or other factors;
  • “virtual assistants” or “chatbots” that ask job candidates about their qualifications and reject those who do not meet pre-defined requirements;
  • video interviewing software that evaluates candidates based on their facial expressions and speech patterns; and
  • testing software that provides “job fit” scores for applicants or employees regarding their personalities, aptitudes, cognitive skills, or perceived “cultural fit” based on their performance on a game or on a more traditional test.

Each of these types of software may include AI.
EEOC hiring tool guidance
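To make the first item on the EEOC’s list concrete, here is a minimal, hypothetical sketch of a keyword-based resume scanner. Nothing in it comes from the DOJ or EEOC guidance; the keyword list, the gap penalty, and the applicant data are invented assumptions, used only to show how a facially neutral ranking rule can screen out a qualified applicant whose employment gap reflects, for example, disability-related leave.

```python
# Hypothetical illustration only: a naive keyword-based resume scanner.
# The keyword list, the gap penalty, and the applicant data are invented
# for this sketch; they do not come from the DOJ or EEOC guidance.

from dataclasses import dataclass

KEYWORDS = {"python", "sql", "project management"}  # assumed required terms


@dataclass
class Applicant:
    name: str
    resume_text: str
    months_since_last_job: int


def score(applicant: Applicant) -> int:
    """Count keyword matches, then subtract a penalty for employment gaps.

    A rule like the gap penalty can disadvantage a qualified applicant
    whose gap reflects disability-related leave, even though the rule
    never mentions disability.
    """
    text = applicant.resume_text.lower()
    keyword_score = sum(1 for keyword in KEYWORDS if keyword in text)
    gap_penalty = 1 if applicant.months_since_last_job > 6 else 0
    return keyword_score - gap_penalty


applicants = [
    Applicant("Applicant A", "Python and SQL developer, project management lead", 2),
    Applicant("Applicant B", "Python and SQL developer, project management lead", 14),
]

# Identical qualifications, but Applicant B ranks lower solely because of
# the employment-gap penalty.
for applicant in sorted(applicants, key=score, reverse=True):
    print(applicant.name, score(applicant))
```

In this sketch both applicants are equally qualified on the keywords, yet one ranks lower solely because of the gap penalty, which is the kind of indirect screening the agencies warn about.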

Algorithms and artificial intelligence can lead to disability discrimination in hiring

The Department of Justice’s May 2022 guidance on AI hiring tools and discrimination starts out strong:

This guidance explains how algorithms and artificial intelligence can lead to disability discrimination in hiring.
DOJ May 2022 AI hiring guidance

After a plain language explanation of how technology is used in the hiring process, there is a clear statement of how the Americans with Disabilities Act (ADA) impacts the entire employment process and technology choices:

The ADA applies to all parts of employment, including how an employer selects, tests, or promotes employees. An employer who chooses to use a hiring technology must ensure that its use does not cause unlawful discrimination on the basis of disability.
DOJ May 2022 AI hiring guidance

Here are some other highlights of the DOJ and EEOC Guidance:

Employers are responsible if they buy technology that discriminates

The DOJ guidance underscores the importance of embedding accessibility and disability inclusion in procurement processes:

Employers must avoid using hiring technologies in ways that discriminate against people with disabilities. This includes when an employer uses another company’s discriminatory hiring technologies.
DOJ May 2022 AI hiring guidance

The EEOC guidance, written in part as questions and answers, also emphasizes an employer’s responsibility when it purchases hiring technology that discriminates.

3. Is an employer responsible under the ADA for its use of algorithmic decision-making tools even if the tools are designed or administered by another entity, such as a software vendor?
In many cases, yes. For example, if an employer administers a pre-employment test, it may be responsible for ADA discrimination if the test discriminates against individuals with disabilities, even if the test was developed by an outside vendor. In addition, employers may be held responsible for the actions of their agents, which may include entities such as software vendors, if the employer has given them authority to act on the employer’s behalf.

An article about the new guidance in Wired Magazine was titled “Feds Warn Employers About Discriminatory Hiring Algorithms.” It quoted Ben Winters of the Electronic Privacy Information Center, who said the “greatest benefit of the DOJ and EEOC Guidance” is that it “puts employers on notice that the agencies are expecting them to have a higher standard for the vendors they use.”


Reasonable Accommodation policies must be implemented whenever AI hiring tools are used

Both the DOJ and EEOC Guidance specifically address the intersection of reasonable accommodation and algorithmic hiring tools. The DOJ gives examples of “practices that employers using hiring technologies may need to implement.” These include:

  • telling applicants about the type of technology being used and how the applicants will be evaluated
  • providing enough information to applicants so that they may decide whether to seek a reasonable accommodation
  • providing and implementing clear procedures for requesting reasonable accommodations and making sure that asking for one does not hurt the applicant’s chance of getting the job

The EEOC guidance focuses on the employer’s responsibility in the accommodation process even when the employer has contracted with a third party to develop or administer hiring technology:

7. Is an employer responsible for providing reasonable accommodations related to the use of algorithmic decision-making tools, even if the software or application is developed or administered by another entity?
In many cases, yes. As explained in Question 3 above, an employer may be held responsible for the actions of other entities, such as software vendors, that the employer has authorized to act on its behalf. For example, if an employer were to contract with a software vendor to administer and score on its behalf a pre-employment test, the employer likely would be held responsible for actions that the vendor performed—or did not perform—on its behalf. Thus, if an applicant were to tell the vendor that a medical condition was making it difficult to take the test (which qualifies as a request for reasonable accommodation), and the vendor did not provide an accommodation that was required under the ADA, the employer likely would be responsible even if it was unaware that the applicant reported a problem to the vendor.

Section 12 of the EEOC guidance suggests steps an employer can take to ensure its hiring technologies and vendors do not discriminate. The first suggestion is:

If the tool requires applicants or employees to engage a user interface, did the vendor make the interface accessible to as many individuals with disabilities as possible?
EEOC hiring tool guidance

Want to avoid disability discrimination in hiring?

[Image: old red, white, and gray silos in a field]

In many organizations, HR accommodation teams and technology purchasing (procurement) teams rarely interact. Accessibility teams are too often not embedded throughout an organization. Marketing and communications teams advertising jobs are not made aware of accessibility and disability inclusion requirements.

The new guidance from the US Department of Justice and the Equal Employment Opportunity Commission is a call for breaking down internal silos to ensure disability inclusion in the hiring process.

Further Reading on AI and algorithmic hiring tools and disability discrimination

Please see the Updates to the article below for additional resources published after this article was written.

Updates to this article

January 1, 2023 Update

This article has been updated to include a significant new resource on avoiding discrimination with hiring tools that use artificial intelligence (AI).

On December 5, 2022, the Center for Democracy and Technology (CDT) issued a press release titled CDT & Top Civil Rights Groups Publish Standards to Ensure Fairness in Hiring Practices that Use Automated Tech.

The press release announced the publication of “Civil Rights Standards for 21st Century Employment Selection Procedures.”

The Standards are endorsed by disability and other civil rights organizations, including the American Association of People with Disabilities (AAPD) and the Autistic Self-Advocacy Network (ASAN). They should be required reading for any organization using today’s hiring technology.
