In 2022 US Government Agencies Warned about Artificial Intelligence Hiring Tech that Discriminates against Disabled Applicants; In 2025 a Legal Complaint Was Filed on These Issues


This is an article about technology and new US government guidance about that technology. Companies and other organizations often use tech when hiring new employees. This tech may help a company review resumes. Or it may analyze video interviews and offer information about a job candidate. This type of technology can make it harder for disabled people to get hired. For example, technology that analyzes a person’s voice may make it harder for a person with a speech impairment to get hired. This may happen even when the person is qualified for the job. Two parts of the US Government, the Department of Justice (DOJ) and the Equal Employment Opportunity Commission (EEOC), wrote about this kind of tech. They both explain how to avoid discrimination against disabled people with these types of hiring tech.

Please note that some of the links in this article may be broken because the Trump administration has pulled many important documents off of federal websites. The update section (found below this summary) will provide alternative links when available. Even if the Republican administration withdraws the guidance mentioned here, its content can still be used by advocates and can still help organizations avoid discrimination.

The March 19, 2025 Update explains a new legal complaint challenging unfair AI hiring tools.

Article updated

This article has been updated since it was first published on May 29, 2022. The most recent update was added on March 19, 2025. Read the updates for this article.

robot hand choosing a person out of many on a touchscreen

The unemployment rate for people with disabilities in the United States is unacceptably high. As the US Bureau of Labor Statistics stated in a February 2022 informational release: “Across all educational attainment groups, unemployment rates for persons with a disability were higher than those for persons without a disability.”

Two new resources from the US federal government address one type of barrier to the employment of disabled people: Algorithmic and Artificial Intelligence (AI) hiring tools that discriminate.

On May 12, 2022, the United States Department of Justice issued Algorithms, Artificial Intelligence, and Disability Discrimination in Hiring. The resource describes how certain technology used by organizations during the hiring process can limit disabled people’s ability to obtain employment.

On the same day, the US Equal Employment Opportunity Commission (EEOC) issued a related resource titled The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees. [Note: this page has been taken down by the Trump administration. See the February 2, 2025 Update below to learn what this means and how to read the document on the Internet Archive.]

Two different federal agencies issued resources on this topic because, as the DOJ explains:

The Department of Justice enforces disability discrimination laws with respect to state and local government employers. The Equal Employment Opportunity Commission (EEOC) enforces disability discrimination laws with respect to employers in the private sector and the federal government. The obligation to avoid disability discrimination in employment applies to both public and private employers.
DOJ May 2022 AI hiring guidance

These resources underscore the importance of having federal government agencies committed to disability rights. The guidance from the DOJ and the EEOC should be carefully reviewed by anyone responsible for hiring (talent acquisition), Human Resources (HR), Diversity, Equity and Inclusion (DEI), technology purchases, accessibility, accommodations at work, and marketing and communication around job openings.

Types of technology addressed in the government guidance

The EEOC guidance starts with definitions of software, algorithms, and artificial intelligence (AI). It then describes the wide range of technology embodying these terms. According to the EEOC resource, technologies that may discriminate against applicants with disabilities include:

  • resume scanners that prioritize applications using certain keywords;
  • employee monitoring software that rates employees on the basis of their keystrokes or other factors;
  • “virtual assistants” or “chatbots” that ask job candidates about their qualifications and reject those who do not meet pre-defined requirements;
  • video interviewing software that evaluates candidates based on their facial expressions and speech patterns; and
  • testing software that provides “job fit” scores for applicants or employees regarding their personalities, aptitudes, cognitive skills, or perceived “cultural fit” based on their performance on a game or on a more traditional test.

Each of these types of software may include AI.
EEOC hiring tool guidance

Algorithms and artificial intelligence can lead to disability discrimination in hiring

The Department of Justice’s May 2022 guidance on AI hiring tools and discrimination starts out strong:

This guidance explains how algorithms and artificial intelligence can lead to disability discrimination in hiring.
DOJ May 2022 AI hiring guidance

After a plain language explanation of how technology is used in the hiring process, there is a clear statement of how the Americans with Disabilities Act impacts the entire employment process and technology choices:

The ADA applies to all parts of employment, including how an employer selects, tests, or promotes employees. An employer who chooses to use a hiring technology must ensure that its use does not cause unlawful discrimination on the basis of disability.
DOJ May 2022 AI hiring guidance

Here are some other highlights of the DOJ and EEOC Guidance:

Employers are responsible if they buy technology that discriminates

The DOJ guidance underscores the importance of embedding accessibility and disability inclusion in procurement processes:

Employers must avoid using hiring technologies in ways that discriminate against people with disabilities. This includes when an employer uses another company’s discriminatory hiring technologies.
DOJ May 2022 AI hiring guidance

The EEOC guidance, in part written as questions and answers, also emphasizes an employer’s responsibility for the purchase of hiring technology that discriminates.

3. Is an employer responsible under the ADA for its use of algorithmic decision-making tools even if the tools are designed or administered by another entity, such as a software vendor?
In many cases, yes. For example, if an employer administers a pre-employment test, it may be responsible for ADA discrimination if the test discriminates against individuals with disabilities, even if the test was developed by an outside vendor. In addition, employers may be held responsible for the actions of their agents, which may include entities such as software vendors, if the employer has given them authority to act on the employer’s behalf.

An article about the new guidance in Wired Magazine was titled “Feds Warn Employers About Discriminatory Hiring Algorithms.” It quoted Ben Winters from the Electronic Privacy Information Center. The “greatest benefit of the DOJ and EEOC Guidance,” he said, is that “It puts employers on notice that the agencies are expecting them to have a higher standard for the vendors they use.”

Back to top

Reasonable Accommodation policies must be implemented whenever AI hiring tools are used

Both the DOJ and EEOC Guidance specifically address the intersection of reasonable accommodation and algorithmic hiring tools. The DOJ gives examples of “practices that employers using hiring technologies may need to implement.” These include:

  • telling applicants about the type of technology being used and how the applicants will be evaluated
  • providing enough information to applicants so that they may decide whether to seek a reasonable accommodation
  • providing and implementing clear procedures for requesting reasonable accommodations and making sure that asking for one does not hurt the applicant’s chance of getting the job

The EEOC guidance emphasizes the employer’s responsibility in the accommodation process even when the employer has contracted with a third party to develop or administer the hiring technology:

7. Is an employer responsible for providing reasonable accommodations related to the use of algorithmic decision-making tools, even if the software or application is developed or administered by another entity?
In many cases, yes. As explained in Question 3 above, an employer may be held responsible for the actions of other entities, such as software vendors, that the employer has authorized to act on its behalf. For example, if an employer were to contract with a software vendor to administer and score on its behalf a pre-employment test, the employer likely would be held responsible for actions that the vendor performed—or did not perform—on its behalf. Thus, if an applicant were to tell the vendor that a medical condition was making it difficult to take the test (which qualifies as a request for reasonable accommodation), and the vendor did not provide an accommodation that was required under the ADA, the employer likely would be responsible even if it was unaware that the applicant reported a problem to the vendor.

Section 12 of the EEOC guidance suggests steps an employer can take to ensure its hiring technologies and vendors do not discriminate. The first suggestion is:

If the tool requires applicants or employees to engage a user interface, did the vendor make the interface accessible to as many individuals with disabilities as possible?
EEOC hiring tool guidance

Want to avoid disability discrimination in hiring?

old red, white, and gray silos in a field

In many organizations, HR accommodation teams and technology purchasing (procurement) teams rarely interact. Accessibility teams are too often not embedded throughout an organization. Marketing and communications teams advertising jobs are not made aware of accessibility and disability inclusion requirements.

The new guidance from the US Department of Justice and the Equal Employment Opportunity Commission is a call for breaking down internal silos to ensure disability inclusion in the hiring process.

Further Reading on AI and algorithmic hiring tools and disability discrimination

Please see the Updates to the article below for additional resources published after this article was written.

Updates to this article

March 19, 2025 Update

human-looking robot deep in thought

On March 19, 2025, the American Civil Liberties Union (ACLU) issued a press release titled Complaint Filed Against Intuit and HireVue Over Biased AI Hiring Technology That Works Worse for Deaf and Non-White Applicants. The Complaint was filed with the Colorado Civil Rights Division and the Equal Employment Opportunity Commission (EEOC) on behalf of an “Indigenous and Deaf woman who was denied a promotion on the basis of her disability and her race.”

According to the press release:

The complaint alleges that the companies violated the Colorado Anti-Discrimination Act (CADA), the Americans with Disabilities Act (ADA), and Title VII of the Civil Rights Act by using biased AI technology in hiring, resulting in systemic discrimination.

It goes on to state that:

Research shows that the type of technology underlying HireVue’s system is often unable to accurately recognize and analyze the speech of a deaf applicant, resulting in lower scores. The technology also struggles with non-white applicants, including Indigenous English speakers, whose speech patterns may differ from white applicants.

Read the ACLU complaint against HireVue and Intuit. In addition to the ACLU, the disabled person in the case, identified as D.K., is represented by Public Justice, Eisenberg & Baum, LLP, and ACLU of Colorado.

I will update this article with developments about this case as I learn of them.

Back to the original article text

February 2, 2025 Update

This article includes a link to official Guidance from the Equal Employment Opportunity Commission dated May 12, 2022 titled “The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees.” This Guidance was removed from the EEOC website during the first two weeks of the second Trump administration. It can still be read on the Internet Archive Wayback Machine here.

The AI and employment discrimination guidance was not the only resource removed from federal government websites after Republicans came to power in 2025. The New York Times reported on February 2 that “more than 8,000 web pages” had been taken down over a three-day period.

Although this Guidance is no longer on the EEOC’s website, and will likely remain off throughout the Trump administration, Lainey believes its analysis of the Americans with Disabilities Act and how it may impact Artificial Intelligence hiring tools is still accurate.

Lawyers and advocates urging compliance with the Guidance should be aware that it is no longer on the EEOC website. (Taking a document off a website may mean it is no longer official EEOC guidance, but it is possible that, legally, formal steps would need to be taken before the guidance is officially repudiated.)

Yet in a court case, this may not matter. A judge may agree with the analysis stated in the Guidance as the correct interpretation of the Americans with Disabilities Act whether or not it appears on the EEOC website. Judges and advocates can adopt the ADA analysis in the Guidance as their own without any reference to the document at all.

And as explained in the article above, the Department of Justice issued a companion Guidance on the same issue. As of today’s date, that guidance remains on the DOJ website here.

Back to the original article text

October 30, 2024 Update

The Partnership on Employment and Accessible Technology (PEAT) has published an important new resource for anyone who cares about making “AI-enabled hiring tools more inclusive and accessible for disabled job seekers.”

The resource, announced in September 2024, is titled AI & Inclusive Hiring Framework. It is a detailed, easy-to-use online portal with information for everyone in an organization whose roles touch on hiring tool fairness, including people in Human Resources (HR), Information Technology (IT), Procurement, and more.

I’ve long been a fan of PEAT and encourage readers to explore other parts of the PEAT website. The group is funded by the U.S. Department of Labor’s Office of Disability Employment Policy and managed by federal contractors at Wheelhouse Group, a Cadmus Company.

Back to the original article text

August 20, 2024 Update

In August 2024, this LFLegal website published its first guest article. Written by Artificial Intelligence (AI) fairness expert, advocate, and scholar Jutta Treviranus, the article is titled Artificial Intelligence (AI) vs. Difference.

Even though AI can also be very helpful to disabled people, the article discusses how artificial intelligence can worsen existing social inequalities and cause more unfairness for disabled people and others.

Back to the original article text

January 1, 2023 Update

I am updating this article to include a significant new resource on the important topic of avoiding discrimination with hiring tools that use artificial intelligence (AI).

On December 5, 2022, the Center for Democracy and Technology (CDT) issued a press release titled CDT & Top Civil Rights Groups Publish Standards to Ensure Fairness in Hiring Practices that Use Automated Tech.

The press release announced the publication of “Civil Rights Standards for 21st Century Employment Selection Procedures.”

The Standards are endorsed by disability and other civil rights organizations including the American Association for People with Disabilities (AAPD) and the Autistic Self-Advocacy Network (ASAN). They should be required reading for any organization using today’s hiring technology.

Back to the original article text