
The EEOC Settles Its First Lawsuit Alleging AI-Based Discrimination in Employment

The Equal Employment Opportunity Commission (EEOC) entered into a settlement agreement in its first lawsuit alleging discrimination based on artificial intelligence (AI). In the lawsuit, the EEOC alleged that iTutorGroup, an online English-language tutoring company, and its affiliated companies violated the Age Discrimination in Employment Act (ADEA) by programming their application software to automatically reject female applicants over the age of 55 and male applicants over the age of 60. The EEOC further alleged that, from late March 2020 until early April 2020, the defendants failed to hire the charging party and more than 200 other qualified tutor applicants aged 55 and older from the United States because of their age.

According to the EEOC's allegations, the charging party applied to the defendants using her real birthdate and was immediately rejected. The following day, she submitted an otherwise identical application with a more recent birthdate and was offered an interview.

The defendants denied all allegations in the Amended Complaint and took the position that their tutors were independent contractors rather than employees.

On August 9, 2023, the EEOC filed a Joint Notice of Settlement and Request for Approval and Execution of Consent Decree, which set forth the terms of the settlement agreement with the defendants. Although the presiding judge must approve the Decree before it becomes effective, its key terms include the following:

  • Defendants will pay the total gross sum of $365,000, which will be distributed among tutor applicants who were allegedly rejected by the defendants because of age in March and April 2020.
  • Defendants are enjoined from the following:
    • Rejecting or screening any tutor applicants aged 40 or over because of age
    • Requesting birth dates of tutor applicants before a job offer is made
    • Rejecting tutor applicants because of sex or screening applicants based on sex
    • Retaliating against any employee for engaging in protected activity
  • Defendants must do the following:
    • Provide an EEOC-approved memo identifying the requirements of federal antidiscrimination laws to all employees or independent contractors who may be involved in the screening, hiring, or supervising of tutor applicants or tutors;
    • Adopt and incorporate EEOC-approved antidiscrimination policies and complaint procedures applicable to the screening, hiring, and supervision of tutors and tutor applicants that meet specific requirements outlined by the EEOC;
    • Provide EEOC-approved four-hour training programs conducted by EEOC-approved third parties for all supervisory and management employees, as well as any employees or independent contractors who may be involved in screening, hiring, or supervising tutor applicants and tutors;
    • Provide training for new employees within thirty (30) days of their employment and continued training to employees on an annual basis;
    • Provide written notice to the EEOC of any verbal or written complaints of discrimination from employees or applicants; and
    • Contact all applicants who were purportedly rejected by the defendants because of age in March 2020 and April 2020, and invite them to reapply. Any applicants who reapply must be interviewed. The defendants must provide the EEOC with the outcome of each applicant's application and interview, and a detailed explanation as to why any offer was not made.
  • The EEOC may monitor the compliance of the defendants with the Decree through inspection of the premises and records of the defendants and interviews with the defendants' officers, agents, employees, and independent contractors.

The EEOC's settlement with iTutorGroup serves as a reminder to employers that the EEOC has made it an enforcement priority to ensure that employers' use of AI in the workplace complies with federal antidiscrimination laws. In January 2023, the EEOC issued its draft Strategic Enforcement Plan (SEP) for 2023 through 2027, which confirmed that the agency will focus on the use of AI throughout the span of the employment relationship, from recruitment to application to performance management.

In May 2023, the EEOC released a technical assistance document, "Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964," addressing disparate impact discrimination against job applicants and employees that may be caused by AI. The guidance makes clear that employers bear the burden of ensuring that their AI employment tools comply with federal antidiscrimination laws. Importantly, the guidance states that an employer may be held liable for the actions of its agents, including software vendors.

The EEOC's AI guidance, along with its recent agreement in the iTutorGroup case, illustrates the agency's commitment to pursuing enforcement action to rectify any disparate impact caused by AI tools used in the employment context. Regardless of whether the employer or a third party programmed the AI, the employer will be held responsible for its outcomes. Thus, employers must take affirmative steps to ensure that such AI tools comply with federal, state, and local laws.

Key Takeaways

  • Employers must ensure that any AI used in making employment decisions, including AI created by vendors, does not result in disparate treatment or disparate impact on persons in protected categories as set forth in laws like Title VII, the ADEA, and the Americans with Disabilities Act.
  • Vendors are not responsible for the employment decisions their tools make; that responsibility rests with the employer. Thus, employers cannot rely solely on vendors' representations regarding their software's compliance with discrimination laws. Employers must understand the AI being used and conduct regular audits to confirm its compliance. Employers should also evaluate whether their agreement with the AI vendor provides for indemnification or contribution in the event the employer is sued over its use of the tool.
  • Employers should ensure that employees who have access to AI systems are properly trained by the vendor or creator of the AI being used to ensure that they are using it correctly and not inadvertently creating bias with a few clicks or modifications of the software.

For additional information, or if you would like assistance in reviewing company policies and practices related to the use of AI, please contact Jennifer K. Dunlap or any member of the Firm's Labor & Employment Group.

To learn more about AI and how Baker Donelson can be of help to your business in understanding the legal implications and risk mitigation strategies, please visit our Artificial Intelligence practice overview.
