The Litigation Risks of Using AI in Hiring: What Startups Should Know About Employment Law Compliance

According to a 2024 Resume Builder survey of nearly 1,000 companies, the use of AI in recruitment and hiring has become increasingly common:

  • 82% of companies use AI to review resumes

  • 40% of companies employ AI chatbots to communicate with candidates

  • 23% of companies use AI to conduct interviews

  • 64% of companies apply AI to review candidate assessments

  • 28% of companies use AI for onboarding new hires

  • 42% of companies scan social media or personal websites as part of the hiring process


Although AI can help companies streamline recruitment and hiring, there are pitfalls to be aware of. Companies should ensure that any AI screening tools they use do not run afoul of federal, state, or local anti-discrimination laws, and should take steps to protect their businesses from potential lawsuits.

For example, federal law prohibits discrimination against individuals over the age of 40 in hiring and firing decisions. A recent age discrimination lawsuit illustrates some of the risks of outsourcing screening to AI. In a lawsuit against Workday Inc., Derek Mobley claimed that Workday's artificial intelligence (AI)-driven applicant screening tools systematically disadvantaged him and other older job seekers. Workday provides an HR and finance platform used by over 11,000 organizations worldwide, and one of its offerings is algorithmic screening of applicants to jobs posted by those organizations. Mobley, who is over the age of 40, submitted more than 100 applications through Workday’s platform and was rejected from all of them. His lawsuit, which claims that the AI tools unfairly penalize older candidates, was initially dismissed, but he was allowed to file an amended complaint. The U.S. District Court for the Northern District of California has since allowed Mobley’s case to proceed under the Age Discrimination in Employment Act (ADEA) on a disparate impact theory. In May 2025, the Court allowed the lawsuit to move forward as a nationwide collective action that other affected individuals can “opt in” to. Workday has denied the allegations, and the case is ongoing.

Here are some key takeaways for employers using AI in their recruiting and hiring process:

  1. Review vendor and internal protocols: Ensure that any AI screening tool you use does not disproportionately filter out candidates based on protected categories such as gender, age, and race. Review applicable federal, state, and local laws, and communicate with vendors about their compliance and screening policies.

  2. Maintain clear documentation: Keep clear records of hiring, compensation, and termination decisions. Ensure that job description requirements are non-discriminatory and tied to the essential functions of the position. For example, if the position has physical components, a job description may require that an employee be able to lift a certain amount, but only if that requirement reflects the actual essential functions of the role.

  3. Keep human oversight: Make sure that hiring decisions are not entirely outsourced to AI. For resume screening, analyze outcomes across race, gender, and other protected classes where possible (see the sketch after this list). If you notice significant disparities, review the screening system and applicant pool to see whether you can identify the causes.

  4. Keep an eye on legal developments: The EEOC (Equal Employment Opportunity Commission) and DOL (Department of Labor) have withdrawn their federal-level AI guidance, and some states, such as Colorado and Illinois, are stepping in with laws governing the use of AI in screening and interview tools. In New York City, Local Law 144 went into effect in July 2023; it prohibits employers and employment agencies from using an automated employment decision tool unless the tool has been subject to a bias audit within one year of its use, information about the bias audit is publicly available, and certain notices have been provided to employees or job candidates.
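To make the outcome analysis in item 3 more concrete, the sketch below shows one simple, illustrative way to check screening results: compare each group's selection rate to the highest group's rate and flag impact ratios below 0.8, the threshold commonly called the "four-fifths rule," which is also the kind of calculation bias audits under laws like NYC Local Law 144 are built around. The candidate records and group labels are hypothetical, and a quick check like this is a starting point, not a substitute for a formal bias audit or legal advice.

```python
# Minimal sketch of a four-fifths-rule style check on screening outcomes.
# The candidate records and group labels below are hypothetical examples.

from collections import defaultdict

# Each record: (protected-class group label, passed AI screening?)
candidates = [
    ("under_40", True), ("under_40", True), ("under_40", False), ("under_40", True),
    ("40_and_over", False), ("40_and_over", True), ("40_and_over", False), ("40_and_over", False),
]

# Tally total applicants and advancing candidates per group.
totals = defaultdict(int)
advanced = defaultdict(int)
for group, passed in candidates:
    totals[group] += 1
    if passed:
        advanced[group] += 1

# Selection rate = advancing candidates / total applicants for each group.
selection_rates = {g: advanced[g] / totals[g] for g in totals}
highest_rate = max(selection_rates.values())

# Impact ratio = group's selection rate / highest group's selection rate.
# A ratio below 0.8 (the "four-fifths rule") is a common red flag that the
# screening tool and applicant pool deserve a closer look.
for group, rate in selection_rates.items():
    ratio = rate / highest_rate
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {ratio:.2f} ({flag})")
```

If a group's impact ratio falls below 0.8, that does not by itself prove discrimination, but it is the kind of disparity that should prompt a review of the screening criteria, the vendor's model, and the applicant pool.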

If you have specific questions about using AI in recruitment and hiring, let’s connect and make sure your business is protected against potential discrimination claims.
