Attorneys working with ClassAction.org are investigating whether class action lawsuits can be filed on behalf of job applicants who believe AI harmed them during the screening and interview process. Employers should treat this as a warning to make sure they comply with the Fair Credit Reporting Act (FCRA).
Below is a summary of the article. We encourage you to read the article in its entirety.
Specifically, the attorneys believe certain companies that offer AI pre-employment screening services for employers could be illegally providing consumer reports about job applicants without adhering to the strict requirements of the Fair Credit Reporting Act (FCRA). [R&A comment – We provide a list of FCRA requirements below.] For instance, companies that provide consumer reports must follow procedures to ensure that the reports are accurate, provide copies of the reports and investigate disputes.
AI Hiring and Screening: How Is It Used?
The use of AI in recruiting and hiring is becoming increasingly common; for instance, the World Economic Forum reported in March 2025 that roughly 88% of companies use AI for initial candidate screening.
Is There Bias in AI Hiring?
An October 2024 survey of hundreds of business leaders indicates that roughly seven in 10 companies allow AI tools to reject candidates without any human oversight—and concerns are being raised that the lack of human involvement could leave room for discrimination and AI hiring bias.
According to the article, the findings indicate that human oversight of AI in the hiring process “remains essential” and that although the tools may reduce human biases in some areas, they “introduce new patterns of discrimination that require monitoring.”
Lawsuits Filed Over AI Employee Screening
In September 2023, iTutorGroup, which provides English-language tutoring services to students in China, paid $365,000 to settle an AI screening discrimination lawsuit brought by the Equal Employment Opportunity Commission (EEOC). The lawsuit claimed iTutorGroup programmed its AI recruitment software to automatically reject applications from female candidates who were 55 or older and male candidates who were 60 or older.
Another AI hiring bias lawsuit, filed in 2024, claims Workday’s job applicant screening technology discriminates against people over age 40. The plaintiff says he was rejected from over 100 jobs on the human resources software company’s platform due to his age, race and disabilities, and four other plaintiffs have since added their own age discrimination claims. The plaintiffs argue that their applications were sometimes rejected only hours or even minutes after submission, and during non-business hours, indicating that no human reviewed them.
In July 2024, CVS privately settled a proposed class action lawsuit filed by a job applicant who claimed the company broke Massachusetts law by having prospective employees take what legally amounted to a lie detector test. Specifically, the lawsuit alleged that applicants were required to undergo HireVue video interviews, which used Affectiva’s AI technology to track facial expressions (e.g., smiles, smirks) and assign each candidate an “employability score.”
In addition to the risk of discrimination in AI hiring, concerns have been raised about data security and privacy, as AI-driven hiring tools can collect a significant amount of sensitive data, such as biometric identifiers, potentially without proper consent.
How Might AI Recruiters Violate the Fair Credit Reporting Act?
The Federal Trade Commission (FTC) notes that companies that provide screening services for employers may be considered consumer reporting agencies under the Fair Credit Reporting Act if they provide information that indicates a person’s “credit worthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living.”
Under the FCRA, consumer reporting agencies are required to follow reasonable procedures to ensure the “maximum possible accuracy” of the reports and obtain certifications from their clients that they are complying with the FCRA. The companies are also required to give consumers access to their files when requested, investigate disputes and correct or delete any inaccurate, incomplete or unverifiable information.
Employers are also subject to FCRA rules when obtaining consumer reports for employment purposes. Before obtaining the report, the employer must inform the job applicant (in a standalone format separate from an application) that they may use information from the report for employment decisions, and they must also get the applicant’s written permission to obtain the report.
The employer must also certify to the provider of the consumer report that they will not discriminate against the applicant or otherwise misuse the information in the report.
If the employer takes an adverse action against an applicant (such as rejecting their application) based on information from their consumer report, the employer must provide the applicant with a notice that includes a copy of the report and a summary of their rights under the FCRA. The adverse action notice must also include the name, address and phone number of the company that provided the report and inform the applicant that they have a right to dispute the accuracy and completeness of the information.
James P. Randisi, President of Randisi & Associates, Inc., has been helping employers protect their clients, workforce and reputation through implementation of employment screening and drug testing programs since 1999. This post does not constitute legal advice. Randisi & Associates, Inc. is not a law firm. Always contact competent employment legal counsel. To learn more, contact Mr. Randisi by phone at 410.494.0232, by email at info@randisiandassociates.com, or through the website at randisiandassociates.com.


