
The Algorithm Is on Trial: What the Workday Ruling Means for Every ATS Administrator


If you’re a system administrator or HR tech professional managing applicant screening tools, a March 2026 federal ruling just changed your exposure. A federal judge confirmed that job applicants receive the exact same ADEA protections as employees, meaning every workflow you configure is now a legal decision, not just an operational one. Here’s an iCIMS consultant’s perspective.

In March 2026, a federal judge looked at the Age Discrimination in Employment Act and said something that changes the game for everyone managing applicant-facing technology: job applicants get the same protections as employees.

Not “similar” protections. Not “some” protections. The same ones.

That means the tools we configure, the workflows we build, and the criteria we set are not just operational decisions. They are legal ones.

Why This Ruling Matters More Than Anything Else in This Case

Let’s back up. Workday, Inc. is facing a massive federal lawsuit (Mobley v. Workday, Inc., N.D. Cal.) alleging that its AI-powered applicant screening tools discriminated against job seekers on the basis of age, race, and disability. The case has been building since February 2023 and has produced several significant rulings. But one stands above the rest.

Workday’s legal team made what looked like a strong argument. They said the ADEA’s disparate impact protections were designed for employees, not applicants. They pointed to Congress’s failed attempts to amend the statute to explicitly include applicants, and argued that the Supreme Court’s 2024 Loper Bright decision (which ended courts deferring to agency interpretations of ambiguous laws) should wipe out prior precedent that extended those protections.

Judge Rita Lin rejected both arguments.

She wrote that the Supreme Court has already dismissed the logic of using failed amendments to limit a statute’s scope. She also held that Loper Bright did not disturb the existing district court precedent affirming ADEA coverage for applicants.

The result: there is no legal distinction between an applicant and an employee when it comes to age-based disparate impact claims.

This matters because Workday’s entire defense strategy rested on that distinction. Without it, every decision the algorithm makes about who moves forward and who gets screened out is subject to the same legal scrutiny as a termination or a denied promotion.

What That Means for the People Managing These Systems

If you’re an iCIMS system admin, a Workday admin, or anyone responsible for ATS implementation and configuration, here’s the shift.

Before this ruling, there was a perception (and in some circuits, legal support for the idea) that employers had more latitude in how they filtered applicants. The thinking was that the applicant pool is broad, the tools are neutral, and the employment relationship hasn’t started yet.

That framework is gone now, at least in this court.

Every knockout question, every screening criterion, every AI scoring model, every automated rejection workflow you configure now carries the same legal weight as a decision made about a current employee. If your screening process produces a pattern of adverse impact against applicants over 40, the fact that they were “just applicants” is no longer a shield.

And the scale is staggering. Workday disclosed in its own filings that approximately 1.1 billion applications were rejected through its system during the relevant period. The court-certified collective could include hundreds of millions of members. The court ordered Workday to hand over a list of every customer using its AI screening features so impacted applicants can be identified and notified.

The Full Picture: Where This Case Stands

Here’s a quick timeline of the major rulings.

July 2024: Judge Lin denied Workday’s motion to dismiss and ruled the company could be held liable as an “agent” of employers using its platform. The court distinguished between simple tools (spreadsheets, email) and Workday’s AI, which actively scores, ranks, and recommends whether to advance or reject candidates. The court stated that drawing an artificial distinction between software decisionmakers and human decisionmakers would gut anti-discrimination laws.

May 2025: The court granted preliminary certification of a nationwide collective action under the ADEA. The collective includes all individuals aged 40 and over who were denied employment recommendations through Workday’s platform from September 24, 2020 to the present.

July 2025: The court expanded the collective to include applicants processed using Workday’s HiredScore AI features.

December 2025: The court approved a notice plan and ordered Workday to produce its customer list so affected applicants could be notified.

March 2026: The court rejected Workday’s argument that ADEA disparate impact protections apply only to employees, ruling that applicants are equally protected.

What Is Still Being Litigated

The age discrimination claim is the only one that has achieved collective certification. Race claims under Title VII, disability claims under the ADA, and certain California state law claims are still being litigated individually and have not been certified for class treatment.

In March 2026, the court dismissed some of those state claims and an individual plaintiff’s disability claim. Plaintiffs have since filed an amended complaint to revive them.

Both sides are now in discovery. Expert testimony on algorithmic bias will follow. Workday still has the right to move for decertification of the collective, and the court has not yet ruled on whether the tools actually discriminate. What has been established is that the allegations can proceed collectively, and that a software vendor can be held legally responsible as an agent.

This case could take two to five years to resolve. It’s not over. But the precedent being set along the way is already reshaping the landscape.

The iCIMS Consultant’s Action List After Mobley v. Workday

This is not a “wait and see” situation. If you are configuring, maintaining, or managing any platform that screens, scores, ranks, or filters applicants, here is your action list.

Audit your screening workflows for adverse impact. Pull your data and analyze rejection rates by age, race, sex, and disability status. If your screening criteria or AI tools are producing a pattern of disparate results, you need to know about it before someone else does the math for you. If you need help evaluating your current configuration, working with an experienced iCIMS consulting partner can help you identify exposure points before they become legal ones.
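A common starting point for this kind of audit is the EEOC’s “four-fifths rule”: if the selection rate for a protected group falls below 80% of the rate for the most-favored group, that is a red flag for disparate impact. Here is a minimal sketch of that check for age, assuming you can export applicant records with an age and an advanced/rejected outcome. The field names are illustrative, not an iCIMS export format, and a flag from this check is a prompt for deeper statistical analysis, not a legal conclusion.

```python
# Minimal four-fifths-rule sketch for age-based adverse impact.
# Record fields ("age", "advanced") are assumptions for illustration.

def selection_rate(records, group_filter):
    """Fraction of records matching group_filter that advanced."""
    group = [r for r in records if group_filter(r)]
    if not group:
        return None
    return sum(1 for r in group if r["advanced"]) / len(group)

def four_fifths_check(records):
    """Compare the 40-and-over selection rate to the under-40 rate.
    A ratio below 0.8 is the conventional adverse-impact red flag."""
    rate_over_40 = selection_rate(records, lambda r: r["age"] >= 40)
    rate_under_40 = selection_rate(records, lambda r: r["age"] < 40)
    if not rate_over_40 or not rate_under_40:
        return None  # not enough data to compare
    ratio = rate_over_40 / rate_under_40
    return {"over_40": rate_over_40, "under_40": rate_under_40,
            "ratio": ratio, "flag": ratio < 0.8}

# Illustrative data: 30% of over-40 applicants advance vs. 60% of under-40.
sample = (
    [{"age": 45, "advanced": i < 3} for i in range(10)]
    + [{"age": 30, "advanced": i < 6} for i in range(10)]
)
print(four_fifths_check(sample))  # ratio of 0.5 flags the over-40 group
```

In practice you would run the same comparison across every screening stage, not just the final outcome, since adverse impact can accumulate at a single knockout step.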

Document your configuration decisions. Every knockout question, every screening threshold, every scoring weight should be documented with a clear business justification. “That’s how it was set up when I got here” is not a defensible answer.
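One lightweight way to make that documentation auditable is a structured log entry for each rule, kept outside the ATS itself. The schema below is a sketch of one possible shape, not an iCIMS format; the point is that every rule carries an owner, an effective date, and a written business justification.

```python
# Illustrative screening-rule audit record; the schema is an assumption,
# not an iCIMS format or API.
from dataclasses import dataclass, asdict
import json

@dataclass
class ScreeningRule:
    rule_id: str
    description: str             # what the rule does, in plain language
    business_justification: str  # why the rule exists for this role
    owner: str                   # who approved it
    effective_date: str          # when it took effect (ISO 8601)

rule = ScreeningRule(
    rule_id="KO-014",
    description="Knockout: reject applicants without a valid driver's license",
    business_justification="Role requires daily driving of a company vehicle",
    owner="HR Operations",
    effective_date="2026-03-15",
)

# Serialize for a version-controlled audit trail.
print(json.dumps(asdict(rule), indent=2))
```

A record like this turns “that’s how it was set up when I got here” into a documented decision with a named approver and a stated rationale.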

Review your vendor contracts. Your agreements with AI hiring tool vendors should include representations about EEO compliance, indemnification for bias claims, and transparency about how the algorithm makes decisions. If your contract is silent on these points, it’s time for a conversation.

Preserve applicant data. Rejection records, screening scores, workflow configurations, and any documentation of how AI features were deployed should be retained. If litigation reaches your organization, your defense depends on the records you kept.

Understand that “we didn’t build it” is not a defense. The Mobley court has established that employers who delegate hiring decisions to third-party AI tools do not delegate their legal exposure. If a vendor’s tool produces discriminatory outcomes, the employer is on the hook too.

Stay current on the regulatory landscape. California adopted final regulations on automated decision-making systems in March 2025, including AI hiring tools. Other states are following. A federal executive order in April 2025 directed agencies to reduce disparate impact enforcement, but that does not affect private lawsuits. The legal exposure is coming from state regulators and individual plaintiffs, and that pipeline is growing.

Understanding these risks and building compliant screening workflows from the start is one of the clearest ways to protect your iCIMS ROI and your organization’s reputation.

The Bottom Line

Mobley v. Workday has produced a lot of significant rulings. But the one that should keep you up at night is the simplest one to understand.

The applicant is now legally indistinguishable from the employee when it comes to age-based disparate impact protections.

That means the tools we configure, the workflows we build, and the criteria we set are not just operational decisions. They are legal ones. Every screening rule that touches an applicant now carries the same weight as a policy applied to your current workforce.

For those of us managing these systems, that is not a footnote. That is the whole story.

FAQ

What is the Mobley v. Workday lawsuit about? Mobley v. Workday is a federal lawsuit alleging that Workday’s AI-powered applicant screening tools discriminated against job applicants on the basis of age, race, and disability. It is one of the first large-scale legal tests of AI hiring tools in federal court.

Does the ADEA protect job applicants or just current employees? As of the March 2026 ruling in this case, the court held that the ADEA’s disparate impact protections apply equally to job applicants and employees. There is no legal distinction between the two when it comes to age-based discrimination claims.

Can a software vendor be held liable for hiring discrimination? Yes. The court ruled that Workday can be held liable as an “agent” of the employers using its platform, because its AI tools actively participate in the hiring process by scoring, ranking, and recommending candidates. This principle could extend to other ATS and AI hiring tool vendors.

What should iCIMS system admins do in response to this ruling? Audit your screening workflows for adverse impact by protected class, document your configuration decisions with business justifications, review vendor contracts for EEO compliance and indemnification provisions, and preserve all applicant data and rejection records.

Does this lawsuit only affect companies using Workday? No. The legal principles being established in this case, particularly around vendor liability and applicant protections, apply to any company using AI-powered screening, scoring, or ranking tools in the hiring process, regardless of platform.

 
