
AI Candidate Screening: How Does iCIMS Compare?

AI-Powered Candidate Screening in ATS (2023–2025): iCIMS vs. Workday, Greenhouse, SmartRecruiters, and Ashby

Overview: Rise of AI Screening and Regulatory Pressure (2023–2025)

In the past two years, applicant tracking system (ATS) vendors have increasingly incorporated AI-driven candidate screening tools – such as automated resume ranking and filtering – but with varying degrees of aggressiveness and caution. The period 2023–2025 saw heightened regulatory and ethical scrutiny around AI in hiring (e.g. New York City’s Local Law 144 bias-audit requirement).

Some ATS providers rushed to deploy AI-powered screening for efficiency, while others took a conservative “human-in-the-loop” approach to avoid bias pitfalls. Below we compare iCIMS with Workday, Greenhouse, SmartRecruiters, and Ashby in terms of their AI screening adoption, product announcements, ethical stances, and any notable partnerships or issues.

iCIMS: “Responsible AI” with Candidate Ranking and Compliance

iCIMS has integrated AI into its Talent Cloud platform for years – for example, its “Candidate Ranking” feature uses an AI-based Role Fit algorithm to match applicant skills/experience to job requirements and surface top candidates. Rather than avoiding automated screening, iCIMS embraced it (bolstered by its 2020 acquisition of AI firm Opening.io), but has been cautious and compliance-focused.
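
iCIMS does not publish the internals of its Role Fit model, but the general idea of scoring applicants against job requirements can be illustrated with a toy example. The sketch below is purely hypothetical: the fields, weights, and overlap-based scoring are assumptions for illustration, not iCIMS' actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    name: str
    skills: set[str]
    years_experience: float

@dataclass
class Job:
    required_skills: set[str]
    min_years_experience: float

def role_fit_score(applicant: Applicant, job: Job) -> float:
    """Toy 'role fit' score: a weighted blend of skill overlap and experience.

    Real ATS matching models are far more sophisticated (semantic skill
    matching, learned weights, etc.); this only illustrates the concept.
    """
    if job.required_skills:
        skill_overlap = len(applicant.skills & job.required_skills) / len(job.required_skills)
    else:
        skill_overlap = 1.0
    if job.min_years_experience:
        experience_fit = min(applicant.years_experience / job.min_years_experience, 1.0)
    else:
        experience_fit = 1.0
    return 0.7 * skill_overlap + 0.3 * experience_fit

job = Job(required_skills={"python", "sql", "recruiting analytics"}, min_years_experience=3)
applicants = [
    Applicant("A", {"python", "sql"}, 5),
    Applicant("B", {"python"}, 1),
]
# Surface top candidates for a recruiter to review -- not an automated decision.
for a in sorted(applicants, key=lambda a: role_fit_score(a, job), reverse=True):
    print(a.name, round(role_fit_score(a, job), 2))
```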

Notably, iCIMS established a Responsible AI program and proactively aligned with emerging laws. When NYC’s bias-audit law took effect, iCIMS identified Candidate Ranking as an “Automated Employment Decision Tool” (AEDT) and commissioned independent bias audits in 2022 and 2023.

These audits found favorable results and iCIMS publishes summaries to help customers comply with disclosure requirements. iCIMS stresses transparency and fairness – its CTO has emphasized that the company delivers “purpose-built AI… with the highest standards of accuracy, usability and trust”.

In practice, iCIMS’ AI provides recommendations (e.g. ranking or matching) to recruiters rather than fully automated hiring decisions, keeping recruiters involved in final screening. In 2024, iCIMS even unveiled a “Copilot” generative AI assistant (for writing job descriptions, interview questions, etc.), continuing its AI innovation but within a framework of careful, ethical use.

In summary, iCIMS has been an adopter of AI screening tools, but in a relatively cautious manner – coupling new AI features with bias audits, opt-outs, and “trusted” AI branding to mitigate risks.

This stance positions iCIMS as more conservative than some peers that pushed AI screening aggressively without such early guardrails, though less conservative than vendors who avoided AI screening altogether.

Workday: Aggressive AI Adoption and a Bias Lawsuit

Workday (maker of Workday Recruiting) has taken an aggressive approach to AI-powered screening, which has come under legal scrutiny. Workday’s system uses machine learning to help client companies filter and rank candidates – for instance, by analyzing workforce data to predict “best fit” applicants. By 2023, Workday was enabling many large employers to automate resume screening at scale, but this led to a novel class-action lawsuit alleging that Workday’s AI hiring tools “screen out” certain candidates in biased ways.

In Mobley v. Workday (2023), a job seeker claimed Workday’s algorithm – trained on employer data – disproportionately rejected applicants who were Black, over 40, or had disabilities. In mid-2024 a federal judge allowed key claims to proceed, suggesting Workday could be liable as an “agent” of its customers for AI-driven discrimination. This lawsuit underscores the reputational and legal risk of Workday’s fast adoption of AI screening.

Despite this, Workday has doubled down on AI capabilities while signaling a more “responsible AI” posture. In February 2024, Workday announced it would acquire HiredScore, an AI talent-screening platform, to enhance its recruiting and AI-driven candidate grading offerings. Workday touted that combining HiredScore with its own talent management suite would provide a “comprehensive, transparent, and intelligent” hiring solution with “AI technologies that keep humans at the center”.

In other words, Workday is investing heavily in AI for recruitment (via acquisitions and new AI features announced at events like Workday Rising), but it now emphasizes bias mitigation and human oversight in response to regulatory pressures.

Overall, Workday’s approach has been more aggressive than iCIMS’: Workday rolled out AI screening broadly (triggering legal action), whereas iCIMS implemented AI matching but under stricter compliance and auditing from the start.

Greenhouse: Deliberately Conservative – Humans in the Loop

Greenhouse has been the most conservative of these vendors regarding AI-driven screening. Greenhouse’s philosophy centers on structured hiring and reducing bias, so it has intentionally avoided using machine learning to automate candidate evaluations. In a 2023 company blog, Greenhouse answered customer concerns by flatly stating: “we don’t use ML or other algorithmic techniques to automatically make disposition recommendations, assign quality scores, or rank candidates”.

Every decision point in Greenhouse’s ATS is kept human-in-the-loop to ensure fairness and transparency. Greenhouse does offer rules-based automation (e.g. auto-rejecting an applicant who answers a knockout question incorrectly), but “only based on [an] isolated response to a custom prompt question… without any input from training data or machine learning”. For example, an employer can auto-reject candidates who say they lack a required certification, but no black-box AI is making holistic “fit” judgments.
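
A rules-based knockout of this kind is simple enough to express directly. The sketch below is a generic illustration of the pattern (not Greenhouse's code or API): a single, recruiter-defined rule applied to one isolated answer, with no model, training data, or holistic scoring involved.

```python
# Hypothetical knockout rule: reject only on one isolated, recruiter-defined
# answer to a custom application question. No training data, no model, no
# holistic "fit" score -- the logic is a fixed, auditable condition.

def knockout_decision(answers: dict[str, str], question: str, required_answer: str) -> str:
    """Return 'rejected' only when the isolated answer fails the fixed rule."""
    if answers.get(question, "").strip().lower() != required_answer.lower():
        return "rejected"             # deterministic, explainable outcome
    return "advance_to_human_review"  # everything else stays with recruiters

print(knockout_decision(
    answers={"Do you hold an active RN license?": "No"},
    question="Do you hold an active RN license?",
    required_answer="Yes",
))  # -> rejected
```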

By forbidding algorithmic ranking/scoring, Greenhouse ensures its software is not classified as an AEDT under laws like NYC Local Law 144 and thus sidesteps bias audit requirements. Instead of AI screening, Greenhouse has focused AI on low-risk uses like writing job descriptions or scheduling (areas that “have no direct bearing on whether a candidate is ultimately hired or rejected”). The company openly prioritizes compliance and bias reduction over automation speed.

Greenhouse’s stance is therefore highly cautious – far more conservative than iCIMS, Workday, or SmartRecruiters – as it essentially refuses to adopt automated candidate screening tools that might jeopardize fair hiring.

SmartRecruiters: Early AI Screening and Evolving Transparency

SmartRecruiters was an early adopter of AI in recruiting and has been quite aggressive in rolling out automated screening tools, though it has recently moved to increase transparency and fairness. Back in 2018, SmartRecruiters introduced SmartAssistant, an AI-powered engine that instantly evaluates incoming resumes and matches candidates to jobs. This AI screening capability (resume parsing and scoring) has been a core part of SmartRecruiters’ platform, widely used to “automatically screen resumes” and ease recruiters’ workload.

By 2023, SmartRecruiters released a new version of SmartAssistant with more fine-tuned controls – allowing customization of matching criteria (skills, location, etc.) and giving recruiters insight into why candidates are recommended. This upgrade was likely in response to user demands for less of a “black box.” Reviews note that SmartRecruiters “leans heavily on AI, with the support of SmartAssistant, to screen candidate resumes at speed,” applying automatic fit scores and generating AI-driven candidate summaries for recruiters.

These AI features have been praised for efficiency – SmartAssistant helps surface the most qualified candidates fast – but critics have pointed out that SmartRecruiters historically lacked built-in bias mitigation tools (e.g. anonymization) compared to some competitors.

With new regulations, SmartRecruiters has taken steps toward responsible AI use. The company was subject to NYC’s bias audit mandate (employers using SmartAssistant in NYC had to undergo an independent audit of the tool’s impact). An audit of 5.35 million candidates scored by SmartRecruiters’ AI in 2023 showed virtually equal “pass rates” across gender and only minor disparities across racial groups (e.g. 47% of male and 47% of female applicants scored above the threshold). SmartRecruiters has made the audit results public to satisfy transparency requirements.
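
The headline numbers in such an audit are selection rates and impact ratios (each group's rate divided by the highest group's rate), which is what NYC Local Law 144 requires AEDT audits to report. The sketch below computes these metrics from illustrative counts; the figures are invented and are not SmartRecruiters' audit data.

```python
# Illustrative LL144-style audit math: selection ("pass") rates per group and
# impact ratios relative to the highest-rate group. All counts are made up.

groups = {
    # group: (candidates scored above threshold, total candidates scored)
    "male":   (470_000, 1_000_000),
    "female": (470_000, 1_000_000),
}

rates = {g: passed / total for g, (passed, total) in groups.items()}
best = max(rates.values())
impact_ratios = {g: rate / best for g, rate in rates.items()}

for g in groups:
    print(f"{g}: pass rate {rates[g]:.1%}, impact ratio {impact_ratios[g]:.2f}")
# An impact ratio near 1.0 indicates similar pass rates; values well below
# 0.8 are the conventional "four-fifths rule" warning sign.
```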

While SmartRecruiters hasn’t faced known legal challenges like Workday’s, it operates in the same regulatory climate and has adjusted by improving AI “transparency and customization” and framing AI as a tool to augment (not replace) recruiters.

In comparison to iCIMS, SmartRecruiters was more aggressive in embracing automated screening early on, but is now converging toward a more cautious, open approach (adding explainability and complying with bias audits) to ensure its AI-driven filtering is used ethically.

Ashby: Startup Embracing AI with Ethical Guardrails

Ashby, a newer ATS entrant (founded in 2018, targeting high-growth startups), has in the last two years eagerly adopted AI to differentiate its product – including AI-assisted candidate screening – while simultaneously building in strong ethical safeguards. In 2023, Ashby launched “AI-Assisted Application Review,” a feature that uses an integrated AI model to analyze inbound resumes and highlight those that meet recruiter-defined job criteria. This essentially automates the first pass of screening: recruiters input objective requirements for a role, and the AI flags which applicants match those requirements.

Ashby’s approach is to speed up resume review (one blog boasts of reviewing 1,500 resumes in 6 hours with the AI assist) but without ceding full control to the algorithm. The system does not auto-reject or hire; it simply filters by criteria and presents candidates for human decision-making, reflecting a semi-automated approach.
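
Conceptually, this kind of assisted review amounts to flagging rather than deciding. The sketch below is a generic, hypothetical illustration (not Ashby's implementation): an extraction step, assumed here to be backed by a model, marks which recruiter-defined criteria a resume appears to meet, and every applicant still lands in a human review queue.

```python
from dataclasses import dataclass

@dataclass
class ReviewItem:
    applicant_id: str
    criteria_met: dict[str, bool]
    flagged: bool                      # highlighted for the recruiter
    decision: str = "pending_human"    # the tool never rejects or hires

def assess_against_criteria(resume_text: str, criteria: list[str]) -> dict[str, bool]:
    # Placeholder for a model-backed extraction step; here, a naive keyword check.
    text = resume_text.lower()
    return {c: c.lower() in text for c in criteria}

def assisted_review(applicant_id: str, resume_text: str, criteria: list[str]) -> ReviewItem:
    met = assess_against_criteria(resume_text, criteria)
    return ReviewItem(applicant_id, met, flagged=all(met.values()))

item = assisted_review("cand-42", "Senior backend engineer, 6 years with Go and Postgres",
                       criteria=["Go", "Postgres"])
print(item.flagged, item.decision)   # True pending_human -- a recruiter decides next
```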

Despite aggressively adding AI features, Ashby has been notably conservative in how it implements them. The company published “Pillars of Responsible AI” and designed its tools to be compliant and fair from day one. For example, PII is redacted from resumes before the AI model evaluates them (to avoid bias on names or demographics), and no customer data is used to train the AI. Ashby’s AI outputs come with citations and explanations, allowing recruiters to verify why a candidate was flagged.
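
A redaction step like this typically runs before any text reaches the model. The regex patterns and placeholder tokens below are illustrative assumptions, not Ashby's actual pipeline; real systems use far more robust PII detection.

```python
import re

# Illustrative-only patterns; production PII detection is far more thorough.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def redact_pii(resume_text: str, candidate_name: str) -> str:
    """Remove direct identifiers before the text is sent to a screening model."""
    text = EMAIL.sub("[EMAIL]", resume_text)
    text = PHONE.sub("[PHONE]", text)
    # Redact the candidate's own name wherever it appears.
    text = re.sub(re.escape(candidate_name), "[NAME]", text, flags=re.IGNORECASE)
    return text

print(redact_pii("Jane Doe, jane@example.com, +1 (555) 013-2222 ...", "Jane Doe"))
# -> "[NAME], [EMAIL], [PHONE] ..."
```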

Importantly, “the AI never ‘ranks’ or gives numerical ratings to applicants – a human must always be involved in decision-making”. Ashby also built in compliance features: the platform can automatically disclose AI use to candidates and offer opt-outs, in line with emerging laws. In anticipation of regulations, Ashby partnered with an auditor (FairNow) to conduct a bias audit of its screening functionality and maintain a “model inventory” for accountability.

In spirit, Ashby’s stance is very much “move fast and don’t break things.” The startup is leveraging AI to the fullest (even beyond screening – e.g. AI tools for summarizing candidate profiles and scheduling interviews), but it is doing so with a conscientious, built-in compliance approach that even some larger vendors only adopted later.

Compared to iCIMS, Ashby is similarly focused on responsible AI, though it can be considered slightly more aggressive in how quickly it rolls out new AI screening capabilities (given its need to innovate as a startup), while remaining just as careful with its ethical guardrails.

Conclusion: Is iCIMS More Conservative Than Its Peers?

Looking across these vendors, iCIMS has indeed taken a relatively cautious stance on AI-powered candidate screening in 2023–2025, especially when contrasted with certain peers. Workday and SmartRecruiters were more aggressive in automating candidate filtering – Workday even faced a high-profile bias lawsuit over its AI, and SmartRecruiters proudly leaned on AI scoring early. By contrast, iCIMS embedded AI matching into its platform but simultaneously invested in ethical AI practices (bias audits, transparency, and compliance measures) from the outset.

Greenhouse took the most conservative route of all – outright avoiding algorithmic screening to prioritize fairness – whereas iCIMS chose to leverage AI but with caution. Ashby, as a newcomer, embraced AI screening enthusiastically but in a responsible manner akin to iCIMS’ ethos.

Overall, iCIMS’ approach can be characterized as measured and cautious relative to the aggressive AI adoption of Workday and SmartRecruiters. It sits between Greenhouse’s ultra-conservative, human-only philosophy and the headlong rush into AI by some competitors. iCIMS has signaled that it values both innovation and accountability – using AI to improve recruiting outcomes, but “with the highest standards of … trust” and compliance in place.

This balanced strategy appears to have helped iCIMS avoid the notable AI controversies that hit some rivals, positioning it as a vendor that harnesses AI’s benefits while deliberately managing its risks.

Sources

Official product blogs, press releases, and news coverage were used to compare each vendor’s AI screening initiatives and stances. These include iCIMS’ announcements and blog posts on AI features and responsible use, Workday’s acquisition of HiredScore and the Reuters report on the bias lawsuit, Greenhouse’s public statements on avoiding algorithmic screening, SmartRecruiters’ discussions of SmartAssistant and an independent review of its AI scoring approach, and Ashby’s documentation of its AI-assisted review and bias audit measures. Each vendor’s position was assessed with an eye to product capabilities, ethical safeguards, and any legal/reputational issues reported. The comparison highlights how iCIMS has navigated the AI revolution in recruiting with a conservative-yet-progressive strategy relative to its peers.

iCIMS

Bias Audit Disclosures
iCIMS published results of its NYC Local Law 144 audit for Candidate Ranking.
https://www.icims.com/content/dam/www/assets/pdf/iCIMS-Bias-Audit-Summary.pdf

Responsible AI Program
“Our AI is purpose-built for hiring and used responsibly.”
https://www.icims.com/solutions/responsible-ai/

Candidate Ranking Feature
https://www.icims.com/platform/talent-cloud/candidate-ranking/

Generative AI Copilot Launch
https://www.icims.com/newsroom/icims-launches-copilot-generative-ai/

Workday

Class Action Lawsuit
Mobley v. Workday, U.S. District Court (NDCA, May 2025)
https://cases.justia.com/federal/district-courts/california/candce/3:2023cv00770/408645/128/0.pdf

HiredScore Acquisition Announcement
https://blog.workday.com/en-us/2024/workday-acquire-hiredscore.html

Greenhouse

Automated Employment Decision Tools Explained
https://www.greenhouse.io/blog/aedt-explained

Product FAQ on AI Use
https://support.greenhouse.io/hc/en-us/articles/14653609234075-AEDT-and-bias-audit-regulations

SmartRecruiters

SmartAssistant Overview
https://www.smartrecruiters.com/resources/glossary/smartassistant/

Bias Audit Report & Compliance Announcement
https://www.smartrecruiters.com/blog/auditing-our-ai-smartrecruiters-complies-with-new-york-citys-aedt-regulation/

SmartAssistant 2.0 Announcement
https://www.smartrecruiters.com/blog/introducing-smartassistant-2-0-ai-powered-hiring/

Ashby

AI-Assisted Application Review Blog
https://www.ashbyhq.com/blog/ai-assisted-application-review

How We Built Ethical AI Screening
https://www.ashbyhq.com/blog/how-we-built-ethical-ai-screening
