AI in Recruiting: The Good, the Bad, and the Ugly

  • By 2025, approximately 87% of companies are using AI tools in their hiring process—primarily for candidate sourcing, screening, and automating admin tasks.
  • AI can reduce time-to-hire by up to 70%, improve offer acceptance rates by roughly 18%, and cut cost-per-screening by as much as 75%.
  • About 68% of recruiters believe AI helps reduce unintentional bias, though 35–56% still worry it may screen out qualified or diverse candidates.
  • AI interview tools can reduce sentiment-driven bias by over 41%, improving fairness in early-stage assessments.

The Good: Power and Precision

AI isn’t new to recruiting, but its role has expanded dramatically in recent years. In 2025, it’s embedded in nearly every stage of the hiring process. According to recent reports, 87% of companies use AI-powered platforms to source, screen, and engage candidates. Tools powered by natural language processing and machine learning help recruiters parse resumes, score candidate fit, and even automate initial outreach campaigns—freeing up hours that would have been spent manually scanning resumes.

Recruiting teams are also using AI to improve quality of hire. Some tools analyze candidate behavior and engagement patterns to predict job performance and retention likelihood. Others integrate with assessment platforms to tailor candidate experiences. Companies like Hilton and Unilever have reported major gains in both efficiency and effectiveness, with reduced time-to-fill and better long-term employee performance as a result.

At its best, AI can deliver faster screening, more targeted outreach, and data-driven decisions. The benefits are measurable: reduced time-to-hire, lower costs, and better alignment between candidate and role.

The Bad: Bias, Blind Spots, and Black Boxes

While AI offers speed and consistency, it’s not immune to bias or error. In fact, some of the same algorithms designed to improve fairness can inadvertently screen out qualified candidates due to incomplete data or flawed training sets. A widely cited case involved Amazon’s experimental hiring algorithm, which penalized resumes that included the word “women’s”—a reflection of biased training data, not intentional exclusion.

A 2024 study published in the Journal of Business Research further underscores these concerns. The researchers audited multiple commercial AI resume screening tools and found evidence of inconsistent outcomes, candidate ranking discrepancies, and embedded gender bias in systems that were designed to appear neutral. Their findings suggest that even AI solutions marketed as objective can perpetuate systemic bias when training data or matching criteria aren’t rigorously validated. The study’s authors call for increased transparency and the use of external audits to mitigate harm and ensure fairness in automated hiring systems.

And then there’s the legal side. New York City’s AI hiring law and Illinois’ regulations on video interview analysis are early signals of stricter compliance standards ahead. Companies using AI without proper oversight may find themselves unintentionally violating emerging hiring laws.

The Ugly: Replacing the Human Element

Efficiency can come at a cost, particularly when companies use AI to completely replace human interaction. Surveys show that 67% of job seekers are uncomfortable with AI reviewing resumes and making hiring decisions. Candidates often cite a lack of personalization, poor communication, and uncertainty about how their information is being used.

Despite this, 24% of organizations report using AI to conduct entire interviews, and that number is expected to grow. While the technology behind AI interviews is improving, it lacks the relational nuance recruiters bring to the table. Culture fit, adaptability, and leadership potential are difficult to measure with code alone.

When overused, AI can create a cold, confusing experience for candidates and diminish a company’s employer brand—especially when applicants are left guessing whether a human ever saw their resume.

Where We Go From Here: AI as a Strategic Co-Pilot

At HT Group, we believe the future of recruiting is collaborative. AI is a powerful tool, but it’s just that—a tool. The real magic happens when experienced recruiters use AI to enhance, not replace, their expertise.

We use AI for what it does best: automating admin tasks, surfacing qualified candidates faster, and helping us understand evolving talent market trends. But when it comes to interviews, assessments, and final decisions, we rely on human intuition, judgment, and relationships.

If you’re exploring AI in your hiring strategy, consider these best practices:

  • Use AI to support—never replace—your team’s decision-making.
  • Regularly audit tools for accuracy, fairness, and compliance.
  • Keep candidates informed and engaged throughout the process.
  • Ensure that your technology aligns with your brand’s values and candidate experience goals.

AI isn’t going away. But neither is the recruiter. The best hiring outcomes will always come from tech-enabled teams who still put people first.