This is a guest post by Dr. Heidi Kirby. Dr. Kirby is a leader in instructional design and technology. Her focus is helping real people navigate complex systems so they can do their best work without burning out. She has worked with a wide variety of clients – from NASA to billion-dollar SaaS companies to government, retail, and non-profits.

We’ve heard stories and seen research showing that AI acts as a mirror of sorts, reflecting our values, language, and yes, even our biases. The HR/L&D world is not exempt from this. AI is uncovering flaws in processes we’ve been ignoring for years, and it’s amplifying them. However, it doesn’t help anyone to blame AI – we should be looking at what we can do to improve our processes, relationships, and organizations.

Short on time? Here’s a summary of this post:
  • AI is everywhere in hiring, but trust is nowhere. In 2025, 99% of hiring managers reported using AI, and 98% saw efficiency gains—yet 54% take issue with candidates using AI for resumes or cover letters. If robots can screen humans but humans can’t use robots, we’ve got a credibility problem.
  • The stakes are high and getting higher. With 100,000+ layoffs in January 2026 alone (the highest since 2009), candidates are navigating instability while facing AI-driven filters that often reject non-linear or diverse career paths. When “perfect match” algorithms replace human judgment, transferable skills lose out.
  • We want AI literacy… without funding it. While 77% of companies allow AI use, only 32% provide AI training (BambooHR). Add to that a 31% drop in trust in company-provided generative AI (HBR, May–July 2025), and it’s clear: we can’t demand fluency while underinvesting in capability.
  • AI efficiency isn’t as efficient as we think. According to Workday, for every 10 hours saved with AI, nearly 4 hours are lost fixing its output. Meanwhile, 70% of employees struggle with work pace and volume, and 50% report burnout. Faster output, same exhaustion – cue workslop.
  • The real gap isn’t tech; it’s accountability. Across hiring, learning, and workflow design, the pattern is the same: we automate human touchpoints, then wonder why trust erodes. AI may increase speed, but without empathy and a human in the loop, it simply scales our double standards.

AI in the Hiring Process

In some ways, AI has made the hiring process a lot easier for HR and hiring managers. A 2025 report from Insight Global found that 99% of hiring managers use AI in the hiring process, and 98% of them saw significant improvements in efficiency. Yet, we’re faced with a conundrum: although nearly all managers use AI to streamline their workflow, 54% said they take issue with candidates who leverage those same tools for their resumes or cover letters. These same companies are outsourcing entire virtual interviews to AI, but they are against job seekers leveraging these tools to better communicate their skills. Hypocritical? Absolutely. But it also poses a serious risk for the workforce that we’re seeing play out in real time. 

Diverse candidates are missing out on job opportunities because AI screening tools reject resumes whose previous titles aren’t a 1:1 match for the current role. Companies are hiring folks who look good on paper, but that doesn’t always translate to doing a better job.

As Tomas Chamorro-Premuzic writes in HBR, “When everything can be faked at scale, organizations are forced to question not just candidates, but their own evaluation tools.”

The applicants don’t trust the hiring process, and the hiring managers don’t trust the applicants. Job seekers are using AI to apply and hiring managers are using AI to assess candidates; it’s no surprise that little hiring is getting done when robots are talking to one another and humans are out of the loop. We simply can’t sustain a system where there is no trust on either side. 

How HR Can Break the Cycle

So, how do we fix this and break the cycle? It starts with a fundamental shift in HR’s perspective. With over 100,000 layoffs in January 2026 alone (the highest volume since the 2009 Great Recession), HR must recognize that job seekers are navigating a period of unprecedented instability. We cannot punish candidates for using the same tools we use to filter them. It’s time for HR to reclaim the “human” part of its name. Instead of letting robots talk to robots, we need to bring humans back into the interviews. Real conversations are the only way to assess skills in context and identify transferable ones. AI is not up to that challenge. If we want candidates to trust us with their careers, we have to stop hiding behind tech and start taking personal accountability for the people we choose to bring through the door.

AI in Learning Content

We see this same double standard in L&D. L&D teams fear learners will use AI to “cheat” or bypass dull content, yet they demand high levels of AI literacy. According to BambooHR, while 77% of companies allow AI use, only 32% provide AI training for employees. Whose responsibility is that? We want literacy, but we haven’t prioritized supporting it. If we aren’t taking ownership of upskilling our people, we’re failing our primary purpose. Who else is going to lead our organizations through this transition if not us?

The irony is that we worry about learners taking shortcuts, while we take them too. L&D teams are using AI to crank out more content faster, including AI avatars and synthetic voices that create an “uncanny valley” effect. Additionally, as a 2025 Nature study highlights, AI creates a “cognitive paradox” where automated content triggers cognitive offloading, effectively stripping away the germane load, which is needed for deep learning. These issues further erode trust with employees who are already skeptical. In fact, one HBR study on trust and AI found that between May and July of 2025, trust in company-provided generative AI fell 31%. If we thought we had an engagement problem before, imagine trying to reach our audience with content they inherently don’t trust and can’t learn from.

How L&D Can Rebuild Credibility

The good news is that teams can fix this gap before it’s too late. First, L&D teams need to stop using AI to churn out more content until they offer a clear program on safe and effective use of AI for everyone. To avoid the “uncanny valley” effect and cognitive overload, L&D teams need to start using AI for efficiency in their processes, and stop relying on generic AI avatars and animations that fail to capture the attention or trust of our learners. Teams also need to upskill in foundational instructional design skills, including learning science and cognitive load. Finally, teams need to measure their impact, not how many courses they are able to produce. However, shifting our focus from course volume to actual impact requires us to rethink our definition of efficiency entirely.

AI for Efficiency

The obsession with “content at scale” reveals a deeper misunderstanding of efficiency – we’re using AI to speed up our output, but we’re failing to use it to sharpen our strategy or fix broken or outdated processes. “Efficiency” is being pushed by every AI vendor online, but in HR and L&D, we’ve mistaken “doing things faster” for “doing things better.” We’re using AI to automate touchpoints that define our value – feedback, coaching, hiring, interviewing – under the guise of “saving time.” However, data shows we’re not using our time saved wisely.

In fact, many are using a big chunk of that time saved to fix problems AI created. According to a study from Workday, for every 10 hours of efficiency gained through AI, almost 4 hours are lost to fixing its output. And despite all the time saved, nearly 70% of people say they struggle with the pace and volume of their work, and 50% feel burnt out because the email overload, meetings, and after-hours work haven’t decreased. I haven’t taken a math class in a very long time, and even I can see that this math isn’t adding up. 

Redefining Efficiency in HR and L&D

Solving this problem will require work, but it’s necessary to create genuine efficiency. First, HR and L&D need to reinvest “efficiency gains” in high-touch, human interaction. For example, if you’re saving 1 hour per week on course development, your team should be allocating that time to 1:1 coaching or running focus groups for needs analysis. If we know AI output is a problem, we should be involving humans at some stage in all our projects and accounting for this time from the beginning. And finally, we need to use AI for efficiency but tell the story ourselves.

We need to start having AI:

  • look at our data
  • analyze our findings
  • automate administrative steps of our processes

Then, we can share those stories with our fellow employees. If we want to reclaim the trust we've lost, we have to be the ones who show up to do the human work that AI can’t do.

Conclusion: Rebuilding Trust

When we look at these three double standards in hiring, learning content, and efficiency, the common thread is a loss of perspective. We’ve treated AI as a replacement for human judgment rather than a tool to enhance it. We’ve prioritized the "robot talking to robot" workflow because it’s fast, but speed without soul is expensive noise. Breaking this cycle of double standards requires us to return to two things: empathy and accountability.

Empathy means recognizing that if we use these tools to make our lives easier, we must empower our candidates and employees to do the same. Accountability means ensuring a human is always in the loop, not just to fix when the AI gets it wrong but to provide the nuance and connection that only humans can.

AI is a mirror, and right now, it’s showing us that we’ve drifted too far from the "human" in HR and L&D. It’s time to start using technology to clear the path back to each other. If we want our organizations to trust us to lead them through this transition, we have to prove we’re still in the room.

Dr. Kirby's expertise is in combining human stories and innovative tech to create better workplaces for everyone. To read more from Dr. Heidi Kirby or learn about how your organization can make use of her talents, find her on LinkedIn.