This is a guest post by Adam Weber. Adam is the founder of Emplify (acquired by 15Five) and an executive coach who works closely with C-suite leaders to build high-performing, people-first organizations. He brings a practical, data-informed perspective on leadership, culture, and engagement, helping companies navigate change and drive sustainable performance.
I’ve had this same conversation probably 20 times in the last few months.
A leader will say something like,
“Hey, we’ve rolled out AI tools across the organization… and people are using them… but I don’t know — it just feels like things are more chaotic than they should be.”
And I usually pause for a second, because they’re expecting me to give them some tactical fix.
And instead I say something like:
“Yeah… that makes sense.”
Because what they’re experiencing isn’t a failure of AI.
It’s the Performance Paradox.
We Added Speed… But Not Direction
If you haven’t seen the report yet (here), the idea is pretty straightforward:
We now have more tools, more access, more speed than ever before… and somehow performance feels harder to sustain. That’s weird, right?
It should feel easier. But it doesn’t.
And when we got into this during the BOOST 2026 panel — with Lori, Heidi, and David — it became obvious pretty quickly that the issue isn’t the tools. It’s what the tools are revealing.
David said something early in the conversation that I immediately wrote down:
“Speed is not a direction.”
And that’s the whole thing — we’ve basically put a turbo engine into organizations that weren’t entirely sure where they were going in the first place.
So now we just get to confusion faster.
The Sven Story (Or: Why AI Doesn’t Fix Broken Systems)
Lori told a story during the panel about an AI agent she built to respond to emails overnight.
She named him Sven. Which already makes me like him.
And at first, Sven was incredible. He was drafting responses, keeping up with clients in different time zones — doing exactly what you’d want an AI agent to do.
And then… Sven started making things up.
Including details about her mother.
Which, if you’ve ever worked with AI tools long enough, is both hilarious and deeply unsettling.
But the point wasn’t that Sven “failed.”
The point was:
Sven required structure, training, and ongoing clarity.
Without that, he just amplified the chaos.
And that’s exactly what’s happening inside organizations right now.
AI isn’t breaking things.
It’s exposing the fact that things weren’t that clear to begin with.
AI Adoption Isn’t a Technology Problem — It’s a Cultural One
If you asked me what skill matters most heading into 2026, I wouldn’t say anything technical.
It’s this:
Can your organization actually adopt AI without creating confusion?
Because right now, most can’t.
We talked about this in the report — and I’ve seen it play out over and over again:
Most companies are treating AI like an IT rollout.
“Here’s your license. Go be more productive.”
That’s like handing someone a chainsaw and saying, “You’re a carpenter now.”
Technically true. Functionally chaotic.
The real bottleneck isn’t the tool.
The real confusion is coming from:
- How decisions get made
- How priorities are set
- How work is coordinated
If those things aren’t clear, AI just surfaces the misalignment faster, with no one accountable for fixing it.
Managers Are Quietly Becoming the Most Important Role in the Company
There was a moment in the conversation where we started talking about managers, and you could almost feel the collective tension.
Because everyone knows it’s true.
Managers are carrying a lot right now.
They’re being asked to:
- Lead people
- Integrate AI
- Deliver results
- Navigate constant change
And they’re doing it in environments where the rules keep shifting.
The line I keep coming back to — and that shows up in the report — is:
When roles are shifting, clarity becomes the work.
That’s not a soft skill anymore.
That’s the job.
It’s not:
“Can you manage performance?”
It’s:
“Can you help people understand what performance even means right now?”
And that’s a very different kind of leadership.
High Performance Doesn’t Look Like What It Used To
I used to think high performance was mostly about intensity.
More output. More effort. More drive.
And don’t get me wrong — that still matters.
But it’s not the differentiator anymore.
What I see in high-performing teams today is something much simpler (and much harder):
They adjust quickly. They notice when something’s off. They talk about it. They fix it.
No drama. No blame. Just… movement.
That’s why I said this in the report:
High performance today is less about grinding harder and more about aligning faster.
And that only works if there’s enough trust in the system for people to say:
“Hey, this isn’t working.”
Which, by the way, is getting harder — not easier — in a lot of organizations right now.
The Weird Tension Nobody’s Talking About
Heidi brought this up during the panel, and I haven’t stopped thinking about it.
There’s this push happening from leadership:
“Move faster. Use AI. Do more.”
And at the exact same time, there’s a growing lack of psychological safety.
So people are moving faster…
…but they’re less willing to say,
“I don’t know if this is right.”
That’s a dangerous combination.
Because now you get:
- Faster output
- Lower quality
- Less pushback
- More hidden problems
And eventually, it all catches up.
What Should Actually Be Automated (And What Shouldn’t)
We got into this debate a bit during the session — what should AI handle vs. what should stay human.
And I think people try to overcomplicate this. It’s actually pretty simple.
AI is great at:
- Speed
- Scale
- Pattern recognition
- Preparation
Humans are still responsible for:
- Judgment
- Context
- Trust
- Decisions
David said something that I completely agree with:
“You should never automate the final decision. You can use AI to inform it. You can use it to challenge your thinking. But at the end of the day, if you remove the human from that moment? You’re not just automating work. You’re removing accountability.”
A Quick Story About Waffles (Stay With Me)
Right before the panel, I used AI to help me make waffles for my son’s birthday. (This is not where you thought this article was going, I know.)
I had half the ingredients, not the right ratios, and about 20 minutes to figure it out. So I asked AI to help me hack together a recipe. And honestly? Best waffles I’ve ever made.
But here’s why that matters:
AI didn’t replace me making the waffles. It helped me show up more prepared to do it well. And that’s the model. AI should accelerate preparation. Humans own the moment.
That applies just as much to leadership as it does to breakfast.
Great Leadership in 2026 Will Be Defined by Honesty
If I had to pick one thing that separates great leaders right now, it’s not vision.
It’s not strategy.
It’s honesty.
The leaders who say:
- “Here’s what we’re trying”
- “Here’s what’s not working”
- “Here’s what we’re still figuring out”
are building way more trust than the ones trying to project certainty.
We talked about this in the report:
Transparency isn’t just nice to have — it’s what creates the conditions for people to actually engage.
Because when leaders pretend they’ve got it all figured out… everyone else pretends too.
And that’s how you get stuck.
The Mistake I See Over and Over Again
If I had to simplify it down to one thing:
Organizations are starting with the tool instead of the problem.
Before you roll anything out, you should be asking:
- What are we trying to fix?
- What needs to change?
- What will success actually look like?
Otherwise, you just end up with more activity.
And as the report makes clear, activity is not the same thing as performance.
Final Thought: The Future Isn’t Faster — It’s More Intentional
There’s a lot of noise right now about speed.
Everything is faster. Everything is accelerating. But I don’t think the organizations that win will be the fastest. I think they’ll be the ones that are the most intentional.
The ones that:
- Create clarity when things are ambiguous
- Invest in judgment, not just knowledge
- Support managers instead of overwhelming them
- Design how humans and AI actually work together
Because the Performance Paradox isn’t really about AI.
It’s about whether we’re willing to rethink how work actually happens.
And that’s the part that’s a lot harder to automate.