AI · Communication · Productivity · Software Engineering · Career

Why Your AI Agent Doesn't Understand You (And It's Not the AI's Fault)

6 min read

Look, I need to tell you something that took me way too long to figure out.

For years, I thought the path to becoming a great developer was simple. Learn more technologies, write better code, ship faster. That's it. That's the formula. And for a while, it worked. I got better at the technical stuff. I picked up frameworks, built projects, landed roles, grew in my career. From the outside, things were going well.

But I was missing the bigger picture entirely.


The thing nobody warned me about

I wasn't always a quiet person. Back in school, I was that kid. Active, talkative, always in the middle of things. But somewhere between high school and college, something shifted. I became reserved. I stopped speaking up. I'd have ideas in my head, clear as day to me, but when I tried to explain them out loud? Nothing came out right. It was like there was a wall between what I was thinking and what I was actually saying.

And I just accepted it. I told myself it didn't matter. I'm a developer, right? My code speaks for me. My work speaks for me. I don't need to be a great communicator. I just need to be a great coder.

So that's what I focused on. Technical skills. Frameworks. Problem-solving. And yeah, I got better at all of that. But my communication? It stayed exactly where I left it. Somewhere back in high school, collecting dust.


The conversation that changed things

Someone I deeply respect sat me down one day and said something that genuinely shook me. They pointed out that I'd invested years into my technical growth but had completely ignored personal development. Specifically, communication. And the thing that hit hardest was realizing they were right. I had treated communication like it was optional. Like it was a soft skill that didn't really count.

But here's what I didn't understand back then. Communication isn't a soft skill. It's the skill that makes every other skill actually useful.

Think about it. You can write the most elegant code in the world, but if you can't explain your approach in a PR review, it doesn't land. You can architect a brilliant solution, but if you can't pitch it in a meeting, someone else's worse idea wins. You can spot the exact bug in production, but if you can't describe what's happening clearly to your team, you're just the person who "thinks they found something."

I had to learn all of this the hard way.


Then AI made it impossible to ignore

Here's where this gets interesting for us as developers right now.

I used to think prompting AI was a technical skill. Like there's some magic syntax or secret framework that makes Claude or Copilot or Cursor just get it. I spent hours tweaking prompts, adding constraints, formatting things in bullet points. The whole circus.

And some of that helps. But the real breakthrough had nothing to do with prompting techniques. It had everything to do with how clearly I communicate as a person.

A few weeks ago, I was building a feature and wrote what I thought was a solid prompt. Gave the AI all the context. Described exactly what I wanted. It came back with something that technically worked but completely missed the point. It solved a different problem. One I never asked it to solve.

My first reaction? Classic "this AI is dumb" energy. But then I re-read my own prompt. And I could see exactly why it went sideways.

I told the AI what to build. I never told it why I was building it.


Goals vs. tasks. The shift that changes everything.

Goals vs Tasks: The difference between vague task-oriented instructions and clear goal-oriented communication

I was watching a Theo (t3.gg) video recently where he talked about this exact distinction. He shared a story about prompting an AI to build a chess engine that could beat Stockfish. The AI, instead of writing its own engine, just downloaded Stockfish and made it play against itself.

Technically, it followed the instructions. But it completely missed the goal.

The fix was simple. Instead of describing the task, he reframed it to share the actual goal. Same words, almost. Totally different outcome.

I started applying this everywhere. Not just with AI, but with people too. When I tell a colleague "can you refactor this component," that's a task. But when I say "this component is getting hard to maintain and I'm worried it'll slow us down next sprint, can you simplify it?", that's a goal with context. The output you get from the two is night and day.
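To make that contrast concrete, here's a minimal sketch. The two prompt strings and the little heuristic are mine, not from the post or any real tool; they just show the difference between a bare task and a goal with context, and one rough way to self-check a prompt before sending it:

```python
# Illustrative only: the same request phrased as a bare task vs. a goal
# with context. The prompts and the heuristic below are hypothetical.

task_prompt = "Refactor this component."

goal_prompt = (
    "This component has grown hard to maintain, and I'm worried it will "
    "slow us down next sprint. Can you simplify it? Reduce the nesting, "
    "split out helpers, and keep the public props unchanged."
)

def has_goal_context(prompt: str) -> bool:
    """Rough heuristic: does the prompt explain *why*, not just *what*?"""
    # Words that usually signal motivation or intent, not just an action.
    signals = ("because", "worried", "so that", "goal", "maintain")
    return any(word in prompt.lower() for word in signals)

print(has_goal_context(task_prompt))  # False: an action with no motivation
print(has_goal_context(goal_prompt))  # True: the "why" is in the prompt
```

A keyword check like this is obviously crude; the real test is the one in the next section, reading the prompt as if you were a stranger to the project. But even a dumb heuristic makes the point: the task version carries zero intent for the agent to work with.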

And that's the thing. AI doesn't fill in your gaps. Humans are generous. Your teammate hears "fixed the bug" and their brain fills in the context from Slack threads and standups and shared understanding. AI takes your words at face value. Every gap in your communication becomes a bug in your output.

This isn't an AI problem. This is an us problem. AI just made it visible.


Communication is the skill with no expiry date

Communication compounds forever: skills like code reviews, technical docs, standups, and relationships all build on each other

Every technical skill we learn has a shelf life. Frameworks change. Languages evolve. The hot tool today is legacy tomorrow. I've watched it happen with jQuery, with Angular 1, with class components in React. The thing you mastered last year might not matter next year.

But communication compounds forever.

Getting better at explaining your ideas clearly doesn't just help you write better prompts. It makes your code reviews sharper. Your technical docs more useful. Your standups less painful. Your one-on-ones more productive. Your relationships, at work and outside of it, stronger.

I wish someone had told me this when I was in college, sitting quietly in the back of the room, convincing myself that being reserved was fine because my code would do the talking. Code doesn't talk. You do.


Something you can try today

Next time you're about to prompt an AI agent, try this before you hit enter. Read your prompt back and ask yourself: if I gave this exact message to a smart junior developer with zero context about my project, would they build the right thing? Or would they build something that matches my words but misses what I actually need?

If the answer is "they'd probably miss it," you haven't communicated clearly enough yet. And that's not their fault.

The devs who are going to win in this AI era aren't going to be the ones who write the best code. They're going to be the ones who communicate the best. The ones who can describe a problem so clearly that any agent, whether that's a human or an AI, can run with it and nail the goal.

That's the edge. Not a new framework. Not a fancier model. Just the ability to say what you actually mean.


Get good at comms. It won't just make you a better engineer. It'll make you a better everything.