
If you've been using AI dev tools for more than six months, you have scar tissue. The instinct to babysit every step is earned. It's also now holding you back.


You earned the right not to trust it.

But the game changed.

The Scar Tissue

That scar tissue is a set of learned behaviors that made sense at the time:

  • Remind it to run tests. Every. Single. Time.
  • Say “update the docs” because it never would on its own
  • Babysit diagram updates — ER, flow, sequence, component — because it would “forget”
  • Say “close the issue” after every fix
  • Create the feature doc, then manually prompt it to update the doc when things changed
  • Watch Cline go off the rails on multi-step tasks
  • Watch Copilot break its own rule gates
  • Watch open-source agents hallucinate entire files that don’t exist

So you watched. Every step. Every output. Every commit before it landed.

The result? A feature with Copilot or Cline took 4–8 hours. Not because you were slow — because managing the AI was a second job running in parallel.

What Changed

That instinct was correct for the tools that trained it into you. It’s now a liability for the tools that have genuinely moved past those failure modes.

A feature that used to take 4–8 hours with constant supervision now takes 30 minutes to 2 hours with a current-generation agentic tool — fully tested, docs updated, diagrams current, ready for deployment.

Here’s what the tooling absorbed so you don’t have to manage it manually:

Tests run automatically before the agent can mark something complete. You don’t remind it; the hook fires whether the agent wants it to or not.
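The gate itself is a small idea. Here’s a minimal shell sketch, with a hypothetical `run_gate` helper standing in for whatever hook mechanism your tool exposes: the completion step invokes it, and a failing test command blocks the “done” state.

```shell
# run_gate CMD...: run the project's test command; only a passing run
# yields a "complete" verdict. The helper name and verdict strings are
# illustrative, not any specific tool's API.
run_gate() {
  if "$@"; then
    echo "complete"
  else
    echo "blocked: tests failed"
  fi
}

run_gate sh -c 'exit 0'   # a passing suite: prints "complete"
run_gate sh -c 'exit 1'   # a failing suite: prints "blocked: tests failed"
```

The point is the wiring, not the script: because the tool calls the gate itself, “remember to run tests” stops being a prompt and becomes a property of the pipeline.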

Stale docs and diagrams get flagged without prompting. The agent cross-references what changed and surfaces what needs updating.
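The crudest version of that cross-reference is a timestamp comparison. A sketch under assumed paths (`src/` and `docs/` are placeholders for your layout; real agents use semantic diffing, not mtimes):

```shell
# stale_docs SRC_DIR DOCS_DIR: print each doc file that is not newer
# than the newest source file, i.e. docs that may predate the latest
# code change. Directory names are hypothetical.
stale_docs() {
  src_dir=$1
  docs_dir=$2
  # Newest source file by modification time, newest first.
  newest=$(find "$src_dir" -type f -exec ls -t {} + | head -n 1)
  [ -n "$newest" ] || return 0
  # Flag every doc whose mtime does not beat the newest source file.
  find "$docs_dir" -type f ! -newer "$newest" -print
}
```

Even this blunt version catches the common failure: code moved, docs didn’t.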

Project context persists across sessions. You don’t rebuild context at the start of every session. The agent has memory.

Dangerous commands get blocked by rules, not by you. You define the boundary once. It holds without your attention.
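A deny rule is, at bottom, a pattern match that runs before the command does. A toy sketch (the rule list and helper name are illustrative; real tools express this in config, not inline shell):

```shell
# is_blocked CMD_STRING: succeed (exit 0) if the command matches a
# deny rule. The patterns here are examples, not a complete policy.
is_blocked() {
  case "$1" in
    "rm -rf /"* | "git push --force"* | *"DROP TABLE"*)
      return 0 ;;
    *)
      return 1 ;;
  esac
}

is_blocked "git push --force origin main" && echo "refused"
is_blocked "git status" || echo "allowed"
```

You write the pattern once; the boundary holds on every command thereafter, with no attention spent.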

The Hardest Part

The hardest part isn’t learning the new tooling. The setup is straightforward.

The hardest part is unlearning the habits the old tools drilled into you.

The urge to interrupt and double-check. The instinct to re-inject context because you don’t trust it to hold. The reflex to watch every step because you’ve been burned enough times to know what happens when you don’t.

Those habits were rational responses to real failure modes. The failure modes have been fixed. The habits haven’t updated yet.

The friction isn’t in the software anymore. It’s in your nervous system.

How to Actually Let Go

The practical path is incremental:

  1. Start with a task you’ve done a hundred times and know the output cold. Let the tool run without interrupting.
  2. Review the output — not by watching the steps, but by checking the result.
  3. When it’s right (and it will be), note that you didn’t need to manage it.
  4. Extend the scope of the next task.

You’ll catch yourself reaching for the keyboard to “help” it. That’s the habit. Let it run.

You paid your dues. The tooling caught up. You can let go now.


AI Minus the Friction #4. Original post on LinkedIn.