Essay · 8 min read

Competence Debt: Skip the Grind, Lose the Mind

The uncomfortable truth: if a new breed of engineers leans entirely on AI tools (Cursor, Kiro, etc.) and never learns to write, break, and repair software by hand, they'll miss the very loop that makes complex problem-solving possible.

Debugging · Mental Models · Reliability · Career Growth

TL;DR

Senior engineers carry a hard‑won library of first‑hand failures and fixes. That experience is forged by the Experience → Reflection → Mastery loop. Skip it—and you accrue competence debt that shows up as fragile systems, security holes, and stalled careers.

Experience → Reflection → Mastery

Seasoned engineers don't just "know a language"—they've built mental models through a repeating loop of writing, breaking, debugging, and refactoring across many projects, domains, and stacks.

  • Implementation practice: Hundreds of hand‑rolled modules—REST endpoints in Go, components in TypeScript, schedulers in Java. Every stack forces you to learn its idioms (Go's errors, Rust's borrow checker, etc.).
  • Edge cases hurt (in a good way): Real data bites—nulls in JSON, version drift, GC pauses, race conditions (the JSON sketch after this list shows the flavor).
  • Deep debugging: Stepping through code, profiling CPU/memory, reading bytecode, thread dumps. That grind forges intuition.
  • Refactor → extract patterns: After solving problems in multiple contexts, you see the shape (e.g., Strategy over an Adapter) and know when it helps—or hurts; the Strategy sketch after this list shows the idea.
  • Teach it: Mentoring forces you to compress complex systems into clear mental models.
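
A minimal sketch of what "real data bites" means in practice, assuming a hypothetical Order payload: the schema promises a discount, real traffic sends null, and the naive dereference is the line that takes you down in production.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Order models a payload we "know" always carries a discount.
// Real upstream data disagrees: the field sometimes arrives as null.
type Order struct {
	ID       string   `json:"id"`
	Discount *float64 `json:"discount"` // pointer so null survives decoding
}

func total(base float64, o Order) float64 {
	// The naive happy-path version dereferences without a nil check:
	//   return base * (1 - *o.Discount)  // panics on a null discount
	if o.Discount == nil {
		return base // treat missing/null discount as zero
	}
	return base * (1 - *o.Discount)
}

func main() {
	raw := []byte(`{"id":"A-17","discount":null}`) // the edge case
	var o Order
	if err := json.Unmarshal(raw, &o); err != nil {
		panic(err)
	}
	fmt.Println(total(100, o)) // 100, instead of a nil-pointer panic
}
```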
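And a rough illustration of "seeing the shape": after the same pricing switch shows up in a third code path, the Strategy pattern falls out of the refactor. The PricingStrategy interface and the concrete rules here are hypothetical, only meant to show the direction of the extraction.

```go
package main

import "fmt"

// PricingStrategy is the shape that emerges after the same switch
// statement has been copy-pasted into a third code path.
type PricingStrategy interface {
	Price(base float64) float64
}

type FlatDiscount struct{ Off float64 }

func (f FlatDiscount) Price(base float64) float64 { return base - f.Off }

type PercentDiscount struct{ Rate float64 }

func (p PercentDiscount) Price(base float64) float64 { return base * (1 - p.Rate) }

// Checkout no longer cares which rule applies; callers pick the strategy.
func Checkout(base float64, s PricingStrategy) float64 {
	return s.Price(base)
}

func main() {
	fmt.Println(Checkout(100, FlatDiscount{Off: 15}))      // 85
	fmt.Println(Checkout(100, PercentDiscount{Rate: 0.2})) // 80
}
```

Knowing when it hurts matters just as much: one pricing rule with no second caller does not need the interface yet.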

What AI‑First Engineers Miss

  • Shallow familiarity: AI scaffolds controllers and schemas, but you never wrestle with backpressure, nulls, or protocol upgrades yourself (the backpressure sketch after this list is exactly the kind of detail that gets skipped).
  • No pattern training: You saw the code—but didn't refactor it three times, so abstractions never crystallize.
  • Debugging at arm's length: Re‑prompting replaces tracing stack traces, logs, and metrics, so tool muscle memory never forms.
  • Over‑reliance on magic: Software becomes "plug‑and‑play," not craft—trade‑offs and constraints get ignored.
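
To make "backpressure" concrete, here is a minimal sketch, with hypothetical names, of the detail a generated scaffold tends to gloss over: a bounded queue that rejects work when the consumer falls behind, instead of buffering until the process tips over.

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

var ErrOverloaded = errors.New("queue full: shed load or slow the producer")

// Submit applies backpressure: if the bounded queue is full, the caller
// hears about it immediately instead of the queue growing without limit.
func Submit(queue chan<- string, job string) error {
	select {
	case queue <- job:
		return nil
	default:
		return ErrOverloaded
	}
}

func main() {
	queue := make(chan string, 2) // deliberately tiny buffer

	// Slow consumer.
	go func() {
		for job := range queue {
			time.Sleep(50 * time.Millisecond)
			fmt.Println("done:", job)
		}
	}()

	for i := 0; i < 5; i++ {
		if err := Submit(queue, fmt.Sprintf("job-%d", i)); err != nil {
			fmt.Println("rejected:", err) // producer must back off or drop
		}
	}
	time.Sleep(200 * time.Millisecond)
	close(queue)
}
```

Rejecting at the edge forces the producer to slow down or shed load; an unbounded buffer only moves the failure to an out-of-memory crash later.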

Downstream Risks

  • Black‑box maintenance: Teams inherit code they can't explain; minor bugs snowball into outages while people re‑prompt for a fix.
  • Security blind spots: AI can introduce subtle injection or deserialization flaws that nobody on the team recognizes (see the sketch after this list).
  • Stunted growth: Prompt‑only juniors plateau; they never build the credibility to architect systems or lead refactors.
  • Innovation bottlenecks: Novel domains (consensus, realtime graphics, new hardware) need deep mental models—not autocomplete.
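
For the security bullet, a minimal before/after sketch of the classic miss: SQL built by string concatenation versus a parameterized query. The table and function names are made up; database/sql is Go's standard library, and the ? placeholder syntax varies by driver.

```go
package store

import "database/sql"

// LookupUnsafe is the shape of generated code that slips through unreviewed:
// the input is spliced straight into the SQL text, so an email like
//   ' OR '1'='1
// changes the meaning of the query (SQL injection).
func LookupUnsafe(db *sql.DB, email string) (*sql.Rows, error) {
	return db.Query("SELECT id FROM users WHERE email = '" + email + "'")
}

// LookupSafe passes the value as a bind parameter; the driver keeps data
// and query structure separate, so the same input stays harmless data.
func LookupSafe(db *sql.DB, email string) (*sql.Rows, error) {
	return db.Query("SELECT id FROM users WHERE email = ?", email)
}
```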

Preserve the Mastery Cycle (with AI as a tool)

  • Hands‑on sprints: Build key subsystems from scratch—no AI—then compare against generated scaffolds.
  • Rotate on‑call/debug duty: Everyone diagnoses and fixes prod incidents without re‑prompting.
  • No‑AI pairing days: Keep implementation/design muscles strong.
  • Design before code: Whiteboard the architecture, interfaces, and failure modes before touching any AI, as sketched below.
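
What "design before code" can produce, sketched as compilable Go stubs rather than a whiteboard photo; every name and error value here is hypothetical. The point is that the failure modes are decided before any prompt is written.

```go
package payments

import (
	"context"
	"errors"
	"time"
)

// Failure modes agreed on at the whiteboard, before any implementation.
var (
	ErrDeclined  = errors.New("payment declined")         // terminal, do not retry
	ErrThrottled = errors.New("provider rate limit")      // retry with backoff
	ErrTimeout   = errors.New("provider did not respond") // outcome unknown: reconcile
)

// Charge describes a single idempotent charge attempt.
type Charge struct {
	IdempotencyKey string
	AmountCents    int64
	Currency       string
}

// Gateway is the contract the team commits to; the AI-generated (or
// hand-written) implementation has to live inside these edges.
type Gateway interface {
	// Authorize must return within the context deadline and map provider
	// errors onto the failure modes above.
	Authorize(ctx context.Context, c Charge) (authID string, err error)

	// Capture is safe to retry with the same authID.
	Capture(ctx context.Context, authID string, timeout time.Duration) error
}
```
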
"The superpower of senior engineers isn't syntax—it's a dense map of prior failures and the instincts to avoid them."

Bottom line

AI should accelerate learning—not replace it. Use Cursor, Kiro, and friends, but keep your hands on the metal. Build, break, debug, refactor. That cycle is how you earn the judgment to tackle the unknowns waiting in tomorrow's systems.

Practice prompts → projects → postmortems