Building engineers in the age of AI

May 2025


How do we build the next generation of senior engineers if AI does the work that used to train them?

Junior engineers used to spend their first years writing boilerplate, fixing small bugs, and slowly building mental models of how systems fit together. That work wasn't exciting, but it was formative. You learned the codebase by touching it. You developed intuition by making mistakes and cleaning them up. The reps mattered.

AI changes this. A junior engineer with Claude can produce working code faster than they can understand it. The output looks fine. The tests pass. But the understanding that used to come from writing that code - the hard-won intuition about why things are shaped the way they are - where does that come from now?

The productivity trap

Every team I've worked with runs the same cycle: juniors use AI to ship more code faster, PRs grow larger while the shared context behind them grows thinner, and the seniors start pushing back in review.

The seniors aren't being obstructionist. They're the ones who'll be on call when the system breaks on a Sunday night. They need to trust that the code can be understood, debugged, and fixed under pressure. When PRs get bigger and context gets thinner, that trust erodes.

The deeper problem: if we optimise purely for velocity, we stop investing in the skills that create good senior engineers. We trade tomorrow's capability for today's throughput.

What we still need engineers to learn

Some skills matter more than ever. System reasoning, for one: understanding how components interact, where the failure modes are, what happens when assumptions break. You don't get this from reading code. You get it from building and debugging systems over time.

Then there's judgment under uncertainty. Knowing when to be cautious and when to move fast. Recognising when a change is safe and when it's risky. This is pattern recognition built from experience, not something you can prompt for.

Finally, there's the ability to explain why the system works the way it does, not just what it does. That matters for code review, for onboarding, and for debugging production incidents when nobody has time to reverse-engineer the original intent.

None of these come from generating code faster. They come from wrestling with hard problems over time.

A different approach to AI-assisted development

I've been mentoring an early-career engineer who's strong with Claude Code but has no deep infrastructure background. Instead of chasing agentic abstractions, we focused on a handful of working practices.

Incremental refactors matter most. AI will happily rewrite an entire module if you ask, and the output might even be better. But a 500-line diff that restructures a module is impossible to review with confidence. A 30-line diff that extracts one function is easy to review, easy to test, and easy to revert.

More importantly, breaking work into small pieces forces the engineer to understand each piece. The AI handles the typing. The human makes the decisions about what to change and why. The skill-building happens in the decision-making, not the implementation.
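As a concrete illustration of the kind of diff described above, here is a hypothetical extraction. The names (`validate_payload`, `handle_request`) and the validation rules are invented for the example; the point is the shape of the change, not the domain.

```python
# Hypothetical example of a small, reviewable extraction.
# Before, this validation logic sat inline in the request handler,
# where it was hard to test in isolation. The diff that introduced
# validate_payload touches only a few lines of the handler.

def validate_payload(payload):
    """Return a list of validation errors; an empty list means valid."""
    errors = []
    if "user_id" not in payload:
        errors.append("missing user_id")
    if not isinstance(payload.get("items", []), list):
        errors.append("items must be a list")
    return errors

def handle_request(payload):
    # The handler now delegates to the extracted function.
    errors = validate_payload(payload)
    if errors:
        return {"status": 400, "errors": errors}
    return {"status": 200, "count": len(payload["items"])}
```

A reviewer can check this change in one sitting: the extracted function has a clear contract, the handler's behaviour is unchanged, and reverting it is a one-commit operation.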

The organisational question

This matters beyond individual mentoring. Technology organisations need to think about where their future senior engineers will come from.

If AI handles all the work that used to build foundational skills, we need to be deliberate about creating other paths to expertise.

I'm not arguing for nostalgia. Expertise doesn't appear by magic. It's built through specific kinds of work. If AI changes what junior work looks like, we need to consciously create the conditions for expertise to develop.

Five years from now

The question I ask myself: in five years, who will diagnose the production incident when the pager goes off? Who will understand why the system is designed the way it is? Who will make the judgment call on whether a risky migration should proceed?

If we spend those five years maximising AI-assisted throughput without investing in skill development, we'll have a team that can generate code but can't reason about systems. That's a bad trade.

AI doesn't replace engineering judgment. It changes how we build it. The organisations that get this right will compound the advantage over years. The ones that don't will wake up with a team that can generate code but can't diagnose why the system broke.