The instinct that's defined the best engineers for decades is the same one that'll keep them relevant. They just need to point it somewhere new.
"AI is just a tool." That's become the reassuring thing developers tell each other, usually right after reading another headline about how they'll all be unemployed by Christmas. Both positions have become so loud that they've drowned out anything actually useful.
The doom crowd underestimates developers' ability to adapt. The "just a tool" crowd undersells the magnitude of the shift.
Calling AI "just a tool" is like calling a chainsaw "just a tool" when you're comparing it to a handsaw. It's technically correct, and it tells you absolutely nothing about how to think differently, work differently, or stay relevant.
What I think actually matters starts with something I've believed for a long time: lazy developers are the best developers.
I don't mean lazy in the pejorative sense. I mean the instinct that makes a developer do something twice, feel the friction, and immediately think: I should never have to do this again.
This is the developer who writes the script, builds the CLI tool, automates the deployment pipeline, not because someone asked them to, but because doing the same tedious thing a third time is intolerable.
Every great developer I've worked with has this trait. It's an allergic reaction to repetitive work, channeled into something productive. Larry Wall identified laziness as one of the three great virtues of a programmer, alongside impatience and hubris. Those virtues have held up for three decades as defining characteristics of the best developers. AI doesn't invalidate them. AI rewards them.
That laziness instinct doesn't vanish when AI shows up. It moves up a level.
The best developers I'm watching right now aren't writing more code with AI assistance. They're doing something subtler. They're shifting from building systems to building systems that build systems.
They're not writing the function anymore. They're designing the prompt architecture, the validation pipeline, the feedback loop that produces and verifies the function. They've moved from the shop floor to the factory floor.
It's the difference between being a chef and designing a kitchen. You need to know how to cook to design a great kitchen, but the output isn't a single dish. It's a system that produces dishes at scale. You still need to understand software, what good code looks like, how systems fail, where the edge cases hide. That knowledge just gets applied one level up.
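The loop one level up — produce a candidate, verify it, feed failures back — has a simple shape. Here's a minimal sketch: the model call is stubbed out as `draft_function` so the example runs on its own, and the `slugify` target, the test cases, and the retry budget are all invented for illustration.

```python
def draft_function(prompt: str) -> str:
    # Placeholder for an LLM API call; in a real pipeline this is the
    # model. Here it just returns a canned candidate so the sketch runs.
    return (
        "def slugify(title):\n"
        "    return title.strip().lower().replace(' ', '-')\n"
    )

def validate(source: str) -> tuple[bool, str]:
    """Compile the candidate and run it against known cases."""
    namespace: dict = {}
    try:
        exec(source, namespace)  # define the candidate function
        fn = namespace["slugify"]
        cases = {"Hello World": "hello-world", "  Trim Me ": "trim-me"}
        for arg, want in cases.items():
            got = fn(arg)
            if got != want:
                return False, f"slugify({arg!r}) returned {got!r}, expected {want!r}"
        return True, "ok"
    except Exception as exc:
        return False, f"raised {exc!r}"

def build(prompt: str, max_attempts: int = 3) -> str:
    """The developer designs this loop; the model writes the function."""
    feedback = ""
    for _ in range(max_attempts):
        source = draft_function(prompt + feedback)
        ok, detail = validate(source)
        if ok:
            return source
        # On failure, the error message becomes part of the next prompt.
        feedback = f"\nPrevious attempt failed: {detail}. Fix and retry."
    raise RuntimeError("no candidate passed validation")
```

The developer's value lives in `validate` and the loop itself, not in the function body the model produces.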
We've been climbing this abstraction ladder for decades. Assembly to C. C to C++ and Java. Scripting languages that let you stop caring about memory management. Frameworks that handled the plumbing so you could focus on the logic. Manual deployments to CI/CD. Each step was the same instinct at work: a developer somewhere refusing to keep doing something by hand.
AI follows the same pattern, compressed into a much shorter timeframe. The jump from "I write code that runs on a computer" to "I design systems that generate code that runs on a computer" isn't a break from that history. It's the next chapter.
This isn't hypothetical.
A developer on your team used to spend half a day writing boilerplate for a new microservice: the scaffolding, the tests, the CI config. Now they've built a system where they describe the service's intent and constraints, and an AI pipeline generates, validates, and tests the scaffolding for them. They didn't get faster at writing boilerplate. They made boilerplate something they never have to write again.
That's the laziness principle, operating at a higher altitude.
And this isn't limited to developers writing backend application code. A QA engineer who builds a system that generates, executes, and evaluates test suites from a spec is doing the same thing. So is the frontend developer who designs a component architecture that an AI pipeline can assemble from design tokens. Whatever your role in the engineering org, the question is the same: what do I do repeatedly that a system I design could do instead?
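The QA version has the same skeleton. In this sketch the spec is a plain table of input/expected pairs; a real system might have a model expand a prose spec into such a table first. The case names and the function under test are invented for illustration.

```python
def run_suite(fn, spec: dict) -> dict[str, bool]:
    """Execute the generated cases and report pass/fail per case."""
    results = {}
    for name, (args, expected) in spec.items():
        try:
            results[name] = fn(*args) == expected
        except Exception:
            # A crash counts as a failure, not a suite error.
            results[name] = False
    return results

# The spec is the artifact the engineer maintains; the cases run themselves.
spec = {
    "adds_small": ((2, 3), 5),
    "adds_negative": ((-1, 1), 0),
}
results = run_suite(lambda a, b: a + b, spec)
```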
The developers who thrive will be the ones whose instinct kicks in fastest, the ones who look at their current workflow and think: Why am I still doing this part manually?
The natural question is whether this keeps going. Systems that build systems that build systems. Abstraction on top of abstraction. Does the developer eventually disappear into a recursive loop of meta-engineering?
Maybe. But each new layer introduces its own failure modes, its own debugging challenges, its own need for someone who actually understands what's happening underneath. The developer who can diagnose why a three-layer-deep generated system is producing garbage is arguably more valuable than one who just writes the bottom layer by hand.
The abstraction goes up. The need for understanding doesn't go away. It just gets more interesting.
You've been automating away tedious work your entire career. That instinct is the whole game now. The only thing that's changed is where you point it.