The Lesson That Used to Be Unavoidable

Every senior engineer I respect has a similar war story.

They wrote a Python script that was too slow, and someone told them to learn what a list comprehension actually does under the hood. They built a React app that re-rendered itself into a coma, and they had to crawl back into the DOM to figure out why. They shipped a service that fell over the first time real traffic hit it, and they spent a weekend learning what a connection pool is.

Nobody taught them on day one. The system taught them. The abstraction they were using broke in a way they couldn’t ignore, and the only way out was to drop one layer down and understand what was actually happening.

That is how senior engineers were made. Not in courses or in books but in the moments when the layer they trusted stopped being trustworthy, and they had to learn the layer beneath it.

That feedback loop is now broken.

Every Previous Abstraction Was a Leaky Teacher

In a previous post, we walked through the story of rising abstraction layers. Assembly to C, C to Java, Java to Python, and so on. Each layer compressed execution and pushed judgment upward.

But there was something else those layers had in common. They were all leaky.

A Python developer could drop into C when performance demanded it. A React developer could inspect the DOM when the framework lied to them. Those leaky abstractions were accidental teachers. They hid the substrate most of the time, but bugs dragged you down through the layers when it mattered. You learned memory management because your program kept crashing. You learned how the event loop works because your callback fired out of order. You learned what a stack trace really meant because you read enough of them.

That involuntary descent through the layers is where engineering judgment came from. Not from understanding what works. From understanding what breaks, why, and what it costs when it does.

Natural Language Is an Opaque Teacher

Natural language abstraction is different; it is not a leaky teacher but an opaque one.

A junior who prompts their way to a working feature may never encounter the failure mode that would have taught them what the feature is actually doing. The code gets written, the AI creates and runs the tests, and it fixes any issues it finds until the tests pass. When something does break, the natural move is to prompt again, describe the symptom, and ask for a fix. The model is happy to oblige, fixes the bug, and the junior is one bug further from the moment when they would have had to actually understand what was happening.

This is not hypothetical; I have seen it and done it. Anyone who has built anything with AI in the last two years has caught themselves doing it. The cost of asking the model is so much lower than the cost of understanding the answer that the second one quietly stops happening.

It is also no longer just an anecdote. A randomized controlled trial published by Anthropic earlier this year put fifty-two mostly junior engineers in front of an unfamiliar Python library and asked half of them to learn it with AI assistance and half without. The AI-assisted group scored seventeen points lower on comprehension afterward, fifty percent versus sixty-seven percent, a statistically significant gap. Within that group, the split was even sharper: developers who used AI to ask conceptual questions scored above sixty-five percent, while developers who delegated code generation scored below forty percent. Productivity gains, meanwhile, were not statistically significant. Same code on the screen, very different understanding underneath.

The abstraction layer is no longer a leaky teacher pushing you down to the substrate when you need to learn. It is a frictionless surface that keeps you on top of it, indefinitely, even when staying on top is the worst thing for you.

What Actually Got Severed

Here is the part worth saying clearly, because it is the heart of the argument.

The abstraction layer did not just hide the substrate. It severed the natural learning path that once led developers to the substrate when their work demanded it.

All the technical layers are still there. The substrate has not gone anywhere, the CPU still has registers, the runtime still has a heap, and the network still has packets. None of that changed. What changed is that the path leading developers down to those layers, the path that used to be paved by their own broken code, is now bypassed by a faster route that never goes near them.

You can still learn those layers; you just have to force yourself to do it, and self-discipline is a much weaker teacher than necessity ever was.

Why This Matters Beyond the Junior

It would be tempting to read this as a complaint about juniors, but it is not. The juniors are doing exactly what the system rewards them for doing.

The real problem is what the industry will look like in five years if the natural learning path stays severed.

The seniors we have today were forged by leaky abstractions. They understand what an LLM is doing well enough to know when it is wrong. They can read generated code and feel that something is off before they can name what. They can predict the second-order failure mode of a clever fix because they have been burned by similar fixes before.

That intuition was earned, not taught. It was built, involuntarily, over years of broken builds, debugged production incidents, and three-in-the-morning rollbacks.

If you think this is true only for software development, you are wrong. A 2025 study from Microsoft Research and Carnegie Mellon (published at CHI 2025), surveying three hundred and nineteen knowledge workers across roles, named the mechanism directly: when routine cognitive work gets delegated to AI, the user is deprived of the daily reps that build judgment, and is left, in the researchers’ own words, “atrophied and unprepared when the exceptions do arise.” That is the engineering pipeline problem stated in clinical language, and generalized far beyond engineering. It is what happens to any profession where the substrate stops making itself felt.

The labor market is already showing the leading edge of this. A November 2025 study from Stanford’s Digital Economy Lab found that employment for software developers aged twenty-two to twenty-five has fallen nearly twenty percent from its late-2022 peak, while employment for developers over twenty-six held steady. Companies are not just leaning on AI to replace junior work. They are quietly closing the door on the cohort that would have become the next decade’s seniors.

If the next generation of engineers never has those formative experiences because the AI smooths over the surface before any of them happen, the industry does not just get juniors who lack judgment. It gets a future cohort of seniors who have never developed it. Same job titles. Same years of experience. A fundamentally different capability underneath.

That is the trajectory the severed learning path puts us on. Not a junior problem. A pipeline problem.

The Questions JDD Has to Answer

Judgment no longer grows naturally as a side effect of doing the work.

That single shift breaks the assumption underneath how teams currently learn, onboard, and mentor. If judgment used to come for free with experience, and no longer does, then it has to be built deliberately, or it does not get built at all. That is the move JDD has to make next.

Three questions follow.

What substrate experiences must juniors face that AI would otherwise hide from them? Not as nostalgia, and not as punishment, but as the specific moments that build the mental models seniors actually rely on.

What does mentorship look like when the AI has already done the visible work, and the senior has to manually expose the layers underneath that the work no longer reveals on its own?

How do you measure the development of judgment when the surface metrics (code shipped, features delivered, tickets closed) look identical whether judgment is growing or atrophying underneath?

The next post will start to answer them. The point of this one is to make the diagnosis sharp enough that the answers cannot be cosmetic.

Every previous abstraction layer was a leaky teacher. The one we work under now is not. The skill did not disappear. The teacher did. Whoever deliberately rebuilds that teacher, inside teams and training, gets to decide what engineering looks like in ten years. Whoever does not is going to wake up one day to find themselves surrounded by a generation of engineers who can ship anything but understand nothing.